The Exigent Duality
Brain Frame Rate Cap - 10:28 CST, 9/27/20 (Sniper)
I've finally figured out why I can't tell the difference between thirty and sixty frames per second in pretty much any modern game.

Some years ago, I wrote a blog post explaining how my brain has a difficult time processing all of the visual information being thrown at it in contemporary titles. I can't find the post now, so let me just re-illustrate the principle.

Take a look at this gameplay footage of the upcoming "Ratchet and Clank" game. It has the following attributes:

  • Absurd texture resolution
  • Crazy rich lighting and shading
  • Shadows and ambient occlusion everywhere
  • Particle effects for practically everything
  • High density of plants and other static elements
  • Non-playable characters running all over the place
  • Ray-traced reflections on the floors

The game looks incredible, but to be honest I couldn't even tell what was going on half of the time-- especially when an enemy was running in the midst of several NPCs-- much less identify the framerate. Are you kidding?

The game could have been running at twenty-four frames per second, like a television show, or at thirty, sixty, or one hundred and twenty, and I was so overwhelmed by it that I wouldn't have had any clue-- even with videos at different framerates playing side-by-side, short of slowing them down.
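For context on those framerates, it's worth noting how small the differences actually are in absolute terms. A quick sketch of the per-frame time budget (just 1000 milliseconds divided by the framerate) shows that each doubling buys less and less:

```python
# Frame time in milliseconds for each framerate mentioned above.
# Going from 30 to 60 fps saves ~16.7 ms per frame, but going from
# 60 to 120 fps saves only ~8.3 ms-- the steps shrink as fps rises.
framerates = [24, 30, 60, 120]

for fps in framerates:
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:.1f} ms per frame")

# Output:
#  24 fps -> 41.7 ms per frame
#  30 fps -> 33.3 ms per frame
#  60 fps -> 16.7 ms per frame
# 120 fps -> 8.3 ms per frame
```

So even under ideal conditions, the 60-versus-120 gap is half the size of the 30-versus-60 gap, which only compounds the difficulty of spotting it in a busy scene.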

This also explains why, in older games, I can sometimes tell the difference. For example, I played the re-released "Dragon's Dogma" for Windows some years ago on my then-GTX 1070-- and with such blurry textures and simple geometry, sixty frames per second was observably smoother to my eyes. In that case, my brain wasn't overmatched by processing all of the other aspects of the imagery.

To put all of this another way, my mind is the bottleneck: it can only do so many tera-synapses per second, and with modern graphics, it's pretty much a hard thirty frames per second cap!