As I wrote about here, between the limitations of human vision and the quality of modern-day anti-aliasing and reconstruction techniques, increasing resolution further is pointless. "But how will electronics makers continue to sell new products?", I asked myself.
Perhaps ever-higher refresh rates will wind up being the answer. Personally, in 90% of applications I can't tell the difference between 30 and 60 frames per second unless it's pointed out to me, let alone at outrageous numbers like 120.
While playing "Control", I fired up a framerate counter for the first time, and the game was moving along at about 47 frames per second. But I had vsync enabled, and it looked butter smooth to me: I'm not at all picky about these things. I grew up playing games which ran in the low twenties or even the teens, so 30 frames per second is plenty good.
Lots of people's eyes are more sensitive to refresh rate than mine, however, so perhaps higher rates make sense for those individuals. Of course, this also assumes that game studios are willing to radically slash visual quality to hit these obscene framerates, which I very much doubt.