I'd take this one step further: I haven't run the formula exactly, but by the standard visual-acuity approximation you'd need to be sitting practically inches from a sixty-inch display to see the difference between "4K" and "8K", especially once you factor in the quality of modern anti-aliasing and reconstruction techniques. Heck, native "4K" is already arguably pointless with the advent of DLSS; who needs "8K"?
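To put rough numbers on it: the usual approximation says a 20/20 eye resolves about one arcminute, so a pixel stops being distinguishable once it subtends less than that. Here's a quick sketch of that calculation (the function name and the 1-arcminute acuity figure are my own assumptions, not anything official):

```python
import math

def max_useful_distance_in(diagonal_in: float, horiz_px: int,
                           aspect: float = 16 / 9,
                           acuity_arcmin: float = 1.0) -> float:
    """Farthest viewing distance (inches) at which an eye resolving
    `acuity_arcmin` arcminutes can still make out individual pixels.
    Beyond this distance, extra resolution is invisible."""
    # Screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch_in = width_in / horiz_px
    # Distance at which one pixel subtends exactly acuity_arcmin.
    return pixel_pitch_in / math.tan(math.radians(acuity_arcmin / 60))

# 60-inch 16:9 display: 4K is 3840 px wide, 8K is 7680 px wide.
print(round(max_useful_distance_in(60, 3840), 1))  # ~46.8 in
print(round(max_useful_distance_in(60, 7680), 1))  # ~23.4 in
```

By this estimate, 4K already out-resolves the eye past about four feet on a 60-inch set, and you'd have to sit within roughly two feet for 8K to add anything at all.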
In other words, not only will "mid-gen" refreshes be unnecessary, but in terms of raw processing power we'll very soon hit the "GPU of forever" point, much like the 2D "video card of forever" mark we hit in the '90s, when RAMDAC speed became irrelevant as a concept almost overnight. The cool part is that we're not far from this "GPU of forever" being able to fit in a handheld: the Nintendo Switch concept arrived five or ten years too early.
The only way companies like Nvidia, or television makers for that matter, will survive is by diversifying into new ventures. Eventually, I'm sure, the world will shift away from texture-mapped polygons altogether and replace them with augmented-reality point clouds or something, at which point the innovation race will be on again, with or without the current industry players, depending on how well those companies play their cards. Will Nvidia be Nvidia again, or will it be 3dfx or Matrox?