Nvidia's "Ampere" blog post talks about peak 32-bit floating point performance at the 19.5 teraflop level, so I'm assuming the top end of their upcoming "3000 line" of chipsets will land somewhere around that performance level. It could even persuade me to avoid the new consoles: the nice part about a video card upgrade is that I could hand my RTX 2080 down to my wife, and her GTX 1080 down to my daughter.
Unless there is a major paradigm shift, like a move to holograms or something, the world is rapidly approaching "video card of forever" territory: once games can reliably hit 60 frames per second at 2160p, it's not apparent what the purpose of more power is.
I haven't done the math, but a human probably has to sit uncomfortably close to a 50" television to see the difference between 4K and 8K-- so more pixels are pointless, unless you're using a 150" projector or something along those lines. Heck, with how good reconstructive and anti-aliasing techniques are these days, in the most extreme case one can get by with something like DLSS upscaling from an internal rendering resolution as low as 720p.
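Doing the math is actually straightforward, under the common (if somewhat contested) assumption that 20/20 vision resolves about one arcminute per pixel. Here's a quick sketch-- the function and its names are mine, and the acuity figure is an approximation, not a clinical constant:

```python
import math

def max_useful_viewing_distance(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance (in inches) at which one pixel subtends one arcminute,
    roughly the limit of 20/20 visual acuity. Beyond this distance,
    extra pixels are invisible (assuming the 1-arcminute rule of thumb)."""
    w, h = aspect
    # Screen width from the diagonal, e.g. a 50" 16:9 panel is ~43.6" wide.
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch = width_in / horizontal_px
    # Small-angle geometry: distance where the pitch spans one arcminute.
    return pixel_pitch / math.tan(math.radians(1 / 60))

d4k = max_useful_viewing_distance(50, 3840)  # ~39 inches (~3.3 feet)
d8k = max_useful_viewing_distance(50, 7680)  # ~19.5 inches (~1.6 feet)
print(f'4K stops mattering past ~{d4k / 12:.1f} ft; 8K past ~{d8k / 12:.1f} ft')
```

In other words, on a 50" set you'd have to sit within about three feet to out-resolve 4K at all, and within about a foot and a half before 8K could possibly help-- which backs up the "uncomfortably close" intuition.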
More power will still be desirable for virtual reality headsets, where dual-4K images at very high framerates would be ideal. But outside of a niche market, VR headsets are a dead end: they've been around since the 1980s and have never caught on, for a variety of reasons-- a large one being that they induce extreme motion sickness in many people.
But back to the main point-- conceptually, something like the Nintendo Switch came just a little too early: at some juncture, even a cheap tablet chipset will be able to hit the minimum bar where reconstructive techniques let a Switch-like device produce visuals that look as good as native 4K. Even as early as 2021, I'm curious to see whether Nintendo's "Switch Pro" becomes reality, and what its output will look like.