The Exigent Duality
Follow the Evidence - 15:22 CST, 1/13/25 (Sniper)
There is a very real phenomenon of media figures getting too close to the subjects they are supposed to be covering critically, to the point where their objectivity is corrupted by the special dispensations and privileges of being connected.

Take the first half hour or so of this Digital Foundry CES coverage as a case in point: Alex Battaglia brags about his personal connections to all of these Nvidia figures, then-- by his own admission, unashamedly-- walks up, salivating, tongue wagging out of his mouth, hat in hand, to Nvidia's tech demos. An actual journalist would keep a neutral expression, cast a healthily suspicious gaze on the proceedings, and ask the tough questions: but not Digital Foundry! Totally unadulterated fanboys and industry insiders.

Back in 2018, I too thought Nvidia's RTX technologies were cool. But I've watched over the subsequent six years... where has this technology gotten us, exactly? Same thing with Unreal Engine 5 since its release. Games at large have horrible performance-- witness "Wukong" at 29 fps on the $2000 RTX 5090-- are blurry and full of ghosting, and plainly and simply lean on features such as DLSS as a crutch just to make their titles playable. Meanwhile, developers flip on Unreal Engine 5's slow, half-baked features such as "Lumen" and "Nanite" without the foggiest clue how to optimize their actual game.

You can even see Digital Foundry's coverage accidentally proving my point: the game they covered as the most high-tech-- "it runs at a gazillion fps!"-- doesn't even use UE5's features; the developers rolled their own tech! Also, the Nvidia tech demo Alex goes on and on about-- a tech demo-- apparently uses 80 gigabytes of disk space. And don't even get me started on what's happened to game budgets during this Nvidia-Unreal Engine period.

The philosophy is to make the games skinny and the GPU-plus-engine fat: the game developers just tick a few boxes and outsource the expertise to Nvidia and Epic. I think this "skinny game" approach has demonstrably turned out to be the wrong path. The expertise should live with the people writing the game software, with the GPUs and-- if they are used at all-- off-the-shelf engines serving simply as basic building blocks with which the game programmers can go nuts. It should be "fat game, skinny GPU", or something along those lines, versus the other way around.