Starting with the shift from the GTX 1000 series to the RTX 2000 series of video cards, some asshole at Nvidia decided the next big thing was trying to do ray tracing in real time, like the big boy render farms at Pixar. Reflections! Better shadows, or something! There were two problems with this, neither of which has ever been solved.
The first is that ray tracing is significantly more work for the video card, and tanks game performance. The second is that ray-traced reflections are barely noticeable in how they differ from existing reflections, and ray-traced lighting and shadows simply never look much better at all. So ray tracing at its best makes games look ever so slightly better, in a way that's not noticeable if you are actually playing the game, and the price you pay for that is taking the game from "smooth and responsive" to "waist deep in a revolutionary new mixture of molasses and quicksand."
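To give a rough sense of why "significantly more work" is an understatement, here's some napkin math. All the numbers below are illustrative assumptions on my part, not anything Nvidia publishes: even a bare-minimum one-ray-per-pixel setup with a couple of bounces means billions of ray calculations every second.

```python
# Napkin math: the per-frame workload of even minimal real-time ray tracing.
# Every figure here is an illustrative assumption, not a vendor spec.
width, height = 3840, 2160          # 4K output resolution
pixels = width * height             # ~8.3 million pixels

rays_per_pixel = 1                  # assume one primary ray per pixel
bounces = 2                         # assume two secondary bounces per ray
fps = 60                            # target frame rate

rays_per_frame = pixels * rays_per_pixel * (1 + bounces)
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame:,} rays per frame")    # 24,883,200 rays per frame
print(f"{rays_per_second:,} rays per second")  # 1,492,992,000 rays per second
```

And that's before you add more rays per pixel to cut down on noise, which real path tracing needs. Rasterization does nothing remotely like this per pixel, which is why the performance gap is a chasm and not a dip.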
Last night a friend linked me this execrable advertisement wearing the flayed skin of journalism on PC Gamer, where a soulless marketing drone from Nvidia goes on about how rendering at full resolution is a lie, and the only truth is rendering small and using a fancy filter to make it bigger. PC Gamer's "hardware writer" uncritically parrots this and periodically editorializes about how correct and good Nvidia is, and you should probably go buy an RTX 4090 right now. Can you afford not to?
Nvidia bet big on ray tracing because it was a new feature their older cards didn't have. They needed to do something because there is basically no reason even a money-is-no-object turbogamer needs to upgrade their video card more than once every 3 years at the earliest now.
The jump from the PS2/Xbox generation to the PS3/360 generation was the last big noticeable eye candy thing, and ever since then games look about the same but gradually more so. PC gaming and console gaming also slowly merged into the same industry, which means all the big graphically intensive releases aren't doing more than the current consoles are capable of. This means there's not much reason to upgrade a gaming PC more often than a new console model comes out, because nothing's gonna ask more of your PC than a console equivalent.
How does Nvidia survive in a world where there's no good reason to buy new video cards every year? At this point, mostly lies. There's no demand for medicine so it's snake oil time, baby!
When Nvidia bet big on the ray tracing gimmick, they quickly ran into the wall that even their $1,000-plus top-end cards can't get playable performance with the gimmick enabled. Now they're trying to cover that up by going all-in on upscaling. Upscaling is rendering the image smaller, which takes less effort, and then using fancy algorithmic guesswork to make it look acceptable at full size. Their gimmick makes games run so badly they need to use smoke and mirrors to compensate.
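The core idea is simple enough to sketch in a few lines. This toy version uses nearest-neighbor enlargement, which is vastly dumber than what DLSS actually does (DLSS feeds motion vectors and previous frames into a trained model); it just shows the shape of the trick: pay for a quarter of the pixels, then inflate the result to full size.

```python
# Toy sketch of the upscaling idea: render at a lower resolution, then
# enlarge to the output size. The "renderer" and the nearest-neighbor
# enlargement are stand-ins, nothing like DLSS's actual algorithm.

def render_small(w, h):
    # stand-in renderer: a cheap procedural gradient image
    return [[(x + y) % 256 for x in range(w)] for y in range(h)]

def upscale_nearest(img, factor):
    # dumbest possible enlargement: repeat each pixel factor x factor times
    return [[px for px in row for _ in range(factor)]
            for row in img for _ in range(factor)]

small = render_small(960, 540)    # a quarter of 1080p's pixel count
big = upscale_nearest(small, 2)   # blown back up to 1920x1080

print(len(big), len(big[0]))      # 1080 1920
```

The renderer only ever touches 960x540 pixels, so it does roughly a quarter of the shading work, and the enlargement step guesses the rest. Nvidia's guesswork is much smarter than repeating pixels, but the bargain is the same: less real rendering, more inference about what the missing pixels probably look like.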
This asshole in the article is throwing a bunch of nonsense out to cover for this. He says that rendering at full resolution is "wasteful" and you have to be more "intelligent" than that. He also hammers really hard on a claim that ray tracing is so realistic that even if you're rendering small and faking it bigger, it's still more real, somehow.
"Raster is a bag of fakeness. We get to throw that out and start doing path tracing and actually get real shadows and real reflections. And the only way we do that is by synthesising a lot of pixels with AI."
This idiot conman uses "raster" to refer to full-size rendering, which isn't what that means at all, but that's a very "it's a magazine not a clip" nitpick, so I'll set it aside. He goes on to declare that ray traced shadows and reflections are real, so real that guessing at most of the image still ends up with a realer image.
(I will also briefly point out that the use of the term "AI" in tech right now is a universal red flag, because it refers to a bunch of different ways of using statistical probability models to try and string random data together so it deceives humans acceptably. Techbros use the term like it's magic and/or a sci-fi robot, and it's simply not.)
The article then expresses concern over how Nvidia's rivals in the GPU space don't have equivalents to ray tracing or the guesswork coverup tech. It's framed as a failing on their part, because the writer is a stooge, but that's a key tactic for Nvidia. They can't make the profits they want off of just making video cards that do video card things. If they want investor-pleasing infinite growth, they need to make up a problem and present their magical solution that nobody else can give you.
Existing lighting and reflection solutions are actually fake, and wrong, and only Nvidia's new gimmick is real. It runs real bad, but it's so real, so use our other magic solution to make it run okay, and buy a new card every year or so because it keeps improving! That's just the cost of realism.
You wouldn't want your videogames about shooting lasers at wizards in space to be fake, would you?