this post was submitted on 25 Jan 2025
PC Gaming
So, a couple of things:
The first RTX cards were the 20 series, which came out in 2018
There was a time when volumetric lighting was also optional
There was a time when GRAPHICS CARDS were optional.
The Indiana Jones game requires RT-capable hardware, and so did the Avatar game before it.
Shit moves on. Did you expect your 1060 card from 2016 to last indefinitely? How long did you expect developers to support 2 different lighting systems?
There is so much to be angry about these days, but not this. This was inevitable. If you MUST be angry about it, at least be angry at the right devs
I grew up gaming on a 386, when the great divide was CGA, EGA, VGA, and SVGA. That was the first big graphics card war for me, and the beginning of the whole "Oceania had always been at war with Eastasia" moment. Then came the gulf between floppy disks and hard drives, then CD-ROMs, and then the 3D accelerator games. PC gaming has always had points of contention between the whales and the have-nots, but gaming studios had a good track record of making sure to provide plenty of support to the gamers who weren't whales.
If you need a $1000+ video card to run ray tracing without it tanking your frame rate into completely unplayable territory, then ray tracing is not ready for prime time yet, and it's a complete shit move on the developers' part to mandate it while it's still untenable for most gamers.
Nope, this is just complete bullshit to me: it forces gamers into a subpar experience and into buying more expensive hardware for no other reason than to satisfy a publisher's ego so they can market their game as "next gen".
Edit: How sad is it that ray tracing first launched in 2018 and a rig with moderate hardware still can't run it well.
Ray tracing can be cheaper than rasterization for equivalent lighting quality. Depending on how they use it, going ray-tracing-only could be great, much like how megatextures worked. People got upset about the GPU memory requirements for megatextures, but it was a huge gain in performance if you met the minimum. Ray tracing as an effect layered on top of rasterized lighting is a big hit to performance, and that's the only thing we have now.
Native real-time ray tracing showed up something like 20 years ago with the ATI X1000 series. No one wanted to risk making a ray-tracing-only game, so it never took off. RTX is based on using ray tracing as an effect on top of rasterized graphics.
No current games use ray tracing only; Indiana Jones uses it for mandatory effects, not as the primary render method.