this post was submitted on 23 Jun 2025
Greentext
This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.
Be warned:
- Anon is often crazy.
- Anon is often depressed.
- Anon frequently shares thoughts that are immature, offensive, or incomprehensible.
If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
you are viewing a single comment's thread
Yes, I'm sure every player spends the majority of their game time admiring the realistic material properties of Spider-Man's suit. So far I've never seen a game that was made better by forcing RT into it. A little prettier if you really focus on the details where it works, but overall it's a costly (in terms of power, computation, and price) gimmick.
The one benefit I see is that it simplifies lighting for the developer by a whole lot.
Which isn't a benefit at all, because as of now they basically have to ship a non-raytraced version anyway so 90% of players can still play the game.
But in a decade, maybe, ray tracing will make sense as the default.
I've always said that. The baseline GPUs are the RTX 3060 and the RX 6700 (the console equivalents), and those GPUs aren't doing amazing RT, so what's the point in pushing it so hard NOW for the 1% of users with a 4090 or whatever?
Hell, I've got an RX 7900 XTX and in some games my framerate still isn't a consistent 60 fps.
RT also makes level design simpler for the development team, since they can design levels in a what-you-see-is-what-you-get way rather than having to bake the lighting.
Development and design can use RT all day long; that's not the issue. They have the benefit of not having to run ray tracing in real time on consumer hardware. At the end of the day, unless they want to offload all of that computational load onto the customer forever (and I really mean all RT, all the time), they'll eventually have to bake most or all of that information into a format a rasterizer can use.
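To make that split concrete, here's a minimal, self-contained sketch in plain C++ (not any engine's actual API; the scene, names, and numbers are all made up for illustration): the same shadow-ray test can either run every frame ("real-time RT") or run once offline to fill a lookup table that a rasterizer just samples at runtime ("baked").

```cpp
// Hypothetical toy scene: one point light, one spherical occluder, and a strip
// of floor points to light. Contrasts per-frame ray tracing with offline baking.
#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float len(Vec3 a)         { return std::sqrt(dot(a, a)); }

constexpr Vec3  kLight          {0.0f, 5.0f, 0.0f};
constexpr Vec3  kOccluder       {0.0f, 2.5f, 0.0f};  // sphere centre
constexpr float kOccluderRadius = 1.0f;

// "Real-time RT" path: cast a shadow ray from the surface point toward the
// light and check whether the occluder sphere blocks it.
float TraceDirectLight(Vec3 p) {
    Vec3  toLight = sub(kLight, p);
    float dist    = len(toLight);
    Vec3  dir{toLight.x / dist, toLight.y / dist, toLight.z / dist};

    // Occlusion test: is the ray's closest approach to the sphere centre
    // inside the sphere, somewhere between the surface point and the light?
    Vec3  toCentre = sub(kOccluder, p);
    float t        = dot(toCentre, dir);
    bool  blocked  = false;
    if (t > 0.0f && t < dist) {
        Vec3 closest{p.x + dir.x * t - kOccluder.x,
                     p.y + dir.y * t - kOccluder.y,
                     p.z + dir.z * t - kOccluder.z};
        blocked = len(closest) < kOccluderRadius;
    }
    // Simple inverse-square falloff, zero if shadowed.
    return blocked ? 0.0f : 1.0f / (dist * dist);
}

// "Baked" path: run the same trace once, offline, over a coarse grid of surface
// points and store the results. At runtime a rasterizer only does the lookup.
constexpr int kSamples = 8;
std::array<float, kSamples> BakeLightmap() {
    std::array<float, kSamples> lightmap{};
    for (int i = 0; i < kSamples; ++i) {
        Vec3 p{-3.5f + i * 1.0f, 0.0f, 0.0f};  // points along a floor strip
        lightmap[i] = TraceDirectLight(p);     // expensive work happens here, once
    }
    return lightmap;
}

int main() {
    auto baked = BakeLightmap();  // what a build step would produce
    for (int i = 0; i < kSamples; ++i) {
        Vec3 p{-3.5f + i * 1.0f, 0.0f, 0.0f};
        std::printf("x=%+.1f  realtime=%.4f  baked=%.4f\n",
                    p.x, TraceDirectLight(p), baked[i]);
    }
    return 0;
}
```

The only point of the sketch is where the expensive call sits: in the real-time path TraceDirectLight() runs per frame on the player's GPU, while in the baked path it runs once in a build step and ships as data a rasterizer can sample cheaply.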
Where is RTX being forced in? I haven't seen a game where it's not an option you have to toggle on first, and it's not like RTX is a lot of additional work for the developer, seeing how it in fact reduces the work necessary to make a scene look the way it should.
Yes, it's stupidly expensive and not every game manages to benefit massively from it, but it can lead to some very pretty environments in games and it seems perfectly valid in those cases.
Also, some people do quite enjoy admiring the way the materials of various things end up looking. Maybe it's not the majority of players, but some people quite like looking at details in the games they play.
There aren't many, but the new Indiana Jones and Doom games require ray tracing.
To be fair... At least those 2 actually perform well.
Indiana Jones can run at high settings, 1080p NATIVE, at like 80 fps on a 3060, and Doom ran at like 80 fps at medium settings with quality upscaling to 1440p on my RX 6800 XT, which is, like, bad for RT lol