We’ve gotten so good at faking most lighting effects that honestly RTX isn’t a huge win except in certain types of scenes.
The difference is pretty big when there are lots of reflective surfaces, and especially when light sources move (prebaked shadows rarely move, and even when they do, it hardly looks realistic).
A big thing is that developers spend less effort and the end result looks better. That's progress. You could argue it's kind of like when web developers were finally able to stop supporting IE9 - it wasn't a big deal for end users, but holy hell did the job get more enjoyable, faster, and cheaper.
Cyberpunk and Control are both great examples - both games are full of reflective surfaces and it shows. Getting a glimpse of my own reflection in a dark office is awesome, as is tracking enemy positions from cover using such reflections.
But it takes a lot of work by designers to get the fake lighting to look natural. Raytracing would help avoid that toil if the game makes RT mandatory.
Gamers needs expensive hardware so designer has less work. Game still not cheaper.
I took pickles and tomatoes off my burger, where's my $0.23 discount damn it?!
Let's assume cutting out tomatoes and pickles saved $0.23 per hamburger.
McDonald's serves 6.5 million hamburgers a day.
That's about $1.5 million a day, or over $500 million in extra yearly profit for their shareholders.
Maximise your RTX performance with this one crazy hack!
Ray traced reflections: on
Ray traced everything else: off
I'd argue reflections are nowhere near as nice looking as RTGI. If anything, switch reflections off.
But muh puddles! Night City is nothing without those gorgeous, mirror-like puddles.
Baked lighting looks almost as good as ray tracing because, for games that use baked lighting, devs intentionally avoid scenes where it would look bad.
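For anyone who hasn't touched a renderer, "baked" literally means the lighting was computed offline and stored, which is exactly why those scenes have to be chosen carefully. A toy, runnable sketch of the difference (made-up numbers and names, not real engine code):

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))

# "Baked": a brightness value computed offline for the light setup the level
# shipped with. It lives in a lightmap texture and never changes at runtime.
BAKED_BRIGHTNESS = 0.8

def traced_brightness(point, normal, light_pos, blocked):
    # "Traced": shoot a shadow ray toward the light's *current* position every
    # frame, so moving lights and moving geometry just work.
    to_light = sub(light_pos, point)
    dist = math.sqrt(dot(to_light, to_light))
    direction = tuple(c / dist for c in to_light)
    if blocked(point, direction, dist):          # anything in the way => shadow
        return 0.0
    return max(0.0, dot(normal, direction)) / dist ** 2

nothing_in_the_way = lambda p, d, t: False
# Same surface point, two different light positions at runtime:
print(BAKED_BRIGHTNESS, traced_brightness((0, 0, 0), (0, 1, 0), (0, 3, 0), nothing_in_the_way))
print(BAKED_BRIGHTNESS, traced_brightness((0, 0, 0), (0, 1, 0), (6, 3, 0), nothing_in_the_way))
```

Move the light at runtime and only the traced value changes; the baked one is frozen, so level design has to steer around situations where that would be obvious.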
Half the stuff in this trailer (the dynamically lit animated hands, the beautiful lighting on the moving enemies) would be impossible without ray tracing. Or at the least it would look way way worse:
Raytracing is cool. Personally I feel like the state consumers first got it in was atrocious, but it is cool. What I worry about is the AI upscaling, fake-frame bullshit. It's cool that the technology exists - like, sweet, my GPU can render this game at a lower resolution, then upscale it back at a far better frame rate than without upscaling, ideally stretching out my GPU purchase. But I feel like games (in the AAA scene at least) are so unoptimized now that you NEED all of these upscaling and fake-frame tricks. I'm not a dev, I don't know shit about making games, just my 2 cents.
Raytracing will be cool if hardware can catch up to it. It's pretty pointless if you have to play upscaled just to turn the graphics up. And as you say, upscaling has its uses and is great tech, but when a game needs it just to not look like dogshit (looking at you, Stalker 2), it worries me a lot.
No, you've pretty much hit it on the head there. The higher-ups want it shipped yesterday; if you can ship it without fixing those performance issues, they're likely going to make you do that.
I think raytracing is fine for games that want a lot of realism. But I'm playing games with monsters and fantasy. My suspension of disbelief isn't going to break because reflections aren't quite right.
But I'm pretty much in the camp of, I want my games to look and feel like games. I like visual cues like highlighting items I can interact with or pick up. So lighting is always non-realistic.
It's not a trick, it's just lighting done the way it should be done, without all the tricks we need now like subsurface scattering or screen-space reflections.
The added benefit is that materials reflect light closer to their natural reflectance, making everything look more true to life.
Its main drawback is that it's GPU costly, but more and more AAA games are now moving toward RT as standard by being more clever about how they handle the calculations.
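For anyone wondering why screen-space reflections count as a "trick": they can only reuse what's already drawn on screen, whereas a reflection ray hits the actual scene. A toy, runnable sketch of that limitation (made-up buffers and names, nothing from a real engine):

```python
SCREEN_W, SCREEN_H = 8, 6
# Hypothetical framebuffer from this frame's normal render pass.
framebuffer = [[("wall", x, y) for x in range(SCREEN_W)] for y in range(SCREEN_H)]

def ssr_lookup(reflected_x, reflected_y):
    # SSR: reuse the already-rendered framebuffer. Off-screen => no data at all.
    if 0 <= reflected_x < SCREEN_W and 0 <= reflected_y < SCREEN_H:
        return framebuffer[reflected_y][reflected_x]
    return None  # the classic SSR artifact: reflection fades out or disappears

def rt_lookup(reflected_x, reflected_y, world):
    # RT: trace against the actual scene, so off-screen objects still reflect.
    return world.get((reflected_x, reflected_y), "sky")

world = {(x, y): ("wall", x, y) for x in range(-20, 20) for y in range(-20, 20)}

print(ssr_lookup(3, 2))         # on-screen: SSR works fine
print(ssr_lookup(-4, 2))        # behind the camera / off-screen: SSR has nothing
print(rt_lookup(-4, 2, world))  # a reflection ray still finds the geometry
```

That off-screen case is exactly why SSR reflections vanish at the edge of the screen or when something behind the camera should show up in a puddle.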
Yes, I'm sure every player spends the majority of their game time admiring the realistic material properties of Spider-Man's suit. So far I've never seen a game that was made better by forcing RT into it. A little prettier if you really focus on the details where it works, but overall it's a costly (in terms of power, computation, and price) gimmick.
The one benefit I see is that it simplifies lighting for the developer by a whole lot.
Which isn't a benefit at all, because as of now they basically have to have a non-raytraced version so 90% of players can play the game.
But in a decade, maybe, raytracing will make sense as the default
RT also makes level design simpler for the development team, as they can design levels with a what-you-see-is-what-you-get approach rather than having to bake the light sources.
Development and design can use RT all day long, that's not the issue. They have the benefit of not having to run ray tracing in real time on consumer hardware. At the end of the day, unless they want to offload all of that computation load onto the customer forever (and I really mean all RT all the time), they'll eventually have to bake most or all of that information into a format that a rasterizer can use.
Subsurface scattering is not one of the things you get automatically with ray tracing. If you just bounce the rays off objects, as would be the usual first step in implementing ray tracing, you don't get any light penetration into the object, so none of that depth.
Maybe you meant ambient occlusion?
Raytracing still needs to do subsurface scattering separately. It can actually do it for real, though. It also "wastes" a lot of bounces, so it's usually approximated anyway.
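Rough toy numbers of what that means in practice: a first-pass "bounce at the surface" tracer gives a flat zero for the light that should enter the material and glow back out, so renderers bolt an approximation on top. The exponential falloff below is just a made-up stand-in, not any particular engine's model:

```python
import math

def naive_bounce_sss_term():
    # A basic ray tracer reflects at the surface; no light enters the medium,
    # so the subsurface contribution is simply zero.
    return 0.0

def approximate_sss(depth_mm, scatter_length_mm=3.0):
    # Crude exponential-falloff stand-in for light scattering inside skin, wax,
    # or marble; real renderers use diffusion profiles or random walks instead.
    return math.exp(-depth_mm / scatter_length_mm)

for depth in (0.5, 2.0, 6.0):
    print(depth, naive_bounce_sss_term(), round(approximate_sss(depth), 3))
```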
Soooo, there's a missing part here. The point (and drive) behind raytracing isn't making games beautiful, it's making them cheaper and less man-hour intensive to make/maintain.
The engine guys spend man-years every year working on that non-raytraced engine so it can do 150. They've done every cheat, every side-step, and spent every minute possible making it look like they haven't done anything at all.
The idea is that they stop making/updating/supporting non-raytracing engines and let the GPUs pick up the slack, then use AI to artificially "upgrade" the frame rate with interpolation.
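To be clear about what "interpolation" means here: a synthetic frame is generated between two rendered ones. The naive blend below is only a toy illustration of the idea; real frame generation (DLSS and friends) uses motion vectors and a neural network, not a straight average:

```python
def blend_frames(frame_a, frame_b, t=0.5):
    # Naive linear blend of two "frames" (flat lists of pixel brightness values).
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

rendered_1 = [0.0, 0.2, 0.8, 1.0]
rendered_2 = [0.1, 0.4, 0.6, 1.0]

# The synthetic in-between frame shown to the player between the two real ones.
print(blend_frames(rendered_1, rendered_2))
```

Which is also where the complaints come from: anything the generated frame gets wrong shows up as ghosting between the real ones.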
Don't forget that temporal smear. I like to apply vaseline directly onto my monitor instead.
The first F.E.A.R. had excellent dynamic lighting, I'd argue it had the epitome of relevant dynamic lighting. It didn't need to set your GPU on fire for it, it didn't have to sacrifice two thirds of its framerate for it, it had it all figured out. It did need work on textures, but even those looked at least believable due to the lighting system. We really didn't need more than that.
RT is nothing but eye candy and a pointless resource hog meant to sell us GPUs with redundant compute capacities, which don't even guarantee that the game'll run any better! And it's not just RT, it's 4k textures, it's upscaling, it's Ambient Occlusion, all of these things hog resources without any major visual improvement.
Upgraded from a 3060 to a 4080 Super to play STALKER 2 at more than 25 frames per second. Got the GPU, same basic settings, increased the resolution a bit, +10 FPS... Totes worth the money...
Edit: not blaming GSC for it, they're just victims of the AAA disease.
Edit 2: to be clear, my CPU's an i7, so I doubt it had much to do with the STALKER bottleneck, considering it barely reached 60% usage, while my GPU was panting...
Edit 3: while re-reading this, it hit me that I sound like the Luddite Boss, so I need to clarify this for myself more than anyone else: I am not against technological advancement, I want tech in my eyeballs (literally), I am against "advancements" which exist solely as marketing accolades.
Early 3D graphics rendering was all ray tracing, but when video games started doing textured surfaces, developers quickly realised they could just fake it with alpha as long as the light sources were static.
I never turn it on; the visual difference is too unimportant to warrant such a huge cost in hardware resources (and temperature). It looks different if you have side-by-side screenshots, or if you turn it off and on in-game, but the difference is far too slight to be worth it. Higher frames per second matter more than realistically simulated light beams. You can't really have both in large AAA games.
I have seen FEW games that really benefit from RT. RT is a subtle effect because we've gotten pretty good at baking and faking how light should look.
But even if it's just a subtle effect, it adds so much. The feel of the lighting is (for me) better with RT: light properly propagates and bounces, and dynamic geometry is properly lit. It's the sum of all these tiny upgrades that, on the bigger scale, makes the lighting look a lot better.
It just sucks that the performance is utter shit right now. I hope that in a few years this will be optimised and we won't need to sacrifice half the framerate just to get lighting that feels right.
As someone who has worked in visual fx for 20 years now, including on over 15 films and 8 games: raytracing is most definitely not simply a marketing tool.
The best examples of raytracing are in applying it to old games, like Quake II or Portal or Minecraft.
Newer games were already hitting diminishing returns on photo realism. Adding ray tracing takes them from 95% photo realistic to 96%.
I disagree - adding RT to games that weren’t designed for it often (but not always) wrecks the original art direction.
Quake II is a great example; I think the raytraced version looks like absolute ass. Sure, it has fancy shadows and reflections, but all that does is highlight how old the assets are.
Portal with ray tracing is a really cool demo, and I've used it in the past to show off ray tracing. But man, it's just not as pretty as the old Portal because it lacks the charm; it's like those nature photos that are blown out with HDR.
Ray tracing is just a way for Nvidia to make a technology proprietary and then force the industry to use it, all to keep Jensen in leather jackets. Don't buy his cards; he has too many leather jackets!
But maybe finally games will get working mirrors again.
When I had a PS5 and Cyberpunk, I would sometimes switch ray tracing on and off to see if it made a huge difference. Well, the frame rate would be capped at 30 with it on...and I suppose if I stopped and looked around for a bit, it was noticeable, but honestly, I preferred the higher framerate. I've yet to see a game that really benefits from RT.
Skyrim has "ray traced" shadows in certain places and it works great. I was in a cave once, hiding behind a cliff. An enemy was wandering around the next room and I was able to use the shadow cast on him by a torch to track his movements without having his actual body in my field of view.
All this modern RT nonsense does is make things look slightly better than screen space reflections and tank performance.
That's actually one specific torch!
It is unknown why it has this function, or why Bethesda left it in
Just Bethesda things
I would expect that to be a normal rasterized shadow map unless you can find any sources explicitly saying otherwise. Even one ray per pixel against complex triangulated geometry wasn't really practical in real time until probably at least 2018.
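For anyone unsure what the distinction even is, here's a toy, runnable 1D sketch (made-up numbers, obviously not Skyrim's code): a shadow map is a depth buffer rendered from the light's point of view and the shadow test is just a depth comparison, while a "ray traced" shadow actually intersects a ray from the surface to the light against scene geometry.

```python
# Tiny 1D "shadow map": distance from the light to the nearest occluder per texel,
# produced by rasterizing the scene from the light's point of view each frame.
shadow_map = [2.0, 2.0, 5.0, 5.0]

def shadow_map_test(texel, point_depth_from_light, bias=0.05):
    # Classic shadow mapping: the point is lit only if nothing nearer to the
    # light was rasterized into that texel. No rays involved.
    return point_depth_from_light <= shadow_map[texel] + bias

def shadow_ray_test(point_x, light_x, occluders):
    # Ray-traced shadow: check the actual segment from the point to the light
    # against occluder intervals (a 1D stand-in for ray/triangle intersection).
    lo, hi = sorted((point_x, light_x))
    return not any(lo < a and b < hi for a, b in occluders)

print(shadow_map_test(texel=1, point_depth_from_light=3.0))   # False: in shadow
print(shadow_ray_test(0.0, 10.0, occluders=[(4.0, 6.0)]))     # False: in shadow
```

Both can produce a moving shadow from a moving torch; the cheap rasterized version has been doing that since long before RTX hardware existed.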
RT is a marketing trick; very few games are made in a way that makes it look better.
Raytracing is being pushed so hard by the industry because it makes things easier for devs as opposed to making the games look better for the customer.
There is absolutely nothing about raytracing which makes it "easier" for devs compared to a traditional render pipeline.
The extra performance requirements alone mean you're going to be doing more work elsewhere to make up for it, and that's ignoring the current bugs/quirks with RT in whatever engine you're using.