[–] latenightnoir@lemmy.blahaj.zone 26 points 1 week ago* (last edited 1 week ago) (23 children)

The first F.E.A.R. had excellent dynamic lighting; I'd argue it was the epitome of relevant dynamic lighting. It didn't need to set your GPU on fire for it, and it didn't have to sacrifice two thirds of its framerate for it: it had it all figured out. It did need work on textures, but even those looked at least believable thanks to the lighting system. We really didn't need more than that.

RT is nothing but eye candy and a pointless resource hog meant to sell us GPUs with redundant compute capacity, which doesn't even guarantee that the game will run any better! And it's not just RT: it's 4K textures, it's upscaling, it's ambient occlusion. All of these hog resources without any major visual improvement.

Upgraded from a 3060 to a 4080 Super to play STALKER 2 at more than 25 frames per second. Got the GPU, same basic settings, increased the resolution a bit, +10 FPS... Totes worth the money...

Edit: not blaming GSC for it, they're just victims of the AAA disease.

Edit 2: to be clear, my CPU's an i7, so I doubt it had much to do with the STALKER bottleneck, considering it barely reached 60% usage while my GPU was panting...

Edit 3: re-reading this, it hit me that I sound like the Luddite Boss, so I need to clarify this for myself more than for anyone else: I am not against technological advancement (I want tech in my eyeballs, literally); I am against "advancements" which exist solely as marketing accolades.

[–] AdrianTheFrog@lemmy.world 2 points 1 week ago (3 children)

Really? Ambient occlusion used to be the first thing I would turn on. Anyway, 4K textures barely add any cost on the GPU: they don't use any compute, just VRAM, and VRAM is very cheap ($3.36/GB for GDDR6). The only reason consumer cards are limited in VRAM is to keep them from being used for professional and AI applications. If they had a comparable ratio of VRAM to compute, they would be insanely better value than workstation cards, and manufacturers don't want to draw sales away from that very profitable market.
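
Quick back-of-envelope to put numbers on that (my assumptions, not anything from a spec sheet: BC7 block compression at 1 byte per texel, plus the $3.36/GB figure above):

```python
# Rough VRAM cost of a single 4K texture. Assumptions are mine: BC7 block
# compression (16 bytes per 4x4 block = 1 byte/texel) and $3.36/GB GDDR6.
width = height = 4096
bytes_per_texel = 1                        # BC7-compressed
base_mip = width * height * bytes_per_texel
with_mips = base_mip * 4 / 3               # full mip chain adds ~33%

price_per_gb = 3.36
cost = with_mips / 1000**3 * price_per_gb  # price is per decimal GB
print(f"{with_mips / 2**20:.1f} MiB -> ~${cost:.2f} of GDDR6")
# ~21.3 MiB and about 8 cents: the cost is resident memory, not compute
```

Even with a full mip chain, a 4K texture is pennies' worth of memory; the VRAM ceiling on consumer cards is segmentation, not cost.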

[–] latenightnoir@lemmy.blahaj.zone 0 points 23 hours ago (2 children)

Honestly, I find 4K in general to be entirely unnecessary. Anything above 1440p is redundant imho; diminishing returns kick in very fast as pixel density increases.

[–] AdrianTheFrog@lemmy.world 1 point 15 hours ago (1 child)

For a texture to need double the resolution along each axis, you'd either need a monitor with twice the resolution, or you'd have to be half the distance away. Most of the time, games let you get closer to objects than their texture resolution looks good for, so 4K textures still give an improvement even on a 1080p monitor.
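
Here's a toy sketch of that scaling (my simplified pinhole-camera model, not anything from an engine), showing that halving the distance and doubling the vertical resolution demand the same texture bump:

```python
import math

# Vertical pixels covered by an object scale linearly with screen resolution
# and inversely with distance; "1 texel per pixel" needs at least that many
# texels along the object's height.
def texels_needed(screen_h_px, fov_v_deg, object_size_m, distance_m):
    fov = math.radians(fov_v_deg)
    return screen_h_px * object_size_m / (2 * distance_m * math.tan(fov / 2))

# 1 m tall object, 60 degree vertical FOV:
for screen_h, dist in [(1080, 2.0), (1080, 1.0), (2160, 2.0)]:
    print(f"{screen_h}p at {dist} m: ~{texels_needed(screen_h, 60, 1.0, dist):.0f} texels")
# 1080p at 2.0 m: ~468 | 1080p at 1.0 m: ~935 | 2160p at 2.0 m: ~935
```

A 1080p screen an arm's length from a crate can already out-resolve a 1K texture, which is the point.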

Texture resolution is chosen on a case-by-case basis: objects the player will usually be very close to get higher-resolution textures, while ones that are impossible to approach can use lower resolutions.

The only costs of including higher-resolution textures are artist time and (usually) disk space. Artist time is outsourced to countries with cheap labor (the Philippines, Indonesia, etc.), and apparently no one cares about disk space.
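
For what it's worth, the disk cost adds up fast; rough math with made-up but plausible numbers (a 4K BC7 texture with mips is ~21 MiB, and the material counts are my guesses):

```python
# How "no one cares about disk space" becomes a 100+ GB install.
tex_mib = 4096 * 4096 / 2**20 * 4 / 3  # BC7 at 1 byte/texel, +33% for mips

materials = 2000          # unique materials: a guess for a big AAA game
maps_per_material = 3.5   # albedo, normal, roughness/metallic, etc.

total_gib = materials * maps_per_material * tex_mib / 1024
print(f"~{total_gib:.0f} GiB of textures alone")  # ~146 GiB
```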

[–] latenightnoir@lemmy.blahaj.zone 1 point 10 hours ago* (last edited 10 hours ago)

It is such a minor improvement, though, especially, as you've mentioned, on a 1080p screen. And, sorry to say, having to fill up 150+ GB of space for a 20-30 hour game is a complete waste with, as I've said several times, immensely diminished returns.

Take UE5, for instance. Texture streaming has become necessary because we can't afford to have a coherent texture map loaded all at once, precisely because those textures are stupidly large. This causes so many performance issues that I'd argue it's been a downgrade even compared to UE3.
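
For illustration, here's roughly the kind of decision a streamer makes (a toy model I'm making up, not UE5 code): each mip level dropped halves resolution and quarters memory, and the streamer requests the finest mip the viewing distance justifies.

```python
import math

def wanted_mip(distance_m, full_res_distance_m=1.0):
    # one extra mip level per doubling of distance past full-res range
    return max(0, int(math.log2(max(distance_m / full_res_distance_m, 1.0))))

def resident_bytes(base_bytes, mip):
    return base_bytes >> (2 * mip)  # each mip level is 1/4 the memory

# 64 MiB base = one uncompressed 4K RGBA texture
for name, base, dist in [("wall", 64 << 20, 1.5),
                         ("rock", 64 << 20, 12.0),
                         ("hill", 64 << 20, 80.0)]:
    mip = wanted_mip(dist)
    print(f"{name}: mip {mip}, {resident_bytes(base, mip) / 2**20:.2f} MiB resident")
```

A fast 180° turn invalidates all of those picks at once, and the stutter is the disk trying to deliver the finer mips on demand.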

I'd rather have a smooth, OK-looking game that doesn't require me to get a second SSD just to fit it all than face significant stutters every time I do a 180° turn because the engine is struggling to load gigabytes of uncached textures which, again, barely look any better.
