The stability of that fps is even more important, and modern games have problems with that.
Because they all fuck with the frame timing in order to try to make the fps higher (on paper)
It's a few things, but a big one is the framerate jumping around (inconsistent frame time). A consistent 30 fps feels better than 30, 50, 30, 60, 45, etc. Many games have a frame cap feature, which helps here. Cap the game at whatever frame rate you can consistently hit and your monitor can display. If you have a 60 Hz monitor, start with the cap at 60.
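To make that concrete, a frame cap is basically the game sleeping off whatever is left of a fixed time budget each frame. A minimal sketch of that loop (hypothetical helper names, not any particular engine's limiter):

```
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms budget per frame

def run_capped(update, render):
    """Run one update/render per frame, then sleep off the rest of the budget."""
    while True:
        start = time.perf_counter()
        update()
        render()
        elapsed = time.perf_counter() - start
        # Sleeping off the remainder keeps every frame the same length, which is
        # why a capped 60 feels smoother than an uncapped 30-90.
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)
```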
Also, many games add motion blur, AI generated frames, TAA, and other things that really just fuck up everything. You can normally turn those off, but you have to know to go do it.
If you are on console, good fucking luck. Developers rarely include such options on console releases.
30, 50, 30, 60, 30... that's FPS. Frame time means the time between each individual frame within that second.
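They're just reciprocals of each other, for anyone keeping score:

```
def frame_time_ms(fps):
    """Frame time is simply the reciprocal of the frame rate."""
    return 1000.0 / fps

print(frame_time_ms(30))  # ~33.3 ms between frames
print(frame_time_ms(60))  # ~16.7 ms between frames
```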
Stuttering, but mostly it's the FPS changing.
Lock the FPS to below the lowest point where it lags, and suddenly it won't feel as bad, since it's consistent.
EDIT: I completely skipped over that you used Fallout 4 as an example. That engine tied game speed and physics to fps last time I played, so unless you mod the game, things will literally move "slower" as the fps drops.
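A rough sketch of why that happens (illustrative code, not actual Creation Engine logic): if the simulation advances a fixed amount per rendered frame instead of by real elapsed time, the whole world slows down whenever the frame rate does.

```
PHYSICS_STEP = 1.0 / 60.0  # the engine assumes 60 rendered frames per second

def advance_frame_tied(position, velocity):
    # Moves a fixed amount per *frame*: at 30 fps only half as many of these
    # steps happen per real second, so the world runs at half speed.
    return position + velocity * PHYSICS_STEP

def advance_delta_time(position, velocity, dt):
    # Moves by real elapsed time instead: speed stays correct at any frame rate.
    return position + velocity * dt
```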
Couple of things. First, frame timing is critical, and modern games aren’t programmed as close to the hardware as older games were.
Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).
Lastly, with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It’s why TVs have all that post-processing and why there are no “dumb” TVs anymore. Removing the post-processing improves input delay, but also removes everything that makes the image smoother, so higher frame rates are your only option there.
First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell, that's a selling point that even some non-OLED displays emulate with backlight strobing, not something displays try to get rid of.
Also the inherent LCD latency thing is a myth, modern gaming monitors have little to no added latency even at 60hz, and at high refresh rates they are faster than 60hz crts
Edit: to be clear, this is the screen's refresh rate, the game doesn't need to run at hfr to benefit.
I don’t understand all the technicals myself, but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions can be essentially instant, but due to the lack of any ghosting whatsoever, it can make low-frame-rate motion look very stilted.
Also the inherent LCD latency thing is a myth, modern gaming monitors have little to no added latency even at 60hz, and at high refresh rates they are faster than 60hz crts
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
The LCD latency has to do with input polling and timing based on display latency and polling rates. Also, there’s added latency from things like wireless controllers as well.
The actual frame rate of the game isn’t necessarily relevant: if you have a game running at 60 fps on a 120 Hz display and enable black frame insertion, you will have reduced input latency at 60 fps due to doubling the refresh rate of the display, which increases the polling rate since it’s tied to frame timing. Black frame insertion or frame doubling doubles the refresh, cutting input delay roughly in half (not quite that, because of overhead, but hopefully you get the idea).
This is why, for example, the Steam Deck OLED has lower input latency than the original Steam Deck. It can run at up to 90 Hz instead of 60, and even at lowered Hz it has reduced input latency.
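Rough arithmetic behind that reasoning, assuming (as the comment above does) that the relevant waiting time scales with the display's refresh interval:

```
def refresh_interval_ms(hz):
    return 1000.0 / hz

print(refresh_interval_ms(60))   # ~16.7 ms between refreshes at 60 Hz
print(refresh_interval_ms(120))  # ~8.3 ms at 120 Hz -- half the wait for the next
                                 # scan-out, even if the game still only produces
                                 # 60 new frames per second
```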
Also, regarding LCD, I was referring more to TVs, since we’re talking about old games (I assumed consoles). Modern TVs have a lot of post-processing compared to monitors, and in a lot of cases there’s going to be some delay because it’s not always possible to turn it all off. The lowest-latency TVs I know of are LG’s, as low as 8 or 9 ms, while Sony tends to be awful, between 20 and 40 ms even in “game mode” with processing disabled.
Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
That's why I specified 60hz :)
I see that you meant TVs specifically, but I think it's misleading to call processing delays 'inherent', especially since the LG TV you mentioned (which I assume runs at 60 Hz) is close to the minimum possible latency of 8.3 ms.
True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.
Bear in mind I can’t pinpoint the specific issue for any given game but there are many.
Modern displays, even the fastest ones, have frame buffers for displaying color channels. That’s one link in the latency chain. Even if the output were otherwise just as fast as a CRT, this would cause more latency in 100% of cases, since CRT was an analogue technology with no buffers.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
I mentioned TVs above re: post processing.
Sometimes delays are added to synchronize data between the CPU and GPU in modern games.
Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”
Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.
Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.
The latency numbers for displays, i.e. the 8-9 or 40 ms figures, include any frame buffer the display may or may not have. If the number is less than the frame time, it's safe to assume the display isn't buffering whole frames before showing them.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
And sometimes less, like when vsync is disabled.
That's not to say the game is rendered from top left to bottom right as it is displayed, but since the render time has to fit within the frame time, you can be certain the render started one frame time before it finished, and it is displayed on the next vsync (if vsync is enabled). That's 22 ms for 45 fps, plus another 16 ms for a worst-case vsync miss and 10 ms of display latency, which makes 48 ms. Majora's Mask at 20 fps would be 50 ms render + 8 ms display = 58 ms of latency, assuming it doesn't miss vsync either.
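The same back-of-envelope math as a sketch; the component values are the rough estimates above, not measurements:

```
def pipeline_latency_ms(render_fps, vsync_miss_ms=0.0, display_ms=0.0):
    """One frame of render time, plus any missed-vsync wait, plus display delay."""
    return 1000.0 / render_fps + vsync_miss_ms + display_ms

# 45 fps with a worst-case miss on a 60 Hz vsync and ~10 ms of display processing
print(pipeline_latency_ms(45, vsync_miss_ms=1000 / 60, display_ms=10))  # ~49 ms

# Majora's Mask at 20 fps on a CRT with ~8 ms to scan out to mid-screen
print(pipeline_latency_ms(20, display_ms=8))                            # 58 ms
```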
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
Essentially, the speed of the beam determined how many lines you could display, and the more lines you tried to display, the slower the screen was able to refresh. So higher resolutions would have lower max refresh rates. Sure, a monitor could do 120 Hz at 800x600, but at 1600x1200, you could probably only do 60 Hz.
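The trade-off is just a fixed horizontal scan budget divided across however many lines you ask for. A rough sketch, where the 76 kHz scan rate and ~5% blanking overhead are illustrative assumptions rather than any particular monitor's specs:

```
HSCAN_KHZ = 76.0  # assumed maximum horizontal scan rate of a hypothetical CRT

def max_refresh_hz(active_lines, blanking_overhead=1.05):
    """Max vertical refresh = lines the beam can draw per second / lines per frame."""
    return HSCAN_KHZ * 1000.0 / (active_lines * blanking_overhead)

print(max_refresh_hz(600))   # ~120 Hz at 800x600
print(max_refresh_hz(1200))  # ~60 Hz at 1600x1200 -- same beam, twice the lines
```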
Old games' animations were sometimes made frame by frame, like the guy who drew the character pixel by pixel going "and in the next frame of this attack the sword will be here."
My favorite game of all time is Descent, PC version to be specific, I didn't have a PlayStation when I first played it.
The first time I tried it, I had a 386sx 20MHz, and Descent, with the graphics configured at absolute lowest size and quality, would run at a whopping 3 frames per second!
I knew it was basically unplayable on my home PC, but did that stop me? Fuck no, I took the 3 floppy disks installer to school and installed it on their 486dx 66MHz computers!
I knew it would just be a matter of time before I got a chance to upgrade my own computer at home.
I still enjoy playing the game even to this day, and have even successfully cross compiled the source code to run natively on Linux.
But yeah I feel you on a variety of levels regarding the framerate thing. Descent at 3 frames per second is absolutely unplayable, but 20 frames per second is acceptable. But in the world of Descent, especially with modern upgraded ports, the more frames the better 👍
Great games. FreeSpace was almost mind-blowing when I first played it as well.
I haven't actually played FreeSpace before, but I did manage to get a copy and archive it a few years ago.
I also got a copy of Overload and briefly tried that, but on my current hardware it only runs at about 3 frames per second...
The Descent developers were really ahead of their time and pushing gaming to the extreme!
Definitely give it a shot. It's obviously different, but I loved it. My mom actually banned me from playing descent 3: vertigo, because she had vertigo and it made her sick
Vertigo was actually an expansion on Descent 2, I made the NoCD patch for it via a carefully hex edited mod based on another NoCD patch for the original Descent 2.
Any which way, yeah, anyone with vertigo wouldn't be comfortable or oriented in any way if they're watching or playing the game, no matter what version.
Shit you're right. It's been too long
Descent broke my brain. I'm pretty good at navigating in FPSes, but Descent's 4 axes of movement just didn't click for me. I kept getting lost. Recently I tried it again after many years, and I still just can't wrap my head around it.
Same with space sims. I'm dog awful in dog fights.
Indeed, it's not quite a game for everyone, especially if you're prone to motion sickness. Initially it only took me about a half hour to get a feel for the game, but configuring the controls can still be a headache.
Every time I set the game up on a new or different system, I usually go for loosely the same sort of controls, but with each new setup I might change them up a bit, like an endless game of guess-and-test to see what controls might be ideal, at least for me.
By the way, Descent is considered a 6 Degrees Of Freedom game, not 4. But hey, at least they have a map feature, I'd go insane without the map sometimes..
I meant 6, not sure why I typed 4.
Descent is pretty fun. Not as big of a fan as you are, but I definitely dig it.
It depends on what you're running, but often if the frame rate is rock solid and consistent, it feels a lot less stuttery. Fallout games are not known for their stability or smooth functioning, unfortunately.
For comparison, Deltarune came out a few days ago, and that's locked to 30 fps. Sure, it's not a full 3D game or anything, but there's a lot of complex motion in the battles and it's not an issue at all. Compare that to something like Bloodborne or the recent Zeldas: even after getting used to the frame rate they feel awful, because they're stuttering all the damn time.
I think it has something to do with frame times.
I'm not an expert but I feel like I remember hearing that low framerate high frametime feels worse than low framerate low frametime. Something to do with the delay it takes to actually display the low framerate?
As some of the other comments mentioned, it's probably also the framerate dropping in general too.
Two things that haven't quite been mentioned yet:
1) Real life has effectively infinite FPS, so you might expect that the closer a game is to reality, the higher your brain wants the FPS to be in order for it to make sense. This might not be true for everyone, but I imagine it could be for some people.
More likely: 2) If you've played other things at high FPS you might be used to it on a computer screen, so when something is below that performance, it just doesn't look right.
These might not be entirely accurate on their own; some mix of them and the other things mentioned elsewhere might be at play.
Source: kind of an inversion of the above. I can't focus properly if games are set higher than 30 FPS; it feels like my eyes are being torn in different directions. But then, the games I play are old or deliberately blocky, so they're not particularly "real" looking, and I don't have much trouble with real life's "infinite" FPS.
Part of it is about how close you are to the target FPS. The old N64 games were made to run at around 20 FPS, a clean division of the 60 Hz refresh of the NTSC CRT TVs common at the time. Therefore, the animations of, well, basically everything that moves in the game could be tuned to that frame rate. It would probably look like jank crap if they made the animations have 120 frames for 1 second of animation, but they didn't.
On to Fallout 4... Oh boy. Bethesda jank. Creation Engine game speed is tied to frame rate. They had several problems with the launch of Fallout 76 because if you had a really powerful computer and unlocked your frame rate, you would be moving 2-3 times faster than you should have been. It's a funny little thing to do in a single-player game, but absolutely devastating in a multiplayer game. So, if your machine is chugging a bit and the frame rate slows down, it isn't just your visual rate of new images appearing that slows down, it's the speed at which the entire game does stuff. It feels bad.
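A hedged sketch of the arithmetic, with illustrative numbers rather than actual Creation Engine values: if movement is applied per rendered frame, speed scales directly with frame rate.

```
STEP_PER_FRAME = 0.1  # metres moved per rendered frame, tuned for a 60 fps target

def speed_m_per_s(fps):
    # Per-frame movement means real-world speed scales with the frame rate.
    return STEP_PER_FRAME * fps

print(speed_m_per_s(60))   # 6.0 m/s -- the intended speed
print(speed_m_per_s(144))  # 14.4 m/s -- 2.4x faster with an unlocked frame rate
print(speed_m_per_s(30))   # 3.0 m/s -- and half speed when the game chugs
```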
And also, as others have said, frame time, dropped frames, and how stable the frame rate is make a huge difference in how it "feels" too.
Game design is a big part of this too. Particularly first person or other fine camera control feels very bad when mouse movement is lagging.
I agree with what the other commenters are saying too: if it feels awful at 45 fps, your 0.1% low frame rate is probably like 10 fps.
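For reference, a rough sketch of how a "0.1% low" figure gets computed from frame times; the capture data here is made up, and real tools differ a bit in the exact definition:

```
# Hypothetical capture: mostly ~22 ms frames (45 fps) with a few 100 ms hitches.
frame_times_ms = [22.2] * 997 + [100.0] * 3

def percentile_low_fps(frame_times, fraction=0.001):
    """Average the slowest `fraction` of frames and report it as an fps figure."""
    worst = sorted(frame_times, reverse=True)
    count = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:count]) / count)

print(round(1000.0 / (sum(frame_times_ms) / len(frame_times_ms))))  # 45 "average" fps
print(round(percentile_low_fps(frame_times_ms)))                    # 10 fps 0.1% low
```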
The display being at a higher resolution doesn't help either. Running retro games on my fancy flatscreen hi-def massive TV makes them look and feel so much worse than on the smaller, fuzzy CRT screens of the time.
I can't stand modern games with lower frame rates. I had to give up on Avowed and a few other recent titles on the Series S because it makes me feel sick when turning the camera. I assume most of the later titles on Xbox will be doing this, as they're starting to push what the systems are capable of and the Series S can't really cope as well.
Are you using an OLED screen?
I had to tinker with mine a fair bit before my PS1 looked good on it.
Some old games are still pretty rough at their original frame rate. I recently played four-player GoldenEye on an N64, and that frame rate was pretty tough to deal with. I had to retrain my brain to process it.
Out of curiosity, did you have an actual N64 hooked up to a modern TV? A lot of those old games meant to be played on a CRT will look like absolute dog shit on a modern LCD panel. Text is harder to read, it is harder to tell what a shape is supposed to be, it's really bad.
Trinitron, baby.
Probably consistency.
Zelda was 20 fps, but it was 20 fps all the time so your brain adjusted to it. You could get lost in the world and story and forget you were playing a game.
Modern games fluctuate so much that it pulls you right out.
Well a good chunk of it is that older games only had ONE way they were played. ONE framerate, ONE resolution. They optimized for that.
Nowadays they plan for 60, 120, and if you have less too bad. Upgrade for better results.
FPS counters in games usually display an average across multiple frames. That makes the number actually legible if the FPS fluctuates, but if it fluctuates really hard on a frame-by-frame, it might seem inaccurate. If I have a few frames here that were outputted at 20 FPS, and a few there that were at 70 FPS, the average of those would be 45 FPS. However, you could still very much tell that the framerate was either very low or very high, which would be perceived as stutter. Your aforementioned old games probably were frame-capped to 20, while still having lots of processing headroom to spare for more intensive scenes.
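A quick sketch of why the averaged number hides the problem, averaging instantaneous rates the way described above (hypothetical values, not a real capture):

```
# Hypothetical instantaneous rates: some frames rendered at a 20 fps pace,
# others at a 70 fps pace.
instant_fps = [20, 70] * 10

print(sum(instant_fps) / len(instant_fps))  # 45.0 -- the smoothed number the counter shows
print(min(instant_fps))                     # 20 -- the dips you actually feel as stutter
```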
Bro when Majora's Mask came out nothing was 60fps lol. We weren't used to it like we are today. I'm used to 80fps, so 60 to me feels like trash sometimes.
Ackshuli -- By late 2000 there were a couple games on PC that could get there.
.... If you were playing on high-end hardware. Which most PC gamers were not. (despite what Reddit PCMR weirdos will tell you, PC gaming has always been the home for janky hand-built shitboxes that are pushed to their crying limits trying to run games they were never meant to)
Regardless that's beside the point -- The original MM still doesn't feel bad to go back to (it's an annual tradition for me, and I alternate which port I play) even though it never changed from its 20FPSy roots.
Yeah but even now you can go back and play Majora's mask, and it not feel bad.
But as mentioned, the real thing is consistency, as well as the scale of action, the pace of the game, etc. Zelda games weren't sharp, pinpoint-control games like, say, a modern FPS; gameplay was fairly slow. And yeah, the second factor is simply that games that were 20 FPS were made to be a 100% consistent 20 FPS. A game locked in at 20 will feel way smoother than one that alternates between 60 and 45.
Games don't get optimized any more. That has to be compensated for with computing power, i.e. by the end user, and the reason is cost. Apart from that, the scope of games has become much larger, making optimization more time-consuming and therefore more expensive. With consoles, there's also the fact that optimizations have to be made for a specific hardware configuration, unlike PCs, where the range of available components keeps growing. Either way, the aim is to cut costs while maximizing profits.
Bro when Majora's Mask came out nothing was 60fps lol
Huh? 60fps was the standard, at least in Japan and North America, because TVs were at 60Hz/fps.
Actually, 60.0988fps according to speed runners.
The TV might refresh the screen 60 times per second (or actually refresh half the screen 60 times per second, or actually 50 times per second in Europe), but that’s irrelevant if the game only throws 20 new frames per second at the TV. The effective refresh rate will still be 20Hz.
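In other words, using the numbers above:

```
tv_refresh_hz = 60
game_fps = 20

# Each new game frame just gets scanned out again until the next one arrives.
print(tv_refresh_hz // game_fps)  # 3 -- every game frame is shown for three TV refreshes
```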
That's just a possible explanation. I don't know what frame rate Majora's Mask actually ran at.
I'm pretty sure the 16-bit era was generally 60 FPS.
Framerates weren't really a Thing before consoles had frame-buffers -- because framebuffers are what allow the machine to build a frame of animation over several VBlank intervals before presenting it to the viewer.
The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.
Before that, you were in beam-racing town.
If your processing wasn't enough to keep up with the TV's refresh rate (60i/30p in NTSC territories, 50i/25p in PAL) -- Things didn't get stuttery or drop frames like modern games. They'd either literally run in slow-motion, or not display stuff (often both, as anyone who's ever played a Shmup on NES can tell you)
You had the brief window of the HBlank and VBlank intervals of the television to calc stuff and get the next frame ready.
Buuuut, as of the PSX/N64/Saturn, most games were running anywhere between 15 and 60 FPS, with most sitting in the 20s.
PC is a whole different beast, as usual.
i think you're mixing up a few different things here. beam-racing was really only a thing with the 2600 and stopped once consoles had VRAM, which is essentially a frame buffer. but even then, many games would build the frame in a buffer in regular RAM and then copy everything into VRAM at the vblank. in other cases you had two frames in VRAM and would just swap between them with a pointer every other frame.
if it took longer than one frame to build the image, you could write your interrupt handler to just skip every other vblank interrupt, or three out of every four, which is how a game like Super Hang-On on the Mega Drive runs at 15 FPS even though the VDP is chucking out 60 frames a second. you could also disable interrupts while the buffer was still being filled, which is how you end up with slowdown in certain games when too many objects were on screen. too many objects could also push past the limit on how many sprites you can have on a scanline, which is why things would vanish -- but that is its own separate issue. if you don't touch VRAM between interrupts, then the image shown last frame will show this frame as well.
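A loose sketch of that vblank-skipping idea in Python pseudocode (on real hardware this would live in the console's vblank interrupt handler; the names here are made up):

```
# Illustrative only: presents a finished frame on every Nth vertical blank,
# the way the comment above describes 15 FPS on 60 Hz hardware.
VBLANKS_PER_FRAME = 4   # 60 Hz display / 4 = 15 game frames per second
vblank_count = 0

def on_vblank(back_buffer_ready, swap_buffers):
    """Called once per vertical blank (60 times a second on NTSC hardware)."""
    global vblank_count
    vblank_count += 1
    if vblank_count % VBLANKS_PER_FRAME == 0 and back_buffer_ready():
        # Copy (or point) the finished frame into VRAM; on the other vblanks the
        # display just keeps scanning out whatever was shown last frame.
        swap_buffers()
```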
F-Zero X ran at 60 fps. Also Yoshi’s Story, Mischief Makers, and probably a few others.
Also the PS1 has many games that ran at 60 fps, too many to list here in a comment.
I think player expectations play a big role here. It's because you grew up with 20 fps on Ocarina of Time that you accept how it looks.
I'm pretty sure that game is not a locked 20 FPS and can jump around a bit between 15 and 20, so the argument that it's locked at 20 and therefore feels smooth doesn't really convince me.
If that game came out today as an indie game it would be getting trashed for its performance.