This is very much untrue
Cortex A55 is 2 generations old and from 2017. A76 is 6 generations old and from 2018.
Hope someone goes over the LibreOffice UI to simplify its workflows and fix multi-monitor support by then. That YouTuber who designed MuseScore's new version comes to mind.
The latency numbers of displays, i.e. the 8-9 ms or 40 ms figures, include any framebuffer the display might or might not have. If the latency is less than the frame time, it's safe to assume the display isn't buffering whole frames before displaying them.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
And sometimes less, like when vsync is disabled.
That's not to say the game is rendered from top left to bottom right as it is displayed, but since the render time has to fit within the frame time, one can be certain the render started one frame time before it finished, and it is displayed on the next vsync (if vsync is enabled). That's 22 ms for 45 fps, another 16 ms for a worst-case vsync miss, and 10 ms for the display latency, which makes 48 ms. Majora's Mask at 20 fps would be 50 ms of render + 8 ms of display = 58 ms of latency, assuming it too doesn't miss vsync.
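A minimal sketch of that worst-case arithmetic, assuming (as above) that the render starts one frame time before it finishes, that a missed vsync costs one 60 Hz refresh, and that the display adds a fixed delay; the function name and parameters are just for illustration:

```python
# Worst-case input-to-photon estimate from the comment above.
# Assumptions: render time = one full game frame, a vsync miss adds
# one display refresh, the display adds a fixed processing/scanout delay.

def worst_case_latency_ms(game_fps, display_delay_ms, vsync_miss=True, refresh_hz=60):
    frame_time = 1000.0 / game_fps                      # render must fit in this window
    vsync_penalty = (1000.0 / refresh_hz) if vsync_miss else 0.0
    return frame_time + vsync_penalty + display_delay_ms

print(worst_case_latency_ms(45, 10))                    # ~48 ms (22 + 16 + 10)
print(worst_case_latency_ms(20, 8, vsync_miss=False))   # ~58 ms (50 + 8)
```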
Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
That's why I specified 60hz :)
I see that you meant TVs specifically, but I think it's misleading to call processing delays 'inherent', especially since the LG TV you mentioned (which I assume runs at 60hz) is close to the minimum possible latency of 8.3 ms (half of a 16.7 ms refresh, measured mid-screen).
First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell that's a selling point that even some non-OLED displays emulate with backlight strobing, not something displays try to get rid of.
Also, the inherent LCD latency thing is a myth: modern gaming monitors have little to no added latency even at 60hz, and at high refresh rates they are faster than 60hz CRTs.
Edit: to be clear, this is the screen's refresh rate; the game doesn't need to run at a high frame rate to benefit.
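Rough numbers behind that claim, assuming zero processing delay so the only latency left is scanout (the average of which is half the refresh period); the snippet is just illustrative arithmetic:

```python
# Average scanout latency = half the refresh period, assuming no processing delay.
# A high-refresh panel beats a 60 Hz CRT on this metric even if the game runs at 60 fps.
for hz in (60, 120, 240):
    avg_ms = 1000.0 / hz / 2
    print(f"{hz} Hz: ~{avg_ms:.1f} ms average scanout latency")
# 60 Hz: ~8.3 ms, 120 Hz: ~4.2 ms, 240 Hz: ~2.1 ms
```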
Gooner bait trailer makes me hope this thing flops.
The CUBE, a new multiplayer RPG shooter
... and I'm completely disinterested.
I've been playing it for the last week.
Surely Digital Foundry will find improvements, but to me this doesn't look that different from The Witcher 3. Better animations, and the hair looks better, I think? Never noticed LOD transitions myself. The promise of increased interactivity means nothing in a trailer either. I only hope it has better multi-core performance, especially with ray tracing.
I don't think so. Most of the generated heat is dissipated almost immediately. The processor warms up until it either reaches equilibrium at the current load or hits its max temp. The temps could increase slightly over time as the air / liquid (and the non-heat-generating parts of the computer) warm up, but that won't dissipate in a minute.
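A toy first-order ("lumped") thermal model illustrating the point; all constants here are made up for illustration, not measured from any real chip:

```python
# Die temperature rises toward a steady state set by power and cooling,
# and gets most of the way there in seconds, not minutes.

def simulate(power_w=100.0, ambient_c=25.0, r_c_per_w=0.5, tau_s=10.0,
             duration_s=60.0, dt=0.1):
    steady = ambient_c + power_w * r_c_per_w   # equilibrium temperature for this load
    temp = ambient_c
    t = 0.0
    while t < duration_s:
        temp += (steady - temp) * dt / tau_s   # Newton's law of cooling, Euler step
        t += dt
    return temp

print(simulate(duration_s=30))    # already close to the ~75 C steady state
print(simulate(duration_s=300))   # barely different after 5 minutes
```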
Fair point about the years; let's call them 2019 models instead.
I do doubt the RK3588's performance, because my phone has A76 + A55 cores and it has nowhere near the desktop-level performance the person above claimed. And for people with more recent phones it would be a downgrade.
The SoC being merely 3 years old despite having older components doesn't mean much; you could make the same argument about the phone being brand new regardless of what CPU it has, but that doesn't change the fact that it doesn't perform like a 2025 device. This is like AMD repackaging Zen 2 and Zen 3 processors as if they were new by declaring the most significant digit 'year' and the 3rd most significant digit 'generation'.
And I really doubt Rockchip (or any other manufacturer) chose the A76 because it was better than its successors in some way. As I see it, the RK3588 uses these cores because they're the best available on the 8nm process node that Rockchip could afford. Which I suppose isn't unreasonable for a low-cost chip in 2022, but this is a ~~2025~~2026 smartphone that's reducing costs by using a 2022 chip that itself reduced costs by using 2019 tech.