this post was submitted on 15 Jul 2025
Linux
[–] sga@lemmings.world 2 points 9 hours ago

He seems overly dramatic about it, and some of it is factually incorrect. For example, he says we functionally do the same things, but we do not; we now run much more complex software in browsers. The best way I can put it: browsers are practically virtual machines that run software like word processors and video meetings. I am not encouraging anything here, I myself am the kind of guy who refuses to use web applications, but I understand why people use them.

In the b-roll there was a clip with a few tabs open and only 2 gigs of memory usage. Not sure exactly how old that clip is, but things have changed, many for the better. For example, browsers usually cap memory usage at around half of total available RAM (be it 4, 8, or 16 gigs; there are now finer rules around the exact limit, but let's not go there), and if you have a larger amount of RAM available, your browser caches more to save future CPU/GPU cycles. We also have better sandboxing and better isolation of tabs, which duplicates assets but improves security. You can still do some manual tweaking (I do: for example, disable JS by default, or use a per-site instance instead of per-tab isolation, which is less secure but more efficient, plenty more); nobody stops you from doing that.
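For the curious: the kind of tweaking described above can be done in Firefox's about:config, or persisted in a user.js file in your profile directory. The pref names below are real Firefox prefs, but treat this as an illustrative sketch of the approach, not a recommendation:

```js
// user.js — sketch of the kinds of tweaks mentioned above (Firefox)
user_pref("javascript.enabled", false);         // disable JS globally; re-enable per site via an extension
user_pref("fission.autostart", true);           // site isolation (Fission): one process per site, not per tab
user_pref("browser.cache.memory.capacity", -1); // -1 = size the in-memory cache automatically from total RAM
```

The trade-off is exactly the one described: fewer processes and less asset duplication means less memory, but also a weaker isolation boundary between pages.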

I do get the hatred for the cloud shift. I dislike it too, and maintain offline stuff. But that has nothing to do with hardware performance decreasing; if anything, it would require less memory/storage, as most of the compute is done on the server. You would have lighter machines, with just display and inputs. And this was the exact model of computing 40 years ago: servers/mainframes and weaker terminals.

If you want to complain about stuff, you can complain about excess use of JS, or about writing desktop software in JS (which, while worse than using a compiled language, is not that bad), or about the number of things that now want your attention. You can rant about bad tech practices, but not by comparing the modern web plus video editing against older static sites.

[–] teppa@piefed.ca 1 points 1 hour ago* (last edited 1 hour ago)

I don't think websites are a good example. JavaScript barely existed back in the day; now you've got pages that look like animated books as you scroll. Videos also used to be 120p; now they are 4K.

But I'm at like 2-5% CPU usage with Firefox and many tabs open, KDE, a file manager, and a software center; most of the usage seems to come from the task manager itself. I think it's likely some high-level language like JavaScript slowing things down, which is done to sandbox things and sanitize third-party code that's run without vetting.
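If you want to sanity-check a figure like that without the task manager's own overhead skewing it, a minimal Linux-only sketch is to sample the aggregate counters in /proc/stat twice and compute the busy fraction yourself:

```python
import time

def read_cpu():
    """Return (idle ticks, total ticks) from the aggregate 'cpu' line of /proc/stat."""
    with open("/proc/stat") as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    idle = fields[3] + fields[4]  # idle + iowait
    return idle, sum(fields)

# Sample twice and compare the deltas; a longer sleep gives a steadier reading.
i1, t1 = read_cpu()
time.sleep(0.5)
i2, t2 = read_cpu()
busy_pct = 100 * (1 - (i2 - i1) / (t2 - t1))
print(f"overall CPU busy: {busy_pct:.1f}%")
```

Since the idle counters are a subset of the total, the result is always between 0 and 100; run it with the browser open and closed to see how much of the load it actually accounts for.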