[–] NotSteve_@piefed.ca 35 points 1 day ago (3 children)

What are you running that needs more than 32 GB? I'm only just barely being bottlenecked by my 24 GB when running games at 4K.

[–] comrade_twisty@feddit.org 29 points 1 day ago

Chrome probably

[–] jeena@piefed.jeena.net 7 points 1 day ago

Two browsers full of tabs, but that's not a problem. Once I start compiling AOSP (which I sometimes want to do for work at home instead of in the cloud, because it's easier and faster to debug), it eats up all the RAM immediately and I have to give it 40 more GB of swap, and then the swapping is the bottleneck. Once that's running the computer can't really do anything else; even the browser struggles.
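
A common mitigation for RAM-hungry builds like AOSP is to cap the number of parallel jobs so the build fits in physical RAM instead of spilling into swap. The thread doesn't show one, but here's a minimal sketch; the ~4 GB-per-job figure is an assumption, not a measured number:

```python
# Rough sketch (not from the thread): pick a parallelism level for a big
# build like AOSP so it stays inside physical RAM instead of thrashing swap.
# The ~4 GB-per-job figure is an assumption, not a measured AOSP number.
import os

def safe_jobs(gb_per_job: float = 4.0) -> int:
    page = os.sysconf("SC_PAGE_SIZE")        # bytes per page
    pages = os.sysconf("SC_PHYS_PAGES")      # total pages of physical RAM
    total_gb = page * pages / 1024**3
    cpus = os.cpu_count() or 1
    # Take whichever is tighter: core count, or how many jobs fit in RAM.
    return max(1, min(cpus, int(total_gb // gb_per_job)))

if __name__ == "__main__":
    print(f"suggested AOSP invocation: m -j{safe_jobs()}")
```

Capping `-j` trades some build time for staying out of swap, which is usually the better deal given how badly swap thrashing stalls everything else on the machine.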

[–] hoshikarakitaridia@lemmy.world 6 points 1 day ago (2 children)

AI or servers, probably. I have 40 GB, and that's what I would need more RAM for.

I'm still salty because I had the idea of going CPU + RAM sticks for AI inference literally days before the big AI companies did. And my stupid ass didn't buy them in time before the prices skyrocketed. Fuck me, I guess.

[–] NotMyOldRedditName@lemmy.world 6 points 1 day ago* (last edited 1 day ago) (2 children)

It does work, but it's not really fast. I upgraded from 32 GB to 96 GB of DDR4 a year or so ago, and being able to play with the bigger models was fun, but it was so slow I couldn't do anything productive with it.

[–] possiblylinux127@lemmy.zip 5 points 1 day ago (1 children)

You're bottlenecked by memory bandwidth.

You need DDR5 with lots of memory channels for it to be useful.

I always thought DDR5 at average speeds, with something like 64 GB sticks on consumer boards, is passable. Not great, but passable.
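
For a rough sense of what "passable" means here: on CPU, each generated token streams essentially all of a dense model's weights from RAM, so memory bandwidth puts a hard ceiling on tokens per second. A back-of-envelope sketch with illustrative (assumed) figures:

```python
# Back-of-envelope: CPU inference on a dense model streams (roughly) every
# weight from RAM once per generated token, so
#   tokens/sec  <=  memory bandwidth / model size.
# All figures below are illustrative assumptions, not benchmarks.

def token_ceiling(bandwidth_gb_s: float, model_gb: float) -> float:
    return bandwidth_gb_s / model_gb

MODEL_GB = 40.0  # e.g. a ~70B-parameter model quantized to ~4 bits
for name, bw in [
    ("DDR4 dual-channel   (~50 GB/s)", 50.0),
    ("DDR5 dual-channel   (~90 GB/s)", 90.0),
    ("server, 8+ channels (~300 GB/s)", 300.0),
]:
    print(f"{name}: <= {token_ceiling(bw, MODEL_GB):.1f} tokens/s")
```

That's why a 96 GB DDR4 box can hold a big model yet still crawl: capacity gets the model in the door, but bandwidth sets the speed.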

[–] tal@lemmy.today 3 points 1 day ago* (last edited 1 day ago) (1 children)

You can have applications where wall-clock time is not all that critical but large model size is valuable, or where a model is very sparse and so does little computation relative to its size. But for the major applications, like today's generative AI chatbots, I think that's correct.
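
tal's sparsity point changes the arithmetic above: for a mixture-of-experts model, only the active experts' weights are read per token, so the ceiling scales with active bytes rather than total model size. A small extension of the earlier sketch, again with assumed figures:

```python
# Extending the dense-model ceiling above to the sparse case: a mixture-of-
# experts model only streams the active experts' weights per token, so the
# bandwidth ceiling is set by active bytes, not total size.
# Figures are again assumptions for illustration.
total_gb = 60.0    # whole MoE model resident in RAM
active_gb = 6.0    # weights actually streamed per token (~10% active)
bandwidth = 90.0   # GB/s, roughly dual-channel DDR5

print(f"if it were dense: <= {bandwidth / total_gb:.1f} tokens/s")
print(f"as sparse MoE:    <= {bandwidth / active_gb:.1f} tokens/s")
```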

[–] NotMyOldRedditName@lemmy.world 3 points 1 day ago* (last edited 1 day ago)

Ya, that's fair. If I was doing something where I didn't care about time, it did work. And we weren't talking hours, but it could be many minutes.

[–] panda_abyss@lemmy.ca 2 points 1 day ago (1 children)

I'm often using 100 GB of RAM for AI.

Earlier this year I was going to buy a bunch of used servers with 1 TB of RAM, and I wish I had.

[–] hoshikarakitaridia@lemmy.world 1 points 1 day ago (1 children)

Damn.

Yeah, used RAM is probably where it's at. Maybe you can get it used later on from data centers...

[–] kossa@feddit.org 1 points 2 hours ago

Yep, used DDR3 or DDR4 ECC server RAM is basically thrown out. Unfortunately, most consumer mainboards don't support ECC.
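
For anyone wanting to verify whether ECC is actually active on a given board, Linux reports working ECC memory controllers through the EDAC subsystem; a quick sketch, assuming Linux with the appropriate edac driver loaded:

```python
# Quick Linux check for whether ECC is actually active: the kernel's EDAC
# subsystem registers a memory controller under /sys when it is. An empty
# result means ECC is unsupported, disabled in firmware, or simply has no
# edac driver for this board.
from pathlib import Path

mcs = sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*"))
if mcs:
    for mc in mcs:
        size_mb = (mc / "size_mb").read_text().strip()
        print(f"{mc.name}: ECC active, covering {size_mb} MB")
else:
    print("no EDAC memory controllers found")
```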