this post was submitted on 16 Nov 2025
121 points (96.9% liked)

Selfhosted


How's your stuff doing? Any unplanned interruptions, or are you achieving uptime records?

I'm currently sailing rather smoothly. Most of my stuff has been migrated to Komodo; a few exceptions will stay where they are, and I think Lemmy itself is the only thing left to migrate. Of course, that's when I found a potential replacement, but I'll let it sit for a while before touching it again. I'm enjoying the occasional merge request notification from the Renovate bot and knowing my stuff is mostly up to date.

I'm thinking about setting up some kind of wiki for the lore of my other niche hobby (Netrunner LCG), since the existing Fandom one is something most people avoid touching or updating. But I likely won't have time to write some articles myself as a kickoff, so I'm hesitant. I'm also not sure which wiki software I'd choose.

[–] Witziger_Waschbaer@feddit.org 9 points 2 days ago (1 children)

I recently switched my phone from Android to GrapheneOS and now rely even more on my self-hosted services. Immich is such a great project. Still gotta figure out my music collection though, after switching from YT Music to Jellyfin. Most of it is sorted by date of purchase, because that worked best with my DJ workflow. Now I gotta bring it over to a folder structure that works for Jellyfin. It seems like the answer is MusicBrainz Picard, but I still gotta figure out how to configure it.
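I haven't figured out Picard's naming script yet, but the layout Jellyfin's music scanner wants is basically Artist/Album/Track. As a rough sketch of the idea (using the mutagen library; the source and destination paths here are made up), something like this would shuffle files into that structure based on their existing tags:

```python
# Rough sketch: sort loose audio files into an Artist/Album layout that
# Jellyfin's music scanner can parse. Paths are placeholders; Picard's own
# file-naming script can do the same job once everything is tagged properly.
import shutil
from pathlib import Path

from mutagen import File  # pip install mutagen

SRC = Path("/music/unsorted")   # hypothetical source folder
DST = Path("/music/jellyfin")   # hypothetical Jellyfin music library

def clean(name: str) -> str:
    # keep tag values usable as directory names
    return name.replace("/", "_").strip() or "Unknown"

for path in SRC.rglob("*"):
    if not path.is_file() or path.suffix.lower() not in {".mp3", ".flac", ".ogg", ".m4a"}:
        continue
    audio = File(path, easy=True)
    if audio is None or not audio.tags:
        continue
    artist = clean((audio.tags.get("albumartist") or audio.tags.get("artist") or ["Unknown Artist"])[0])
    album = clean((audio.tags.get("album") or ["Unknown Album"])[0])
    target = DST / artist / album
    target.mkdir(parents=True, exist_ok=True)
    shutil.move(str(path), str(target / path.name))
```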

Also been thinking about some AI ideas I'd like to try, but I have zero intention of getting involved with OpenAI, Meta, Google or whoever the fuck. So self-hosting it is. But on what hardware? Option 1 seems to be getting a professional server board, CPU and RAM, starting with one RTX 3090, and going from there with the option to hook up more GPUs. But a setup like that sounds like it would cost some serious money in electricity. Option 2 seems to be a Ryzen AI Max+ 395, configured with a fuckton of RAM that's available to the whole APU and as such usable for memory-hungry models. This seems to be much, much more power efficient, but it's all integrated and I couldn't swap out components or upgrade in the future. Leaning toward option 2 atm, but maybe I'll just wait a bit longer and see what else comes up in the coming months.
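For sizing, my rough rule of thumb is that quantized weights need about parameter count × (bits per weight ÷ 8) bytes, plus some overhead for the KV cache and runtime. A quick back-of-envelope sketch (figures approximate; the 20% overhead factor is a guess):

```python
# Very rough model-memory estimate: weights dominate, so
# bytes ≈ params * (bits_per_weight / 8), padded by ~20% for KV cache etc.
def approx_gib(params_billion: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    return params_billion * 1e9 * (bits_per_weight / 8) * overhead / 2**30

print(f"Mixtral 8x7B (~47B params) @ 4-bit:  ~{approx_gib(47, 4):.0f} GiB")   # ~26 GiB
print(f"Mixtral 8x7B (~47B params) @ 16-bit: ~{approx_gib(47, 16):.0f} GiB")  # ~105 GiB
print(f"A 12B model @ 4-bit:                 ~{approx_gib(12, 4):.0f} GiB")   # ~7 GiB
```

Which is roughly why a single 24 GB RTX 3090 gets tight for the bigger models, while a large unified-memory pool on the APU has more headroom (though typically less memory bandwidth).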

[–] Lyra_Lycan@lemmy.blahaj.zone 2 points 2 days ago* (last edited 2 days ago) (1 children)

Nice. I use ytdl-sub for downloading music, highly recommend it. You can write tag metadata with it, but if you want embedded stuff I'd recommend trying beets. Running both as a user whose primary group matches Jellyfin's is a must if you want stuff saved next to the video files. The dev is also very active.
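On the permissions point, a quick way to sanity-check it is to compare the group owning the media directory with the primary group of the user running ytdl-sub/beets (the path below is just an example):

```python
# Check whether the current user's primary group matches the group that
# owns the Jellyfin media directory; a mismatch usually means Jellyfin
# can't read what the downloaders write.
import grp
import os
from pathlib import Path

media = Path("/srv/jellyfin/media")  # hypothetical media root
dir_group = grp.getgrgid(media.stat().st_gid).gr_name
my_group = grp.getgrgid(os.getgid()).gr_name
print(f"directory group: {dir_group}, my primary group: {my_group}")
if dir_group != my_group:
    print("warning: new files here may not end up accessible to Jellyfin")
```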

I just installed Ollama and use gemma3 for now. I wanted to use dolphin-mixtral, but holy crap, it wants more RAM than my entire setup has.
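For the AI ideas: once Ollama is up, it exposes a local HTTP API on port 11434, so wiring it into your own scripts is easy. Minimal sketch, assuming the gemma3 model has already been pulled:

```python
# Minimal, non-streaming request to a local Ollama instance
# (default endpoint http://localhost:11434/api/generate).
import json
import urllib.request

payload = {
    "model": "gemma3",
    "prompt": "Give me three tips for organising a self-hosted music library.",
    "stream": False,  # return a single JSON object instead of a token stream
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```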

[–] irmadlad@lemmy.world 1 points 2 days ago

> I wanted to use dolphin-mixtral, but holy crap, it wants more RAM than my entire setup has.

This is basically what I've found with self-hosted AI. I just don't have the equipment for it. Would love to be able to host a self-contained LLM, but alas, as you say, it eats up resources. FEED ME MAURICE!