If you have an Nvidia graphics card (1070 or above), then Open WebUI. You can self-host your own LLM. AMD is probably supported too, but I haven't checked.
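If anyone wants a starting point, here's a minimal sketch with Ollama as the backend (assumes Docker and the NVIDIA Container Toolkit are already installed; container names, ports and volume names are just placeholders):

```sh
# Sketch: Ollama (model runner) + Open WebUI (frontend) with an Nvidia GPU.
docker run -d --name ollama --gpus all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

docker run -d --name open-webui \
  --add-host host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  -p 3000:8080 \
  ghcr.io/open-webui/open-webui:main

# Pull an example model, then open http://localhost:3000
docker exec -it ollama ollama pull llama3
```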
Maybe not a service in the typical sense, but setting up your router+server to route your home network traffic through a VPN is a fun project.
My router (MikroTik) supports WireGuard, so I could use it with Mullvad for the whole house, but WireGuard is demanding and it's a slow router: it can NAT at ~1 Gbps, yet it can't do WireGuard at more than ~90 Mbps. So I set up WireGuard/Mullvad on a little SBC with a fast processor and have my router use that instead. Using policy-based routing and/or mangle rules, I can have different VLANs/subnets/individual hosts selectively routed through the VPN.
It's a fun exercise. Not sure I implemented it in a smart way, but it works :)
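For anyone curious about the policy-routing part, it looks roughly like this on RouterOS 7 (a sketch, not an exact config; the SBC address, subnet and table name are made up):

```
# RouterOS 7 sketch: push one subnet's traffic via the WireGuard SBC.
# 192.168.88.2 = SBC running WireGuard/Mullvad, 10.0.20.0/24 = subnet to tunnel.

# A separate routing table whose default route points at the SBC
/routing table add name=via-vpn fib
/ip route add dst-address=0.0.0.0/0 gateway=192.168.88.2 routing-table=via-vpn

# Send traffic from the chosen subnet through that table
/routing rule add src-address=10.0.20.0/24 action=lookup-only-in-table table=via-vpn
```

The mangle variant (mark-routing in /ip firewall mangle, then a route that matches the mark) gets you the same thing per-host or per-port.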
- For LLM hosting, ik_llama.cpp. You can run really gigantic models at acceptable speeds thanks to its hybrid CPU/GPU focus, at higher quality/speed than mainline llama.cpp, and it has several built-in UIs (rough launch sketch below).
- LanguageTool, for self-hosted grammar/spelling/style checking (example below).
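Launch sketch for the ik_llama.cpp suggestion, splitting a MoE model between GPU and system RAM; the model path and numbers are made up, and the flags should be double-checked against the fork's README (they largely mirror mainline llama.cpp):

```sh
# Sketch: ik_llama.cpp's llama-server with hybrid CPU/GPU placement.
# -ngl 99       offload as many layers as fit in VRAM
# -ot exps=CPU  keep the MoE expert tensors in system RAM (the hybrid trick)
# -c 16384      context window
./build/bin/llama-server \
  -m /models/big-moe-Q4_K_M.gguf \
  -ngl 99 \
  -ot exps=CPU \
  -c 16384 \
  --host 0.0.0.0 --port 8080
# The built-in web UI ends up at http://<host>:8080
```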
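And for LanguageTool, the bundled HTTP server is enough to point the browser extension or editor plugins at; a sketch (install dir and port are whatever you picked):

```sh
# Sketch: LanguageTool's HTTP server from the stand-alone zip.
cd /opt/LanguageTool-6.4   # hypothetical install dir
java -cp languagetool-server.jar org.languagetool.server.HTTPServer \
  --port 8081 --allow-origin "*"
# Then set http://localhost:8081 as the server URL in the extension/plugin settings.
```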
I started with Nextcloud, mainly so I could start synchronizing Joplin notes. Maybe I could hook it up to also sync Logseq?
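For reference, Joplin talks to Nextcloud over WebDAV; in Joplin's synchronisation settings it's roughly this (hostname and username are placeholders):

```
Synchronisation target : Nextcloud (or WebDAV)
URL                    : https://cloud.example.com/remote.php/dav/files/alice/Joplin
Username               : alice
Password               : an app password created in Nextcloud's security settings
```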
I chose this VTT because it's dead simple, and the description on Owlbear legacy did not sound encouraging.
Then, on my list I have:
- RSSHub. Being able to get all my updates in one place changed how I view the internet for the better (one-liner below).
- AdGuard Home, with a domain pointed at it and used as Private DNS on Android. No more ads anywhere! (Sketch below.)
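RSSHub is a quick one to try (official image, default port; some routes need extras like Redis or a headless browser, per the project docs):

```sh
# Sketch: RSSHub on its default port
docker run -d --name rsshub -p 1200:1200 diygod/rsshub
# Routes are then available at http://localhost:1200/<route>
```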
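And for the AdGuard Home + Android Private DNS combo: Private DNS is DNS-over-TLS, so the instance needs a real hostname with a valid certificate and port 853 reachable; a sketch with placeholder names and paths:

```sh
# Sketch: AdGuard Home with plain DNS (53), DNS-over-TLS (853) for Android Private DNS,
# and the first-run setup wizard on 3000. The hostname you point at it
# (e.g. dns.example.com, a placeholder) needs a valid TLS cert, configured
# under Settings > Encryption in the AdGuard Home UI.
docker run -d --name adguardhome \
  -v "$PWD/adguard/work:/opt/adguardhome/work" \
  -v "$PWD/adguard/conf:/opt/adguardhome/conf" \
  -p 53:53/udp -p 53:53/tcp \
  -p 853:853/tcp \
  -p 3000:3000/tcp \
  --restart unless-stopped \
  adguard/adguardhome
# On the phone: Settings > Network > Private DNS > enter the instance's hostname.
```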