Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues on the community? Report them using the report flag.
Questions? DM the mods!
Needing that much RAM is usually a red flag that the algorithm is not optimized.
Unfortunately, researchers often make some of the worst coders.
Scientists, pair up with an engineer to implement your code. You'll thank yourself later.
*looking at the 14TB cluster I had running for 18 hours
Yep, nobody would ever need that much memory
Wow, yea I think you win that contest lol.
To be honest, it was a very parallel process. I could do a fraction of the compute, needing a fraction of the RAM, but taking a shit ton more time.
Also, there's no perfect machine for this use case. I can either have 3.5 times more RAM than needed, or start swapping and waste time.
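To sketch the RAM-vs-time tradeoff described above (all names and data here are made up for illustration): a single pass that materializes every intermediate result needs memory proportional to the whole input, while a chunked version caps peak memory at the chunk size in exchange for more iterations.

```python
def process_all_at_once(data):
    # Materializes every intermediate result at once:
    # peak memory grows with len(data).
    squares = [x * x for x in data]
    return sum(squares)

def process_in_chunks(data, chunk_size):
    # Only one chunk of intermediates is alive at a time:
    # peak memory grows with chunk_size, at the cost of more passes
    # (and more wall-clock time if you can't run chunks in parallel).
    total = 0
    for start in range(0, len(data), chunk_size):
        total += sum(x * x for x in data[start:start + chunk_size])
    return total

data = list(range(1_000_000))
assert process_all_at_once(data) == process_in_chunks(data, 10_000)
```

Both give the same answer; the chunked version just trades memory for time, which is the same dial being turned with the cluster above.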
Nope. Some algorithms are fastest when the whole data set is held in memory. You could design them to page data in from disk as needed, but it would be slower.
OpenTripPlanner, for example, holds the entire road network of the US in memory for fast driving directions, and it uses an amount of RAM in that ballpark.
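A toy illustration of that in-memory vs. page-from-disk tradeoff (the data set and file layout are made up, not how OpenTripPlanner actually stores anything): a dict lookup in RAM is O(1), while fetching the same record from disk on demand means reopening and scanning the file every time.

```python
import json
import tempfile

# Toy data set standing in for a large road network (made-up values).
records = {str(i): i * i for i in range(1000)}

# Write it to disk, one JSON record per line, so it can be scanned.
tmp = tempfile.NamedTemporaryFile(mode="w", suffix=".jsonl", delete=False)
for key, value in records.items():
    tmp.write(json.dumps({"key": key, "value": value}) + "\n")
tmp.close()

# Fast: hold everything in memory once; every lookup is a hash probe.
in_memory = dict(records)

def lookup_in_memory(key):
    return in_memory[key]

# Slow: page data in from disk on demand, scanning the file per lookup.
def lookup_from_disk(key):
    with open(tmp.name) as f:
        for line in f:
            rec = json.loads(line)
            if rec["key"] == key:
                return rec["value"]
    raise KeyError(key)

assert lookup_in_memory("42") == lookup_from_disk("42") == 1764
```

The disk version can be made smarter (indexes, mmap, caching), but every such scheme still pays latency the in-memory version doesn't, which is why routing engines eat the RAM.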
Sure, that is why I said usually. The fact that two people replied with the same OpenStreetMap data set kinda proves my point.
Also, do you need the entire US road system in memory if you are going somewhere 10 minutes away? Seems inefficient, but I am not an expert here. I guess it is one giant graph; if you slice it up, suddenly there are a bunch of loose ends that break the navigation.
I host routing for customers across the US, so yes, I need it all. There are ways to solve the problem with less memory, but the point is that some problems really do require a huge amount of memory because of data scale and performance requirements.