this post was submitted on 21 Aug 2025
205 points (89.6% liked)

Selfhosted


Some thoughts on how useful Anubis really is. Combined with comments I've read elsewhere about scrapers starting to solve the challenges, I'm afraid Anubis will soon be outdated and we'll need something else.

[–] Dremor@lemmy.world 18 points 2 days ago (8 children)

Anubis isn't a challenge like a captcha. Anubis is a resource waster, forcing crawlers to solve a cryptographic challenge (basically like mining Bitcoin) before being allowed in. That's how it defends so well against bots: they don't want to waste their resources on needless computing, so they cancel the page load before it even happens and go crawl elsewhere.
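
For anyone curious what that kind of challenge looks like in practice, here's a minimal proof-of-work sketch (not Anubis's actual code; the challenge string, difficulty, and hash construction are illustrative assumptions): the client burns CPU guessing nonces until a hash meets a target, while the server verifies the answer with a single hash.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strconv"
	"strings"
)

// solve brute-forces nonces until sha256(challenge + nonce), in hex,
// starts with `difficulty` zero digits. This is the expensive part the
// crawler has to pay on every protected page.
func solve(challenge string, difficulty int) (int, string) {
	target := strings.Repeat("0", difficulty)
	for nonce := 0; ; nonce++ {
		sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
		h := hex.EncodeToString(sum[:])
		if strings.HasPrefix(h, target) {
			return nonce, h
		}
	}
}

// verify costs the server a single hash, so the asymmetry favors the site.
func verify(challenge string, nonce, difficulty int) bool {
	sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
	return strings.HasPrefix(hex.EncodeToString(sum[:]), strings.Repeat("0", difficulty))
}

func main() {
	challenge := "server-issued-token" // hypothetical value
	nonce, h := solve(challenge, 4)    // ~65,536 attempts on average
	fmt.Println(nonce, h, verify(challenge, nonce, 4))
}
```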

[–] tofu@lemmy.nocturnal.garden 6 points 2 days ago (7 children)

No, it works because the scraper bots haven't implemented solvers for it yet. Of course the companies would rather not spend additional compute resources, but their pockets are deep, and some have already adapted and solve the challenges.

[–] Dremor@lemmy.world 13 points 2 days ago (5 children)

Whether they solve it or not doesn't change the fact that they have to use more resources for crawling, which is the objective here. Meanwhile, the website sees a lot less load than before it used Anubis. Either way, I see it as a win.
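
To put rough numbers on that asymmetry (assuming a SHA-256 proof of work requiring four leading zero hex digits, as in the sketch above): the crawler spends about 16^4 = 65,536 hash attempts per page on average, while the server checks the answer with a single hash. Multiplied over millions of pages, nearly all the extra cost lands on the crawler's side.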

But despite that, it has its detractors, like any solution that becomes popular.

But let's be honest, what are the arguments against it?
It takes a bit longer to access a site the first time? Sure, but it's not like you have to click or type anything.
It executes foreign code on your machine? Literally 90% of the web does these days. Just disable JavaScript and see how many websites are still functional. I'd be surprised if even a handful are.

The only ones who benefit from a site not having Anubis are web crawlers, be they AI bots, indexing bots, or script kiddies looking for a vulnerable target.

[–] int32@lemmy.dbzer0.com 1 points 1 day ago

I use uMatrix, which blocks JS by default, so it's a bit inconvenient to have to enable JS for some sites. Websites that didn't need JavaScript before (which is often the reason I use them) now require it.
