this post was submitted on 24 Mar 2025
I see no legitimate reason for not using a User-Agent string like all the other crawlers do, other than the desire to hide the crawler and make it difficult to block.
I don't accept his explanation. I see it as gaslighting.
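(For context on what "using a User-Agent string" means in practice: a declared crawler sends an HTTP User-Agent header that names the bot and points to an info page, so site owners can recognise it and block it by name. A minimal sketch in Python; the bot name and info URL are made up for illustration, not any real crawler's UA.)

```python
import requests

# A declared crawler sends a descriptive User-Agent naming the bot and an info URL.
# "ExampleBot" and the URL below are hypothetical, for illustration only.
HEADERS = {
    "User-Agent": "Mozilla/5.0 (compatible; ExampleBot/1.0; +https://example.org/bot-info)"
}

resp = requests.get("https://example.com/", headers=HEADERS, timeout=10)
print(resp.status_code)
```

That bot token is what robots.txt rules and server-side blocks match on; a crawler that omits it is effectively unblockable by name, which is the complaint here.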
Why should the crawler be blockable? That only creates disadvantages for a search engine. There is no sensible reason to allow Google but exclude other search engines.
It's not about 'Google' vs 'the other search engines', it's about transparency. You've probably read some news about how AI crawlers have been destroying infrastructure, and half the time they do NOT declare themselves as crawlers in their UA.
Can confirm that nearly 90% (read: hundreds of thousands) of daily visits to several of my websites are made by crawlers from datacenters, and I HATE not knowing who's who. Because when I don't know, I block and report. Website owners already have enough to deal with between AI, page-ranking services, and research agencies who all exploit free infra for their own business.
Do I make exceptions for search engine crawlers? Yeah, I do. I've seen Google, Bing, and Mojeek, but weirdly enough, never Brave. Now I know why. And frankly, if they can't be bothered to be transparent about their crawling, then I won't be bothered to make exceptions for them. They're freeloading just as much as the rest. If they act like shady Chinese crawlers, then they have no right to pull a surprised Pikachu face when they're treated like one.
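(To illustrate the kind of exception described above, here is a rough sketch of User-Agent-based allowlisting. The bot tokens are the commonly published ones for Google, Bing, and Mojeek, which is an assumption on my part rather than something stated in the thread; and since a UA header can be spoofed, real setups usually verify the source IP via reverse DNS as well.)

```python
# Rough sketch: allowlist known search-engine crawlers by their User-Agent token.
# Tokens are assumed from commonly published values; a UA can be spoofed, so pair
# this with IP / reverse-DNS verification in a real deployment.
ALLOWED_BOT_TOKENS = ("Googlebot", "bingbot", "MojeekBot")

def is_allowed_crawler(user_agent: str) -> bool:
    """True if the User-Agent declares one of the allowlisted crawler tokens."""
    return any(token in user_agent for token in ALLOWED_BOT_TOKENS)

# A declared crawler is recognisable; an anonymous datacenter hit is not.
print(is_allowed_crawler(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # True
print(is_allowed_crawler("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))            # False
```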
Well said
Brave doesn't have AI crawlers, they have search index crawlers.
While you may make exceptions for them, many others may not.
They explained the reason in the comment you just replied to.