Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
- Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues on the community? Report it using the report flag.
Questions? DM the mods!
I just realized an interesting thing - if I use Gemini, and tell it to do deep research, it actually goes to the websites it knows/finds, and looks up the content to provide up-to-date answers. So, some of those AI crawlers are actually not crawlers, but actual users who just use AI instead of coming directly to the site.
Soo... blocking AI completely could also potentially reduce exposure, especially as more and more people use AI to basically do searches instead of browsing themselves. That would also explain the number of requests per day - they could simply be different users using AI to research a topic.
Point is, you should evaluate whether the AI requests are just proxies for real users, and whether blocking AI blocks real users from ever learning your site exists.
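If you want to tell the two apart before blocking anything, one rough approach is to split requests by User-Agent into bulk/training crawlers versus user-triggered fetchers. A minimal sketch in Python is below; the UA tokens are examples of ones the vendors have documented at some point (GPTBot, ClaudeBot, CCBot for bulk crawls; ChatGPT-User, Claude-User, Perplexity-User for user-initiated fetches), but they change over time, so treat them as placeholders and verify against your own access logs.

```python
# Sketch: separate AI training crawlers from user-triggered AI fetchers
# by User-Agent substring. The token lists are illustrative examples and
# may be out of date; check the vendors' docs and your own logs.

TRAINING_CRAWLERS = ("GPTBot", "ClaudeBot", "CCBot", "Bytespider")
USER_TRIGGERED = ("ChatGPT-User", "Claude-User", "Perplexity-User")

def classify(user_agent: str) -> str:
    """Return 'training', 'user-triggered', or 'other' for a request's UA string."""
    if any(token in user_agent for token in TRAINING_CRAWLERS):
        return "training"
    if any(token in user_agent for token in USER_TRIGGERED):
        return "user-triggered"
    return "other"

if __name__ == "__main__":
    ua = "Mozilla/5.0 (compatible; GPTBot/1.1; +https://openai.com/gptbot)"
    print(classify(ua))  # -> training
```

Running something like this over a day of access logs would show how much of the "AI traffic" is systematic crawling versus requests made on behalf of an individual user, which is the distinction being argued about here.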
Normally, websites want users to come to their site, instead of an AI search engine "stealing" the content and presenting it as its own. Yes, AI search engines are more convenient for the user, but in the end it will discourage website creators and thereby cut off its own "food supply".
I understand, but the shift in user behaviour is significant and I think websites are not taking it into account. If users move more and more to AI (and since Google introduced AI Mode, it's only a question of time until it becomes the default), we will see more and more of what we think are AI crawlers and fewer and fewer organic users.
AI seems to be the new middleman between you and the user, and if you block the middleman, you block the user. For hobby websites or established sites it may make sense, because people either already know them or more exposure isn't a goal, but for everyone else it will be painful.
So, what I'm reading is, if your "users" are bad (or bots), just get better users.
Sounds like a net win.
I honestly don't think most people will replace search with AI; it will also slowly solve itself once Google injects ads into the output.
We all understand that. But if those users keep insisting on giving everyone their life story and current opinion on world politics before giving us the bread recipe we came for, they can fade away.
Why not both?
There is no functional difference between them scraping you systematically and them coming to you on behalf of a user. They're coming to scrape you either way; being asked by someone is just going to make them do it in a smarter fashion.
Also, if you're not using Gemini, damned if Google.com doesn't search you with it anyway. They want these AIs trained bad; sooner or later, almost all searching will be done through AI. There will eventually be no option.
You are correct that blocking all AI calls will eventually make your search results not work.
So if you want organic traffic, you have to allow AI scraping eventually. You're just going to get diminishing returns up to a point.
This does not really apply to me because I run some frontends, so there is not really any information that AI needs.