this post was submitted on 04 Mar 2025
615 points (98.9% liked)
Technology
you are viewing a single comment's thread
I get why you're frustrated, and you have every right to be. I'll preface what I say next by noting that I work in this industry: I'm not at Cloudflare, but I am at a company that provides bot protection, and I analyze and block bots for a living. Again, your frustrations are warranted.
Even if a site doesn't hold sensitive information, it likely serves a captcha because of the sheer number of scraping-related bot requests it receives. The volume of these requests can effectively DDoS a site. If the site sells something, that disrupts sales, so the operator loses revenue and eats the load costs.
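To make the load problem concrete, here's a minimal sketch of a per-IP sliding-window rate limiter, the kind of first-line defense sites reach for before resorting to captchas. This is an illustration I wrote for this comment, not anyone's actual production logic, and the limits are made up:

```python
import time
from collections import defaultdict, deque

class SlidingWindowLimiter:
    """Allow at most `limit` requests per `window` seconds per client IP."""

    def __init__(self, limit=60, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # over the limit: serve a 429 or a captcha instead
        q.append(now)
        return True
```

The catch, of course, is that serious scrapers rotate through thousands of residential IPs, which is exactly why detection has to go beyond counting requests.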
With username and password leaks mounting, credential stuffing is becoming a bigger issue than most people realize. There aren't really good ways of distinguishing you from someone who has stolen your credentials, and bots are increasingly sophisticated: we now see bots using aged sessions, which looks much more like human behavior. Most companies that put a captcha on their login flow do it to protect your data and financials.
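One classic credential-stuffing signal is a single IP trying many distinct accounts with a very low success rate. A toy sketch of that heuristic (the thresholds and field names are made up for illustration; real systems weigh many more signals):

```python
from collections import defaultdict

def stuffing_suspects(login_events, min_accounts=5, max_success_rate=0.2):
    """Flag IPs that attempt many distinct accounts and mostly fail.

    `login_events` is an iterable of (ip, username, success) tuples.
    """
    by_ip = defaultdict(lambda: {"users": set(), "total": 0, "ok": 0})
    for ip, user, success in login_events:
        rec = by_ip[ip]
        rec["users"].add(user)
        rec["total"] += 1
        rec["ok"] += bool(success)
    return {
        ip for ip, r in by_ip.items()
        if len(r["users"]) >= min_accounts
        and r["ok"] / r["total"] <= max_success_rate
    }
```

The hard part isn't this aggregation; it's that stuffing attacks are distributed across proxy networks precisely to stay under per-IP thresholds like these.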
The rise of unique, privacy-focused browsers is great, and it's also hard to keep up with. It's been more than six months, but I've fingerprinted Pale Moon, and, if I recall correctly, it raises just enough red flags to be hard to distinguish from a poorly configured bot.
OK, enough apologetics. This is a cat-and-mouse game that the rest of us are being dragged into, and sometimes I feel like it's a made-up problem. Ultimately, I think this sort of thing should be legislated. And before the bot bros jump in and claim it's their right to scrape and take data: it's not. The terms of use are plainly stated by these sites, and they consider scraping theft.
Thank you for coming to my TEDx Talk on bots.
Edit: I just want to say that allowlisting any user agent containing "Pale Moon" or "Goanna" isn't the answer. It's trivially easy to spoof a user agent, which is why I worked on fingerprinting the browser itself. Changing Pale Moon's user agent to Firefox is likely to cause you problems too: the Goanna fork it's built on has different fingerprints than an up-to-date Firefox browser.
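To illustrate why UA allowlisting fails and why spoofing Firefox backfires, here's a toy consistency check: compare what the User-Agent claims against independently observed browser capabilities. The capability table below is a made-up placeholder (I'm using WebRTC support as a stand-in divergence between engines), not a real detection rule:

```python
def ua_fingerprint_mismatch(claimed_ua, observed):
    """Return True if observed capabilities contradict the claimed UA family.

    `observed` maps capability names to values measured independently of
    the UA string. Expectations are illustrative only. Goanna is checked
    first because a Pale Moon UA string can also contain "Firefox".
    """
    expectations = [
        ("Goanna", {"supports_webrtc": False}),
        ("Firefox", {"supports_webrtc": True}),
    ]
    for token, expected in expectations:
        if token in claimed_ua:
            return any(observed.get(k) != v for k, v in expected.items())
    return False  # unknown family: no basis to flag
```

A genuine Goanna browser matches its own profile, while a Goanna browser pretending to be Firefox contradicts the Firefox profile, which is exactly the "spoofing causes you problems" effect described above.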
Thanks for sharing!
Thanks for reading and commenting!
During my first (shitty) dev job out of school, they had me writing scrapers. I was actually able to get around Cloudflare's checks pretty easily using this package, which doesn't appear to be maintained anymore: https://github.com/VeNoMouS/cloudscraper
I was pretty surprised to learn that, at the time, they were only checking whether JS was enabled, especially since CF is the gold standard for this sort of stuff. I'm sure this has changed?
Given that the last updates to that repo were five years ago, I'm not sure it's still valid. I don't follow Cloudflare bypasses closely, but I'm fairly certain there are more successful frameworks and services now. The landscape is evolving quickly: we're seeing a proliferation of bot-as-a-service offerings, captcha-solving farms, dedicated botting browsers, newsletters, Substacks, Discord servers, you name it. Then there are the methods you won't find much public discussion of, like custom-modified Chrome builds. It's fascinating how much effort is being funneled into this field.
Oh, I can definitely see custom browsers being useful in that area. I remember the JavaScript navigator properties were always such a PITA, since there was nothing you could really do to get around what they exposed.
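For context, the navigator properties mentioned above are checked for internal consistency by anti-bot scripts. A server-side sketch of that idea in Python (field names mirror the JS `navigator` object; the specific rules here are illustrative, not anyone's actual detection logic):

```python
def navigator_inconsistencies(nav):
    """Flag contradictions in client-reported navigator properties.

    `nav` is a dict of values collected client-side. Each rule is a
    simplified example of the consistency checks anti-bot scripts run.
    """
    flags = []
    if nav.get("webdriver"):
        # navigator.webdriver is set by stock automation drivers.
        flags.append("webdriver flag set")
    ua = nav.get("userAgent", "")
    platform = str(nav.get("platform", ""))
    if "Windows" in ua and platform not in ("Win32", "Win64"):
        flags.append("platform/UA mismatch")
    elif "Linux" in ua and platform.startswith("Win"):
        flags.append("platform/UA mismatch")
    if nav.get("languages") == []:
        # An empty languages list was a known tell in old headless Chrome.
        flags.append("empty languages list")
    return flags
```

This is also why custom-modified browsers are so effective for botting: instead of patching these values from a script (which leaves its own traces), they change what the browser reports at the source.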
There's a whole world of tools that do that for you now; it's easier than ever. To me it's concerning: that level of automation, coupled with a halfway decent LLM, gives you the ability to summon hordes of fake humans to social media. I can't help but think that's why X and Reddit don't use any kind of anti-bot solution.