Disgusting and unsurprising.
Most web admins do not care. I’ve lost count of how many sites make me jump through CAPTCHAs or outright block me in private browsing or on VPN. Most of these sites have no sensitive information, or already know exactly who I am because I’m authenticating with my username and password. It’s not something the actual site admins even think about. They click the button, say “it works on my machine!”, and happily blame any user whose client isn’t dead-center average.
Enter username, but first pass this CAPTCHA.
Enter password, but first pass this second CAPTCHA.
Here’s another CAPTCHA because lol why not?
Some sites even have their RSS feed behind Cloudflare. And guess what that means? It means you can’t fucking load it in a typical RSS reader. Good job!
The web is broken. JavaScript was a mistake. Return to ~~monke~~ gopher.
Fuck Cloudflare.
I get why you’re frustrated, and you have every right to be. I’ll preface what I’m about to say with this: I work in this industry. I’m not at Cloudflare, but I am at a company that provides bot protection. I analyze and block bots for a living. Again, your frustrations are warranted.
-
Even if a site doesn’t have sensitive information, it likely serves a CAPTCHA because of the sheer volume of scraping bots hitting it. That volume can effectively DDoS the site. If they’re selling something, it can disrupt sales. So they lose money on sales and eat the load costs.
-
With more and more username and password leaks, credential stuffing is becoming a bigger issue than most people realize. There aren’t really good ways of pinpointing you versus someone who has somehow stolen your credentials. Bots are increasingly sophisticated; we now see bots using aged sessions, which is more in line with human behavior. Most of the companies putting CAPTCHAs on login flows do so to try to protect your data and financials.
-
The rise of unique, privacy-focused browsers is great, and it’s also hard to keep up with. It’s been more than six months, but I’ve fingerprinted Pale Moon and, if I recall correctly, it has just enough red flags to be hard to tell apart from a poorly configured bot.
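To make that concrete, the simplest class of fingerprint signal is a header-consistency check. This is a toy sketch of the idea only, not any vendor’s actual detection logic; the heuristics, header names, and the Pale Moon UA string below are my own assumptions for illustration.

```python
# Toy header-consistency check. Real bot detection also inspects TLS,
# JavaScript, and canvas fingerprints; this only sketches the idea.

def suspicion_flags(headers: dict) -> list:
    flags = []
    ua = headers.get("User-Agent", "")
    # Pale Moon's UA advertises its Goanna engine even while also saying
    # "Firefox", which looks unlike any stock Firefox install.
    if "Goanna" in ua and "Firefox" in ua:
        flags.append("forked-gecko-engine")
    # Quick-and-dirty scrapers often skip headers every real browser sends.
    if "Accept-Language" not in headers:
        flags.append("missing-accept-language")
    return flags

# Illustrative (not exact) Pale Moon request headers:
palemoon = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) "
                  "Gecko/20100101 Goanna/6.5 Firefox/102.0 PaleMoon/33.0.0",
    "Accept-Language": "en-US,en;q=0.5",
}
print(suspicion_flags(palemoon))  # ['forked-gecko-engine']
```

A legitimate human on Pale Moon and a sloppy bot can both trip rules like these, which is exactly the “hard to discern” problem.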
Ok, enough apologetics. This is a cat-and-mouse game that the rest of us are being dragged into. Sometimes I feel like this is a made-up problem. Ultimately, I think this sort of thing should be legislated. And before the bot bros jump in and say it’s their right to scrape and take data: it’s not. Terms of use are plainly stated by these sites. They consider it stealing.
Thank you for coming to my TEDx Talk on bots.
Edit: I just want to say that allowing any user agent containing “Pale Moon” or “Goanna” isn’t the answer. It’s trivially easy to spoof a user agent, which is why I worked on fingerprinting the browser itself. Changing Pale Moon’s user agent to Firefox is likely to cause you problems too: the Goanna fork it’s built on has different fingerprints than an up-to-date Firefox.
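On spoofing being trivial: any HTTP client can claim to be any browser with a single header. A minimal stdlib sketch (the UA string and URL here are placeholders, and no request is actually sent):

```python
import urllib.request

# The server only ever sees a string; any client can send any string.
SPOOFED_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:115.0) "
              "Gecko/20100101 Firefox/115.0")

req = urllib.request.Request(
    "https://example.com/",  # placeholder URL
    headers={"User-Agent": SPOOFED_UA},
)

# Note: urllib normalizes stored header keys, so it's "User-agent" here.
print(req.get_header("User-agent"))
```

This one-liner nature of UA spoofing is why allow-lists on the user-agent string alone are useless, and why deeper fingerprinting exists at all.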
-
These bastards haven’t MITMed half the internet for nothing. And this isn’t the first time they’ve abused that position, either.
I hate that I once fell for it too, back when I was just starting out self-hosting and put my stuff behind their proxy.
What is MITMed?
“Man in the middle.” Cloudflare is used by a lot of web services as a reverse proxy, usually to prevent DDoS attacks, which means all of those sites’ traffic passes through Cloudflare’s servers.
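Because the proxy terminates TLS on the site’s behalf, it sees every request in the clear, cookies included. Here’s a toy local sketch of that position using only the Python stdlib (the ports, cookie value, and response body are all made up; a real CDN does far more):

```python
import http.server
import threading
import urllib.request

ORIGIN_PORT, PROXY_PORT = 8901, 8902  # arbitrary local ports
seen = []  # everything the proxy observed in the clear


class Origin(http.server.BaseHTTPRequestHandler):
    """Stand-in for the real website behind the proxy."""

    def do_GET(self):
        body = b"hello from origin"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging
        pass


class Proxy(http.server.BaseHTTPRequestHandler):
    """Stand-in for the man in the middle: reads, then forwards."""

    def do_GET(self):
        # The middleman can read (or rewrite) every header, cookies included.
        seen.append(dict(self.headers))
        with urllib.request.urlopen(
            f"http://127.0.0.1:{ORIGIN_PORT}{self.path}"
        ) as upstream:
            body = upstream.read()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass


for port, handler in ((ORIGIN_PORT, Origin), (PROXY_PORT, Proxy)):
    server = http.server.ThreadingHTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

# The "user" talks only to the proxy, sending a made-up session cookie.
req = urllib.request.Request(
    f"http://127.0.0.1:{PROXY_PORT}/",
    headers={"Cookie": "session=supersecret"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read())          # b'hello from origin'
print("Cookie" in seen[0])      # True: the middleman saw the session cookie
```

The user gets the right page back and has no way to tell, from the response alone, that someone in the middle logged their cookie.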
I just won’t use cloudflare, that’s fine.
But everyone else is