Web design in the early 2000s: Every 100ms of latency on page load costs visitors.
Web design in the late 2020s: Let's add a 10-second delay while Cloudflare checks that you are capable of ticking a checkbox in front of every page load.
@david_chisnall I like this one specifically because the Cloudflare gate is there to address the problem of "Too many visitors."
-
@david_chisnall "Please wait while we check that your Browser is safe" while my laptop goes for a minute or two into full load and screaming hot
Perhaps ending in "We are sorry but we could not verify you are an actual human, your machine shows suspect behaviour, sent an e-mail to admin to get access"
-
@david_chisnall True! Well, you could at least call someone at O'Reilly and suggest writing a book on that topic.

-
@mark @david_chisnall Instead of fixing broken code with proper logging and code performance observability, let's stop all the effort and expect Cloudflare to care about actual humans (and not just about their PaaS billing).

-
@david_chisnall
It's also the tens of megabytes of frameworks, JavaScript, and ad services that have to be loaded every single time.
-
@david_chisnall I'd like to automate the process of responding to Cloudflare's checks
-
@david_chisnall why is that there? Bots and AI scraping. None of this would be necessary otherwise.
-
@mark @david_chisnall I don't think that's actually the case, at least not entirely. The main issue is that the Internet is currently being inundated with LLM content crawlers, to the point that they overwhelm websites or scrape content some sites don't want sucked into AI training data. That has caused a massive number of sites to serve those bot-detection pages to everyone. So it's not quite an issue of too many visitors but rather "too many non-human visitors".
-
This morning, Cloudflare decided that a company I wanted to place an order with shouldn't trust me, so I went to one of their competitors.
-
On top of all the broken links we’ll send if you’re not using the proper browser.
-
The thing is, you don't need a CAPTCHA. Just three if statements on the server will do it:
If the user agent is Chrome, but it didn't send a "Sec-Ch-Ua" header: Send garbage.
If the user agent is a known scraper ("GPTBot", etc): Send garbage.
If the URL is one we generated: Send garbage.
... else: serve the page.
The trick is that instead of blocking them, serve them randomly generated garbage pages.
Each of these pages includes links that will always return garbage. Once these get into the bot's crawler queue, they will be identifiable regardless of how well they hide themselves.
I use this on my site: after a few months, it's 100% effective. Every single scraper request gets caught. At this point, I could rate-limit the generated URLs, but I enjoy sending them unhinged junk. (... and it's actually cheaper than serving static files!)
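A minimal sketch of what those three checks could look like, assuming a Python/Flask server; the /maze/ URL prefix, the bot list, the word list, and the garbage generator here are illustrative placeholders, not the poster's actual setup:

    # Hypothetical sketch: three server-side checks, garbage for bots, real page otherwise.
    import random
    from flask import Flask, request

    app = Flask(__name__)

    KNOWN_SCRAPERS = ("GPTBot", "CCBot", "Bytespider")  # example scraper tokens
    WORDS = ["synergy", "teapot", "quantum", "llama", "spreadsheet", "perpendicular"]

    def garbage_page():
        # Random junk plus links back into the maze, so crawlers queue more garbage URLs.
        text = " ".join(random.choices(WORDS, k=200))
        links = "".join(f'<a href="/maze/{random.randint(0, 10**9)}">more</a> ' for _ in range(5))
        return f"<html><body><p>{text}</p>{links}</body></html>"

    @app.before_request
    def filter_bots():
        ua = request.headers.get("User-Agent", "")
        # Rule 1: claims to be Chrome but didn't send a Sec-Ch-Ua header.
        if "Chrome" in ua and not request.headers.get("Sec-Ch-Ua"):
            return garbage_page()
        # Rule 2: known scraper user agent.
        if any(bot in ua for bot in KNOWN_SCRAPERS):
            return garbage_page()
        # Otherwise fall through to the normal routes.

    @app.route("/maze/<path:anything>")
    def maze(anything):
        # Rule 3: the URL is one we generated ourselves.
        return garbage_page()

    @app.route("/")
    def index():
        return "<html><body>The real site.</body></html>"

The before_request hook covers the first two rules on every request, and any generated /maze/ link that lands in a crawler's queue keeps triggering the third rule no matter what user agent the bot later switches to.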