Web design in the early 2000s: Every 100ms of latency on page load costs visitors.

Uncategorized · 12 Posts · 11 Posters
  • david_chisnall@infosec.exchange

    Web design in the early 2000s: Every 100ms of latency on page load costs visitors.

    Web design in the late 2020s: Let's add a 10-second delay while Cloudflare checks that you are capable of ticking a checkbox in front of every page load.

    mark@mastodon.fixermark.com
    #2

    @david_chisnall I like this one specifically because the Cloudflare gate is there to address the problem of "Too many visitors."

    • [quoting the original post by david_chisnall@infosec.exchange]

      laberpferd@sueden.social
      #3

      @david_chisnall "Please wait while we check that your browser is safe", while my laptop goes to full load for a minute or two and gets screaming hot.

      Perhaps ending in: "We are sorry, but we could not verify that you are an actual human. Your machine shows suspect behaviour; send an e-mail to the admin to get access."

      • [quoting the original post by david_chisnall@infosec.exchange]

        elosha@chaos.social
        #4

        @david_chisnall True! Well, you could at least call someone at O'Reilly and suggest writing a book on that topic 😅

        • [quoting mark@mastodon.fixermark.com, #2]

          autiomaa@mementomori.social
          #5

          @mark @david_chisnall Instead of fixing broken code with proper logging and performance observability, let's stop all that effort and expect Cloudflare to care about actual humans (and not just about their PaaS billing). 😓

          • [quoting the original post by david_chisnall@infosec.exchange]

            meilin@tech.lgbt
            #6

            @david_chisnall
            It's also the tens of megabytes of frameworks, JavaScript, and ad services that have to be loaded every single time.

            • [quoting the original post by david_chisnall@infosec.exchange]

              jackeric@beige.party
              #7

              @david_chisnall I'd like to automate the process of responding to Cloudflare's checks.

              • [quoting the original post by david_chisnall@infosec.exchange]

                alexskunz@mas.to
                #8

                @david_chisnall why is that there? Bots and AI scraping. None of this would be necessary otherwise.

                • [quoting mark@mastodon.fixermark.com, #2]

                  danherbert@mastodon.social
                  #9

                  @mark @david_chisnall I don't think that's actually the case, at least not entirely. The main issue is that the Internet is currently being inundated with LLM content crawlers, to the point that they overwhelm websites or scrape content some sites don't want sucked into AI training data. That has caused a massive number of sites to serve bot-detection pages to everyone. So it's not quite an issue of too many visitors, but of "too many non-human visitors".

                  • [quoting mark@mastodon.fixermark.com, #2]

                    david_chisnall@infosec.exchange
                    #10

                    @mark

                    This morning, Cloudflare decided that a company I wanted to place an order with shouldn't trust me, so I went to one of their competitors.

                    • [quoting the original post by david_chisnall@infosec.exchange]

                      benhm3@saint-paul.us
                      #11

                      @david_chisnall

                      On top of all the broken links we'll send if you're not using the proper browser.

                      • [quoting alexskunz@mas.to, #8]

                        nothacking@infosec.exchange
                        #12

                        @alexskunz @david_chisnall

                        The thing is, you don't need a CAPTCHA. Just three if statements on the server will do it:

                        If the user agent is chrome, but it didn't send a "Sec-Ch-Ua" header: Send garbage.

                        If the user agent is a known scraper ("GPTBot", etc): Send garbage.

                        If the URL is one we generated: Send garbage.

                        ... else: serve the page.

                        The trick is that instead of blocking them, serve them randomly generated garbage pages.

                        Each of these pages includes links that will always return garbage. Once these get into the bot's crawler queue, they will be identifiable regardless of how well they hide themselves.

                        I use this on my site: after a few months, it's 100% effective. Every single scraper request is being blocked. At this point I could rate-limit the generated URLs, but I enjoy sending them unhinged junk. (... and it's actually cheaper than serving static files!)
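The three checks described above fit in a dozen lines of server code. A minimal Python sketch of the idea, where the header dict, the scraper User-Agent list, and the `/trap/` URL prefix are illustrative assumptions, not the poster's actual setup:

```python
import secrets

# Illustrative scraper User-Agent substrings -- not an exhaustive list.
SCRAPER_UAS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

# Hypothetical prefix for generated honeypot URLs: only a bot that
# followed a link embedded in a garbage page would ever request one.
TRAP_PREFIX = "/trap/"

def should_send_garbage(user_agent: str, headers: dict, path: str) -> bool:
    """The three if-statements from the post, combined into one predicate."""
    # 1. Claims to be Chrome but is missing the Sec-Ch-Ua client-hint
    #    header that real Chrome always sends.
    if "Chrome" in user_agent and "Sec-Ch-Ua" not in headers:
        return True
    # 2. Self-identified scraper.
    if any(bot in user_agent for bot in SCRAPER_UAS):
        return True
    # 3. The URL is one we generated ourselves.
    if path.startswith(TRAP_PREFIX):
        return True
    # ... else: serve the real page.
    return False

def trap_link() -> str:
    """A fresh honeypot URL to embed in every garbage page served."""
    return TRAP_PREFIX + secrets.token_hex(8)
```

The self-reinforcing part is `trap_link()`: each garbage page links only to more trap URLs, so once a crawler enqueues one, rule 3 flags every later request from it no matter how well it spoofs its User-Agent.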
