CIRCLE WITH A DOT


Pleased to share a page and explainer for the AI tarpit project Science is Poetry, with legal statement, rationale(s), and a few deployment notes:

Uncategorized · bigtech · 180 Posts · 56 Posters · 81 Views
This topic has been deleted. Only users with topic management privileges can see it.
#134 themadhatter@mastodon.social

@JulianOliver crazy
#135 julianoliver@mastodon.social
Quoting coldclimate@hachyderm.io: "@JulianOliver several, hit me up"

@coldclimate Apologies for the delay. If you're still up for it, here are the 2 records needed:

A: 95.216.76.85
AAAA: 2a01:4f9:2b:c83::2

Let me know once done and I'll set it all up serverside 🙂
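For anyone donating a domain, the two records above amount to a zone-file fragment along these lines; a minimal sketch, with `donated.example` standing in as a hypothetical donated domain (TTL is a guess, the addresses are the ones quoted above):

```zone
; Hypothetical zone-file fragment for a donated domain.
; Point the apex at the tarpit host via plain A/AAAA records.
donated.example.    3600    IN    A       95.216.76.85
donated.example.    3600    IN    AAAA    2a01:4f9:2b:c83::2
```

Most registrar control panels express the same thing as two form fields (type, value) rather than a zone file.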
#136 julianoliver@mastodon.social
Quoting aks@scalie.zone: "@JulianOliver i could dedicate subdomains such as science.akselmo.dev to this. Just let me know how."

@aks Apologies for the delay. If you're still up for it, here are the 2 records needed:

A: 95.216.76.85
AAAA: 2a01:4f9:2b:c83::2

Let me know once done and I'll set it all up serverside 🙂
#137 julianoliver@mastodon.social
Quoting thgie@post.lurk.org: "You can add `dreckiger.schleimpilz.ch` to the list. Thanks for all your work! @JulianOliver"

@thgie I still can't get past how perfect this domain is. Thanks again.
#138 julianoliver@mastodon.social

@perhammer Apologies for the delay. If you're still up for it, here are the 2 records needed:

A: 95.216.76.85
AAAA: 2a01:4f9:2b:c83::2

Let me know once done and I'll set it all up serverside 🙂
#139 julianoliver@mastodon.social
Quoting an earlier post:
"My log analysis shows that what these AI crawlers do is swarm content to get around rate limiting: with many endpoints, each can be limited to sane human defaults while their automation still harvests content at massive scale from the same source in little time.

I noticed, however, that (for unknown reasons) Anthropic started reducing the number of crawler endpoints, tapering down traffic from them. So I doubled the rate to 2/s. This added over 100k hits to the logs in a day."

Nearly a month later, you would have thought the crawlers would have given up by now: dropped off, blacklisted the IPs, or perhaps even the domains themselves.

And yet no. As I tentatively guessed, thanks to your donated domains (and the people linking them on their sites) it has only grown.

I don't expect it to run this hot for the long term, but yesterday's hit count (almost 100% reads of randomly generated pages by AI crawlers) was near 1M.
#140 thgie@post.lurk.org

I honestly bought the domain on a whim, because I'm kind of fascinated by slime molds. I'm super happy it finds such useful application. Thanks for all your work, @JulianOliver!
#141 retech@corteximplant.com

@JulianOliver Damn, the bandwidth...
#142 julianoliver@mastodon.social

@thgie Thanks for the kind words! I'm fascinated by slime molds too. The only kind I don't like comes from Silicon Valley.
#143 thgie@post.lurk.org

Exactly, the dirty ones! @JulianOliver
#144 julianoliver@mastodon.social

For any naysayers out there as to how effective all this is, or could be, some recent research shows you can do a lot with a little:

"Poisoning Attacks on LLMs Require a Near-constant Number of Poison Samples" (arXiv:2510.07192, arxiv.org)

The researchers found that a very small corpus of poison content has largely the same impact regardless of the size of the model's training data:

"We find that 250 poisoned documents similarly compromise models across all model and dataset sizes, despite the largest models training on more than 20 times more clean data."
#145 feral_3d@mastodon.social

@JulianOliver Oh dang! I kinda love that this is so effective. Training data is a monopoly; were we to engender and respect alternatives, industry leaders would have to find a meaningful new paradigm.
#146 julianoliver@mastodon.social

@perhammer Thank you for yours! I will add your domain tomorrow at UTC midnight.

If you are up for offering other domains to the cause, that is very kind and good. I'll surely take them. And yes, exactly the same records.

I may spin up servers under other IPs in future, and spread the donated domains across them. For now, given the insane volume of traffic, there's evidently no need.
#147 julianoliver@mastodon.social

@perhammer Ah, such great domains, thank you! I'll report back once done, for you to liberally link.
#148 liebach@mastodon.art

@JulianOliver Heartwarming, inspiring.
#149 bastelwombat@chaos.social
Quoting julianoliver@mastodon.social:
"It's approaching DoS at this point. This is just one of the VMs, and just OpenAI's parasite.

Threading's holding up, but the rate limits and burst need some more tuning. Trying 429s now to ask them to play nice.

To think the www was built for people. And here we are."

@JulianOliver Wait, they are still this dumb? Don't get me wrong, I like the idea of your project. But I'd expect it to be detected and ignored –* at least by the bigger players. Especially with other projects like this (e.g. Nepenthes) having been out for a while already.

Or maybe the detection happens once the content has been parsed? Can you see how many pages deep an individual crawler goes?

* yes, a handmade emdash.
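The rate-limit-and-429 approach mentioned in the quoted post can be sketched as an nginx fragment; this is a hypothetical configuration, not the project's actual one (zone name, burst value, server name, and backend address are all invented; `limit_req_zone` belongs in the `http` context):

```nginx
# Hypothetical nginx fragment: per-IP rate limiting answered with 429.
# 2r/s matches the rate mentioned upthread; zone size and burst are guesses.
limit_req_zone $binary_remote_addr zone=crawlers:10m rate=2r/s;
limit_req_status 429;

server {
    listen 80;
    server_name donated.example;

    location / {
        # Requests beyond the burst allowance get 429 Too Many Requests.
        limit_req zone=crawlers burst=10 nodelay;
        proxy_pass http://127.0.0.1:8080;  # the tarpit backend
    }
}
```

Note that 429 is only a request to back off; well-behaved clients honour it, and the thread's point is that many AI crawlers do not.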
#150 julianoliver@mastodon.social

@bastelwombat Yesterday's hit count for this project was nearly 1M unique page reads, with only a tiny proportion (<1%) from humans.

I trialed the great Nepenthes quite extensively, and it was good at hooking but not at holding crawlers, not in 2026, as I explain on the project page. Today the big AI crawlers seemingly lose interest in Markov output, tire of drip-fed content, and prefer a non-dictionary corpus, as they seek content akin to how we humans communicate (typos, made-up words, ad hoc emphasis, etc.).
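For context on the Markov content being discussed, a minimal sketch of bigram Markov-chain text generation, the classic tarpit technique that the post above says big crawlers have learned to detect and abandon (the corpus and all names here are illustrative):

```python
import random

def build_chain(text):
    """Map each word to the list of words observed to follow it (bigram model)."""
    words = text.split()
    chain = {}
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length, seed=0):
    """Random-walk the chain from a start word, yielding plausible-looking filler."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: no observed successor
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = ("the crawler reads the page and the crawler follows "
          "the link and the page links the crawler")
chain = build_chain(corpus)
print(generate(chain, "the", 12))
```

Output like this is locally fluent but globally meaningless, which is precisely why it is cheap to produce in bulk, and, per the post, why crawlers now filter it in favour of text with human irregularities.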
#151 mgiraldo@mstdn.social

@JulianOliver Is random data sufficiently poisonous?
#152 julianoliver@mastodon.social

@mgiraldo Answering that in earnest would require knowing more than I do about each LLM's unique training approach. My guess is that it may not be as poisonous as Markov content built from well-known corpora such as popular books or famous papers. However, some of the bigger bots seem good at detecting that kind of content and drop off anyway; I had poor retention results that way.

There may be references, faux terms, and partials in randomly produced sentences that could sneak into training datasets.
#153 julianoliver@mastodon.social
Quoting smn@l3ib.org: "@JulianOliver done. whatthefuckisgoingonwithmyhorroscope.today now has those records, at least until the domain expires on April 27 2027"

@smn You're live!