Pleased to share a page and explainer for the AI tarpit project Science is Poetry, with legal statement, rationale(s), and a few deployment notes:
-
I've started harvesting a list of AI crawler endpoint addresses for your blacklisting pleasure, and I'll try to keep it updated. I've been fastidious about only pulling addresses tied to the known user agents, so as not to include any false positives.
https://scienceispoetry.net/files/parasites.txt
It is served at the same path for all contributed domains.
-
@JulianOliver Thanks for this!
I added the list to my CrowdSec firewall bouncer; that should block them, right?
@jasperbuma It should indeed!
-
Do you have an unused domain that you would be happy to donate to a counter-offensive against unchecked & unregulated AI crawlers that scrape human-made content to simulate & deceive for profit?
If so, please reply to this post. Your domain would become an entrypoint to the AI tarpit & Poison-as-a-Service project below, allowing a concerned public to choose to use it on their sites and helping make the project more resilient to blacklisting.
@JulianOliver i could dedicate subdomains such as science.akselmo.dev to this. Just let me know how.
-
- Mum, if you made a chain out of all the endpoint addresses of AI crawlers, how far would it reach?
- All the way to the moon, darling. All the way to the moon.
Here's a thing I did in a couple of minutes to ban all IPs in parasites.txt server-side. You could of course REJECT rather than DROP to send a message.
---
#!/bin/bash
# Drop traffic from every address in parasites.txt,
# choosing iptables or ip6tables by address family.
while read -r parasite; do
    if [[ "$parasite" == *"."* ]]; then
        iptables -I INPUT -s "$parasite" -j DROP
    elif [[ "$parasite" == *":"* ]]; then
        ip6tables -I INPUT -s "$parasite" -j DROP
    fi
done < /path/to/parasites.txt
---
-
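One INPUT rule per address gets unwieldy as the list grows. If the ipset tool is available (an assumption on my part), the same loop can feed two sets that are matched by a single rule each:

```shell
#!/bin/bash
# Sketch using ipset: one set lookup per packet instead of one
# iptables rule per address. Assumes the ipset package is installed.
ipset create parasites4 hash:ip family inet  -exist
ipset create parasites6 hash:ip family inet6 -exist

while read -r parasite; do
    if [[ "$parasite" == *"."* ]]; then
        ipset add parasites4 "$parasite" -exist
    elif [[ "$parasite" == *":"* ]]; then
        ipset add parasites6 "$parasite" -exist
    fi
done < /path/to/parasites.txt

# Add the DROP rules only if they aren't present already
iptables  -C INPUT -m set --match-set parasites4 src -j DROP 2>/dev/null ||
iptables  -I INPUT -m set --match-set parasites4 src -j DROP
ip6tables -C INPUT -m set --match-set parasites6 src -j DROP 2>/dev/null ||
ip6tables -I INPUT -m set --match-set parasites6 src -j DROP
```

Re-running it after a list refresh is cheap: `-exist` makes the creates and adds idempotent.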
@tseitr
I'm curious about this also.
Edit: if all I need to do is add the A and AAAA records, then the answer could be “yes”.
-
@dzwiedziu @tseitr Thanks both! Yes, it's as simple as picking any unused domain (canonical or sub) and setting these records to point to the server:
A: 95.216.76.85
AAAA: 2a01:4f9:2b:c83::2
Then DM or toot me the domain. Once it's set, I'll let you know, and then it's time to share your tarpit domain liberally: a link in the footer of your site, the landing page of a friendly wiki you want to protect, a blog post, etc.
Ideally it should be toward the front of the content.
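For those managing their own zone files, the same two records as a BIND-style fragment (tarpit.example.org is a placeholder for whatever domain or subdomain you donate):

```
; Hypothetical zone-file fragment; substitute your own (sub)domain.
tarpit.example.org.  3600  IN  A     95.216.76.85
tarpit.example.org.  3600  IN  AAAA  2a01:4f9:2b:c83::2
```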
-
@JulianOliver
Then I'll set up mine either very quickly or in a matter of weeks (WIP: moving between countries).
-
@dzwiedziu @tseitr Been there a few times - no rush!
-
@retech That's the word for it. Computationally, environmentally, culturally, infrastructurally - an obscenity.
-
@JulianOliver
block return on egress from <parasites> (in pf)
That's what I'm using and:
@32 block drop in log quick on egress from <parasites:2323> to any
[ Evaluations: 125476 Packets: 351 Bytes: 20702 States: 0 ]
[ Inserted: uid 0 pid 75290 State Creations: 0 ]
Not seen much traffic from them on my machine.
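For completeness, a sketch of the matching pf.conf pieces; the /etc/parasites.txt path is my assumption:

```
# pf.conf fragment: keep the list in a table and drop inbound hits.
# After updating the file, reload the table with:
#   pfctl -t parasites -T replace -f /etc/parasites.txt
table <parasites> persist file "/etc/parasites.txt"
block drop in log quick on egress from <parasites> to any
```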

-
@neoluddite @JulianOliver At work we noticed that when we changed from HTML-generated search links (nofollow was ignored) to JavaScript-generated links, a lot of bots stopped coming back - but some (mainly from residential proxies) appear to have cached the URLs and came back for more.
-
If you're interested in learning more about implementations of resistance in this era of unchecked Big AI, direct action strategies and the techno-politics therein, be sure to check out ASRG's site (https://algorithmic-sabotage.gitlab.io/asrg/) and give them a follow here on Mastodon (@asrg).
They've put a lot of heartbeats and neurons - human stuff - into this area.
-
Actual hits are dropping slightly, but more data is pulled from the tarpit day on day. This is reflected in a higher proportion of HTTP 200s - so fewer bad requests, less reaching for what isn't there; they just want the madness.
Unclear why this has changed.
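A quick way to watch that proportion, assuming a common/combined-format access log where the status code is field 9 (the log path is a placeholder):

```shell
# Share of HTTP 200s in an access log; in the common/combined log
# formats the status code is the 9th whitespace-separated field.
awk '{ total++ } $9 == 200 { ok++ } END { printf "%.1f%% 200s (%d/%d)\n", 100*ok/total, ok, total }' /var/log/nginx/access.log
```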
-
@JulianOliver peer-reviewed just isn’t what it used to be

-
@scott haha
-
@JulianOliver @asrg@tldr.nettime.org What happened to this account/website?
@caleb oh dear, I don't know. Perhaps down while working on it?
-
@JulianOliver
That’s beautiful
-
@JulianOliver several, hit me up
-
It's approaching DoS at this point. This is just one of the VMs, and just OpenAI's parasite.
Threading's holding up, but the rate limits and bursts need some more tuning. Trying 429s now to ask them to play nice.
To think the www was built for people.
And here we are.
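If the tarpit sits behind nginx (an assumption on my part; adjust for whatever server you run), the 429 approach might be sketched as:

```
# Hypothetical nginx fragment: per-IP rate limit, answering bursts
# with 429 Too Many Requests instead of the default 503.
limit_req_zone $binary_remote_addr zone=tarpit:10m rate=2r/s;

server {
    listen 80;
    location / {
        limit_req zone=tarpit burst=10 nodelay;
        limit_req_status 429;
        proxy_pass http://127.0.0.1:8080;   # assumed tarpit backend
    }
}
```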
@JulianOliver Could you explain what we are seeing here, for dummies? ;-))) Is this different from cookies and “normal” background web activity as a result of search?
