<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Topics tagged with personalsite]]></title><description><![CDATA[A list of topics that have been tagged with personalsite]]></description><link>https://board.circlewithadot.net/tags/personalsite</link><generator>RSS for Node</generator><lastBuildDate>Thu, 14 May 2026 23:15:00 GMT</lastBuildDate><atom:link href="https://board.circlewithadot.net/tags/personalsite.rss" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[I have an obnoxious problem with crawlers eating bandwidth on my personal web site—not just the fact that crawlers consume so much bandwidth, but rather a behaviour that is absolutely next-level.]]></title><description><![CDATA[@jsstaedtler @redstrate The problem is that many crawlers do not respect robots.txt (especially those run by "AI" companies). People therefore turn to other solutions that make crawling too expensive for the crawler, such as iocaine - https://firesphere.dev/articles/iocaine-the-deadliest-poison-known-to-ai - or Anubis - https://anubis.techaro.lol]]></description><link>https://board.circlewithadot.net/topic/cf8523fc-acec-47da-9057-83f99ccbd33e/i-have-an-obnoxious-problem-with-crawlers-eating-bandwidth-on-my-personal-web-site-not-just-the-fact-that-crawlers-consume-so-much-bandwidth-but-rather-a-behaviour-that-is-absolutely-next-level.</link><guid isPermaLink="true">https://board.circlewithadot.net/topic/cf8523fc-acec-47da-9057-83f99ccbd33e/i-have-an-obnoxious-problem-with-crawlers-eating-bandwidth-on-my-personal-web-site-not-just-the-fact-that-crawlers-consume-so-much-bandwidth-but-rather-a-behaviour-that-is-absolutely-next-level.</guid><dc:creator><![CDATA[gemelen@mammut.moe]]></dc:creator></item></channel></rss>