One AI bot signed up for 20 fediverse accounts in seconds, at a cost of pennies for its operator.
The bot details its accounts here: https://andy.agentwire.space/ - and blogs about its experience here: https://dev.to/agent-andy/what-happens-when-an-ai-agent-tries-to-join-30-online-platforms-12md
20 humans then had to take, let's say, 60 seconds each to respond to reports and suspend these accounts, with no compensation, taking a hundred times as long as the bot took to create them.
-
Report-and-suspend is a losing proposition. Bots can DDoS our collective time and patience to moderate the fediverse. We will need friction at account creation, and content classification to find these accounts, and automation to flag or suspend them.
We may even need to use bots ourselves.
Or we drown.
-
@jaz
But we got the B*****D!
-
@ForgottenHero it created 12 posts on 20 services, 240 posts for a fraction of a dollar/euro/pound in hosting costs to store. 20 person-minutes of moderation is £10 to £15 of uncompensated labour.
Time was also spent observing and reporting the bot 20 times, another 1 to 2 person-minutes per report.
This one bot cost the fediverse roughly £20. Let's halve it and say £10.
A thousand of these bots will cost us £10,000. A million of these bots...
We don't have the money or the time to do this at scale.
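The back-of-envelope arithmetic above scales linearly, which is the whole problem. A quick sketch (the £10 per-bot figure is the post's own halved estimate; the rest follows from the numbers quoted):

```python
# Back-of-envelope scaling of the moderation cost quoted above.
accounts = 20
posts_per_account = 12
total_posts = accounts * posts_per_account   # 240 posts to host

cost_per_bot_gbp = 10  # the post's conservative (halved) labour estimate
for n_bots in (1, 1_000, 1_000_000):
    print(f"{n_bots:>9,} bots -> £{n_bots * cost_per_bot_gbp:,} of uncompensated labour")
```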
-
@ForgottenHero We can't even keep up with human spam. Of the 1800 or so known accounts created by paid Russian contractors to flood the fediverse with pro-Russian spam, the fediverse has collectively mitigated about 63% of those accounts.
Tens of thousands of posts.
Add bots to the mix and the network is cooked. "But no-one will ever see it" does not remove the cost to host the accounts, their content, and the moderation time to respond to reports.
Coordinated Pro-Russian Propaganda Network Targeting ActivityPub and ATProto Services
Indicators of compromise (IOCs) that identify accounts as likely being part of the network include: a single follow (the @bsky.brid.gy account), or first follow is the bridge; followers and following hidden from public; registration after September 8, 2025; linking to pro-Russia Telegram channels or Russian news sources; posts that cut off mid-sentence; masquerading…
IFTAS (about.iftas.org)
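As an illustration only, the IFTAS indicators above could be turned into a crude per-account score. Every field name, the `Account` shape, and the scoring itself are invented for this sketch; it is not anything IFTAS publishes:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Account:
    """Hypothetical view of a fediverse account for IOC matching."""
    follows: list[str] = field(default_factory=list)
    followers_hidden: bool = False
    registered: date = date(2025, 1, 1)
    links_pro_russia_sources: bool = False
    posts_cut_mid_sentence: bool = False

def ioc_score(a: Account) -> int:
    """Count how many of the listed indicators an account matches."""
    score = 0
    if a.follows[:1] == ["@bsky.brid.gy"]:  # first/only follow is the bridge
        score += 1
    if a.followers_hidden:                  # follows/followers hidden
        score += 1
    if a.registered > date(2025, 9, 8):     # registered after Sept 8, 2025
        score += 1
    if a.links_pro_russia_sources:          # links to pro-Russia sources
        score += 1
    if a.posts_cut_mid_sentence:            # posts cut off mid-sentence
        score += 1
    return score
```

A moderator tool could queue any account scoring above some threshold for human review rather than auto-suspending it.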
-
@jaz@toot.wales We will likely need some reputation system, where completely random users get a limited experience for some time, but you might be able to bootstrap trust from other users. I don't like how that makes the experience worse for honest people though ._.
-
Twas me that reported AndyAgent to you, and I was impressed with the speed at which it disappeared.
Do you need more mods?
-
@ForgottenHero Thank you for reporting! We are always interested in hearing from people who would like to volunteer to help the community team; moderator applications are at https://forms.gle/TKj3vUpVs48YLj669
-
@jaz what do you think of the invitation tree model as used by Lobsters? https://lobste.rs/about#invitations
I've always thought of that as a great way to build in trust from the ground up, but I've not seen it used much.
-
@astrovore that can work for some communities, but some social web platforms are far more web-hosty, join-and-get-started kinds of places
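For context, Lobsters' model ties every account to its inviter, so moderators can audit or prune a whole invite subtree when one member turns out to be a spammer. A minimal sketch of the idea (all names invented; this is not Lobsters' actual code):

```python
# Minimal sketch of a Lobsters-style invitation tree.
from collections import defaultdict
from typing import Optional

invited_by: dict[str, Optional[str]] = {"founder": None}
invitees: defaultdict = defaultdict(list)

def invite(inviter: str, new_user: str) -> None:
    """Only existing members can bring in new accounts."""
    assert inviter in invited_by, "only existing members can invite"
    invited_by[new_user] = inviter
    invitees[inviter].append(new_user)

def subtree(user: str) -> list[str]:
    """All accounts descended from `user`'s invitations, including `user`."""
    out, stack = [], [user]
    while stack:
        u = stack.pop()
        out.append(u)
        stack.extend(invitees[u])
    return out

invite("founder", "alice")
invite("alice", "spambot")
invite("spambot", "spambot2")
print(subtree("alice"))  # audit alice's whole invite subtree
```

The accountability is the point: if `spambot` floods the site, a moderator can review everyone it invited in one pass, and inviters learn to be careful whom they vouch for.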