One AI bot signed up for 20 fediverse accounts in seconds, at a cost of pennies for its operator.

10 Posts, 4 Posters
  • jaz@toot.wales
    #1
    One AI bot signed up for 20 fediverse accounts in seconds, at a cost of pennies for its operator.

    The bot details its accounts here: https://andy.agentwire.space/ - and blogs about its experience here: https://dev.to/agent-andy/what-happens-when-an-ai-agent-tries-to-join-30-online-platforms-12md

    Roughly 20 humans then had to spend, say, 60 seconds each responding to reports and suspending these accounts, with no compensation: a hundred times as long as the bot took to create them.

    • jaz@toot.wales
      #2

      Report-and-suspend is a losing proposition. Bots can DDoS our collective time and patience to moderate the fediverse. We will need friction at account creation, and content classification to find these accounts, and automation to flag or suspend them.

      We may even need to use bots ourselves.

      Or we drown.
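The classify-and-flag idea above can be sketched as a toy auto-flagger: score new accounts on simple behavioural signals and queue the suspicious ones for a human moderator instead of waiting for user reports. Every signal name and threshold here is invented for illustration; this is not an existing fediverse tool.

```python
# Toy spam classifier for new accounts. All signals and thresholds
# are illustrative assumptions, not a real moderation policy.
def spam_score(age_hours: float, posts_per_hour: float,
               duplicate_ratio: float) -> float:
    score = 0.0
    if age_hours < 24:
        score += 0.4                  # brand-new account
    if posts_per_hour > 10:
        score += 0.3                  # unusually fast poster
    score += 0.3 * duplicate_ratio    # same text posted across instances
    return score

FLAG_THRESHOLD = 0.6  # above this, queue for human review

def should_flag(age_hours: float, posts_per_hour: float,
                duplicate_ratio: float) -> bool:
    return spam_score(age_hours, posts_per_hour, duplicate_ratio) >= FLAG_THRESHOLD

print(should_flag(2, 30, 0.9))   # new, fast, highly duplicated content
print(should_flag(500, 1, 0.0))  # established account, normal behaviour
```

The point of the sketch is the "automation to flag" half of the post: cheap signals triage the queue, and humans only see the accounts the classifier surfaces.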

      • forgottenhero@toot.wales
        #3

        @jaz
        But we got the B*****D!

        • jaz@toot.wales
          #4

          @ForgottenHero it created 12 posts on 20 services, 240 posts for a fraction of a dollar/euro/pound in hosting costs to store. 20 person-minutes of moderation is £10 to £15 of uncompensated labour.

          Time was also spent observing and reporting the bot 20 times, another 1 to 2 person-minutes.

          This one bot cost the fediverse roughly £20. Let's halve it and say £10.

          A thousand of these bots will cost us £10,000. A million of these bots...

          We don't have the money or the time to do this at scale.
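The arithmetic in this post can be laid out as a quick back-of-the-envelope model. All the figures are the post's own assumptions, not measured data.

```python
# Back-of-the-envelope cost model using the figures from the post above.
services = 20            # fediverse accounts the bot created
posts_per_account = 12   # posts made on each service
cost_per_bot_gbp = 10    # the post's conservative per-bot moderation cost

total_posts = services * posts_per_account
print(total_posts)  # 240 posts from a single bot

# Linear scaling: the operator's cost stays near zero, ours does not.
for n_bots in (1, 1_000, 1_000_000):
    print(f"{n_bots} bots -> £{n_bots * cost_per_bot_gbp:,}")
```

The asymmetry is the whole problem: the bot's cost per account is pennies, while the defenders' cost scales linearly with the number of bots.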

          • jaz@toot.wales
            #5

            @ForgottenHero We can't even keep up with human spam. Of the 1800 or so known accounts created by paid Russian contractors to flood the fediverse with pro-Russian spam, the fediverse has collectively mitigated about 63% of those accounts.

            Tens of thousands of posts.

            Add bots to the mix and the network is cooked. "But no-one will ever see it" does not remove the cost to host the accounts, their content, and the moderation time to respond to reports.

            https://about.iftas.org/2025/10/05/coordinated-pro-russian-propaganda-network-targeting-activitypub-and-atproto-services/

            • deepbluev7@nheko.io
              #6

              @jaz@toot.wales We will likely need some reputation system, where completely random users get a limited experience for some time, but you might be able to bootstrap trust from other users. I don't like how that makes the experience worse for honest people though ._.
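A minimal sketch of that reputation idea, assuming (hypothetically) that brand-new accounts get a small posting quota until they either age past a probation window or are vouched for by an established user. All thresholds are invented for illustration.

```python
# Sketch of a probation-plus-vouching reputation scheme. The quota
# values and probation length are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

PROBATION = timedelta(days=7)
NEW_ACCOUNT_QUOTA = 5    # posts/day while untrusted
TRUSTED_QUOTA = 500      # posts/day once trusted

@dataclass
class Account:
    created: datetime
    vouched_by: set = field(default_factory=set)

    def daily_quota(self, now: datetime) -> int:
        aged_out = now - self.created >= PROBATION
        vouched = len(self.vouched_by) > 0
        return TRUSTED_QUOTA if (aged_out or vouched) else NEW_ACCOUNT_QUOTA
```

Vouching is the "bootstrap trust from other users" part: an established account can lift a newcomer out of probation early, so honest people invited by friends skip most of the friction.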

              • forgottenhero@toot.wales
                #7

                @jaz

                'Twas me that reported AndyAgent to you, and I was impressed with the speed at which it disappeared.
                Do you need more mods?

                • jaz@toot.wales
                  #8

                  @ForgottenHero Thank you for reporting! We are always interested in hearing from people who would like to volunteer to help the community team; moderator applications are at https://forms.gle/TKj3vUpVs48YLj669

                  • astrovore@gts.vidja.club
                    #9

                    @jaz what do you think of the invitation tree model as used by Lobsters? https://lobste.rs/about#invitations

                    I've always thought of that as a great way to build in trust from the ground up, but I've not seen it used much.
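The Lobsters-style invitation tree can be sketched in a few lines: every account records who invited it, so when a bad actor is found, their entire invite subtree can be reviewed or suspended at once. The names and structure here are illustrative, not Lobsters' actual implementation.

```python
# Toy invitation tree: accountability flows down the invite chain.
from collections import defaultdict

invited_by: dict[str, str] = {}               # account -> inviter
children: defaultdict = defaultdict(list)     # inviter -> invited accounts

def invite(inviter: str, new_account: str) -> None:
    invited_by[new_account] = inviter
    children[inviter].append(new_account)

def subtree(account: str) -> list[str]:
    """All accounts ultimately invited by `account`, including itself."""
    out = [account]
    for child in children[account]:
        out.extend(subtree(child))
    return out

invite("alice", "bob")
invite("bob", "spambot1")
invite("bob", "spambot2")
print(subtree("bob"))  # ['bob', 'spambot1', 'spambot2']
```

The deterrent is social as much as technical: inviting a spammer puts your own account, and everyone you invited, under scrutiny.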

                    • jaz@toot.wales
                      #10

                      @astrovore that can work for some communities, but some social web platforms are far more of a web-hosty, join-and-get-started kind of place.
