I used to see a lot of Mastodon posts about folks working to poison unwanted AI training on their stuff.

user47@vmst.io
#1

I used to see a lot of Mastodon posts about folks working to poison unwanted AI training on their stuff. I don’t see those posts anymore. Why not? I loved it.

#AIshit #LLM #resist

shibaprasad@mstdn.party
#2

@User47 Is there a clear way to do that?

user47@vmst.io
#3

@shibaprasad yeah, there was a lot of talk about a project I think was called Nightshade that could totally wreck AI training. For example, an AI might want to scrape an image of a car but walk away convinced it was an asparagus. Same with text somehow. It was so cool.

Also there were like… traps? An AI crawler could get stuck processing nonsense.
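
The "traps" mentioned here are usually described as crawler tarpits: pages of auto-generated filler that link to more of the same, so a scraper keeps fetching worthless text (the "Markov nonsense pages" referenced in the next reply). As a rough illustration only, below is a minimal Markov-chain gibberish generator in Python; the corpus filename and all function names are hypothetical, and none of this is taken from Nightshade or any specific tarpit project.

import random

def build_chain(corpus: str, order: int = 2) -> dict:
    """Map each `order`-word prefix to the words observed to follow it."""
    words = corpus.split()
    chain = {}
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain.setdefault(prefix, []).append(words[i + order])
    return chain

def generate(chain: dict, order: int = 2, length: int = 200) -> str:
    """Random-walk the chain to emit plausible-looking but meaningless text."""
    prefix = random.choice(list(chain))
    out = list(prefix)
    for _ in range(length):
        followers = chain.get(tuple(out[-order:]))
        if not followers:                      # dead end: restart from a random prefix
            prefix = random.choice(list(chain))
            out.extend(prefix)
            continue
        out.append(random.choice(followers))
    return " ".join(out)

if __name__ == "__main__":
    # "seed.txt" is a placeholder for any plain-text corpus you already have.
    chain = build_chain(open("seed.txt", encoding="utf-8").read())
    # A tarpit would serve pages of this, each linking to more of the same.
    print(generate(chain))

A tarpit would typically serve output like this from a deliberately slow web endpoint, so each request wastes the crawler's time as well as its storage.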


slyborg@vmst.io
#4

@User47 Essentially performative. The percentage of people actually trying this is negligible, and training data will be sanitized for obvious garbage before being used. And the models are already highly capable; you’re not going to make a model stupid with some Markov nonsense pages.

I think vocal pushback from a lot of people is a much more effective way to take a stand against the overuse of AI.

relay@relay.infosec.exchange shared this topic