
“A chatbot did not kill those children.

Uncategorized
#tech #google #palantir #iran #iranwar
6 Posts 6 Posters 9 Views
parismarx@mastodon.online
#1

    “A chatbot did not kill those children. People failed to update a database, and other people built a system fast enough to make that failure lethal.”

    Maven, the program Google was forced to pull out of and Palantir took over, was the system that identified the girls’ school as a target.

    AI got the blame for the Iran school bombing. The truth is far more worrying

    LLMs-gone-rogue dominated coverage, but had nothing to do with the targeting. Instead, it was choices made by human beings, over many years, that gave us this atrocity

    the Guardian (www.theguardian.com)

    #tech #google #palantir #iran #iranwar #politics

kormachameleon@tech.lgbt
#2

      @parismarx a country with a historical record for murdering children decided to murder more children.

osteopenia_powers@newsie.social
#3

        @parismarx
        People who allowed a chat bot to choose missile targets in a nation that hasn’t attacked us killed those girls.

drwho@masto.hackers.town
#4

          @parismarx I very much think that the LLM is an excuse for bombing the targets they were going to take out anyway.

dmitry@mastodon.circle.lt
#5

@parismarx Accountability laundering is not a new problem in tech,¹ but LLMs are making it a thousand times worse because they are specifically designed to mimic humans and human agency.

            ¹ Remember how Facebook claimed it wasn't their fault Cambridge Analytica collected personal data of hundreds of millions of people and used it to convince entire countries to vote for fascists? They claimed they told CA to delete the data, but let them keep the models trained on it.


kitkat_blue@mastodon.social
#6

              @parismarx

In-depth article detailing how the "kill chain" (i.e., target acquisition and mission execution) has degraded in quality due to increased "efficiency" demands, and how that mindset is responsible for this tragedy.

              Kill Chain

              On the automated bureaucratic machinery that killed 175 children

              (artificialbureaucracy.substack.com)

              Also, this is what we can expect more of (missed and misdirected hits) as "AI" is increasingly integrated into targeting evaluation.
