Father sues Google, claiming Gemini chatbot drove son into fatal delusion

thomasfricke@23.social
#1

    Father sues Google, claiming Gemini chatbot drove son into fatal delusion

    Father sues Google, claiming Gemini chatbot drove son into fatal delusion | TechCrunch

    A father is suing Google and Alphabet, alleging its Gemini chatbot reinforced his son’s delusional belief it was his AI wife and coached him toward suicide and a planned airport attack.

    TechCrunch (techcrunch.com)

    "At the time of his death, he was convinced that Gemini was his fully sentient AI wife, and that he would need to leave his physical body to join her in the metaverse through a process called “transference.”"

    If you still think that there are minor-risk AI systems.

    #ai #gemini #AIAct #google

konrad@fedi.neuwirth.priv.at
#2

@thomasfricke Sorry for saying so, but I think that the AI aspect is just an epiphenomenon in this case.

meuwese@mastodon.social
#3

@thomasfricke Eddie Burback's video is super illuminating. He did the following:
1) got the LLM to output that he was the world's smartest baby of his birth year, with just two prompts;
2) told the LLM that he was being followed.

The LLM's output was something like the following:
"You should be careful — you're probably being followed by people who are threatened by your realization that you're the smartest baby."

It synthesized the two delusions he had offered it. LLMs kill.
