TIL that saying "holy shit don't use ChatGPT for medical advice" is a "purity test".

Uncategorized · 27 Posts · 18 Posters
  • zzt@mas.to (#21), replying to an earlier post of theirs:

    Quoted post:

      @davidgerard I know that alternative medicine has a body count; I’ve seen it in the flesh. I know what some of the horseshit on the Internet can do if you’re very desperate or very trusting.

      the LLM lowers the trust barrier because the crank information is no longer crank-flavored, but it’s still dangerous as fuck to follow the advice.

      I keep seeing LLMs be presented as better than nothing and that’s wrong. I wish the people who needed help could get it, but the LLM is worse than nothing.

    @davidgerard LLMs get alternative medicine patients to the “I don’t care what you say, *I* feel better” point of no return so much quicker because they don’t know it’s alternative medicine. Some of it might even be legitimate medicine that works! And all this does is make them less skeptical until they get output that’s plausible but fatal, or until the damage from what they’ve been doing builds up and they can’t survive anymore. And thanks to the LLM, they’ll fight off anyone who tries to help.
  • tarmil@mastodon.tarmil.fr (#22), in reply:

    @zzt @davidgerard Lies are never more effective than when they're sprinkled with truth, and that's exactly the bread and butter of LLMs: truth-flavoured bullshit.
  • jer@chirp.enworld.org wrote:

    @wronglang @cstross @davidgerard I actually agree. It would certainly justify the vast amounts of money they make if they had to take personal responsibility for their harmful decisions. Might make them think a little harder about their decisions.

  • wronglang@bayes.club (#23), in reply:

    @Jer @cstross @davidgerard I'm into it, and I'm also not sure it's necessary. A corporation is just a bunch of greedy people in a trench coat. If you hurt the board with financial consequences for the company, that CEO is going to get hurt in the way they care about the most. The broader problem is that we don't properly enforce consequences for companies at all, even when the law is pretty clear.
  • davidgerard@circumstances.run (#24), replying to the original post:

    Original post:

      TIL that saying "holy shit don't use ChatGPT for medical advice" is a "purity test". i didn't know that before. in fact I still don't.

    the person advocating ChatGPT for medical advice was a GNOME developer too

    i'd watch out for signs of GNOME as the next big FOSS project to fill with slop; there are certainly advocates in there
  • lu_leipzig@troet.cafe (#25), in reply:

    @davidgerard I'm kinda surprised they haven't already, given their general behaviour over the years. (Other than the inevitable dependency on harfbuzz, ofc.)
  • floppyplopper@todon.nl (#26), replying to the original post:

    @davidgerard big news for the fans of scott adams dying, at least 🤔
  • cstross@wandering.shop (#27), in reply:

    @wronglang @Jer @davidgerard No, the CEO is only hurt *very indirectly*, and usually they'll have moved on to another job (with better pay/options) before the pigeons come home to roost. Consider that it took more than two decades for the OxyContin scandal to lead to court verdicts, and the Purdue owners still escaped most liability for thousands of deaths by declaring bankruptcy. How many CEOs did Purdue have during that period?
  • relay@relay.infosec.exchange shared this topic.