Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit?

Uncategorized · 132 Posts, 72 Posters
• emilymbender@dair-community.social (original post):

  Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**

  Link: "Why you should refuse to let your doctor record you"
  By: Emily M. Bender and Decca Muldowney — "At a recent appointment, Emily’s physical therapist (who knows some about her research) said, 'Before we get started,...'" (buttondown.com)

• mlanger@mastodon.world replied (#111):

  @emilymbender @DevlinLeathercraft The orthopedic surgeon who will be taking care of my trigger thumb asked to record our last session. I can't remember whether I asked him if an AI was going to transcribe it, but I will next time.
• starluna@mastodon.social wrote:

  @WhiteCatTamer @EverydayMoggie @emilymbender In California, if you refuse, they are legally obligated not to record. California is a two-party consent state. You cannot record anyone's voice for any reason without their consent.

• kierkegaanks@beige.party replied (#112):

  @EverydayMoggie @starluna @WhiteCatTamer @emilymbender enter smart glasses disruption
• janef0421@mastodon.nz wrote:

  @emilymbender My biggest concern is the potential for psychiatric violence. Inaccurate medical notes produced by these systems could very easily be used as evidence of psychosis or some other kind of psychopathology, leading to forced medical treatment. Having already experienced some of that system, it really worries me. I don't let medical providers use these systems with me.

• retreival9096@hachyderm.io replied (#113):

  @janef0421 @emilymbender

  I just read (in a JAMA newsletter; I'll try to track it down -- it's not in my email or trash) about a doctor who has been an early adopter. He did it "right," going over the notes in the evening to clean up the errors in transcription.

  He found:
  1) He could just focus on the patient rather than the screen.
  2) He got off track and was less focused, and spent more time with the patients without providing better information.
  3) Most importantly, when someone came back six months later for a follow-up, he realized that the notes were not that good. Accurate, but without insight -- they read like someone else had written them and did not help him recall what was going on.

• emilymbender@dair-community.social replied (#114):

  @Retreival9096 @janef0421

  Is it this one?
  https://mastodonapp.uk/@LPhilpott/116453668612462241
• randocity@mstdn.social wrote:

  @M3L155A @meltedcheese @emilymbender For clarification, when I say that insurers aren't deriving benefits from AI directly, I mean specifically the AI that's being used in doctors' offices, learning from patient recordings.

  It is very possible, however, that insurance companies are using AI in their own internal systems, but those AI systems are entirely separate from the AI used in doctors' office patient systems.

• meltedcheese@c.im replied (#115):

  @randocity @M3L155A @emilymbender For now, they are separate. What could possibly go wrong if they become part of automating the workflow between providers and insurance companies? Quite a bit, from the patient's POV.

• retreival9096@hachyderm.io replied (#116):

  @emilymbender @janef0421

  That's it! Thanks -- and clearly you've already seen it. And I misremembered where I saw it. 🙂

• retreival9096@hachyderm.io replied (#117):

  @emilymbender @janef0421

  I've just passed your paper and Dr. Gooch's along to the most recent doctor to ask me about using an AI scribe. It gives them some heads-up, as well as data, when responding to management.

• everydaymoggie@sfba.social replied (#118):

  I didn't notice any visible recording devices in the office. At the time, I assumed this meant the recording capability was just software running on their existing medical computers, which would mean there was no simple way to see whether it was recording you or not.

  @Kierkegaanks @starluna @WhiteCatTamer @emilymbender

• starluna@mastodon.social replied (#119):

  @EverydayMoggie @Kierkegaanks @WhiteCatTamer @emilymbender That is definitely what makes all of these tools problematic. I would like to believe that somebody in risk management has raised this flag, at least for the California medical facilities, but so many of those people are also captured by the AI hype that they've lost all of their critical faculties.
• bunny_jane@plush.city replied (#120):

  @LPhilpott @emilymbender "Now, six weeks later, I was reading someone else’s account of a consultation I had conducted — and I couldn’t recall the patient clearly enough to reconstruct what had been left out."

  This part struck me because I hadn't even considered the problems of trying to use notes you didn't write. It's an extra chance for misunderstanding.

  And the doctor said earlier in the piece that the notes were accurate, but here he admits he can't be sure about that.
• forestine@sunny.garden wrote:

  @emilymbender I wrote a note to my medical clinic addressing similar concerns when I saw the AI sign in the office, but I have medical anxiety and didn't feel up to addressing it at the time. The passive sign assumed consent. The office assistant replied and said they could put a permanent note on my chart that I did not consent to the AI scribe.

  Then the next time my doctor called, he acted like his feelings were hurt and he had thought I would have told him to his face, and then made me feel guilty about refusing the AI assistant due to his workload. Now I'm feeling hesitant to see him, even though he's my new doctor that I liked.

• retreival9096@hachyderm.io replied (#121):

  @forestine @emilymbender

  Guilting you is not a good sign. I clearly don't know all the facts, but trust your feelings and don't let someone pressure you.

  You might send copies of Dr. Bender's and Dr. Gooch's essays (elsewhere in this thread) to him and suggest you are trying to help him with his workload by not letting him get sucked into "AI" hype.

• moira@mastodon.murkworks.net replied (#122):

  @emilymbender Yes, and I certainly decline. Fortunately, I have a good relationship with my GP, so it hasn't been an issue so far.
• emilymbender@dair-community.social wrote:

  @P__X You are not restricted in space -- you wrote a whole thread.

  My point is: if patients do not know what they are consenting to, it is not consent. If it is not possible in the context of the visit to convey the detail, then we shouldn't do the thing.

  I encourage you to read the rest of the replies to my post, including the quotes, to see the lack of consent and how that is landing.

• p__x@mastodon.social replied (#123):

  @emilymbender

  Frankly, I'm surprised and **disappointed** by your eagerness to jump to conclusions and make biased inferences. E.g.: "an AI scribe will change how physicians speak," but *character limits don't impact how people write here*. That sets how seriously I should take this.

  My inference: you've had minimal input from actual providers familiar with the app (points #4 and #7 were dead giveaways) or who have spent >10,000 hours writing notes (even #9 seems to be from a non-provider).

  No thank you.
• kelleynnn@mas.to replied (#124):

  @countablenewt @emilymbender For longer than the technology has actually existed, I'll bet 😆 🤡

• countablenewt@mastodon.social replied (#125):

  @kelleynnn @emilymbender Not exactly sure what you mean there.

  Non-deterministic language models for voice recognition have existed at least since the 90s.

• kelleynnn@mas.to replied (#126):

  @countablenewt @emilymbender What I "exactly" mean is that you're trying to troll and shame the OP, and you're probably distorting the actual technology and history to do it -- for example, by insinuating that the problematic tech under discussion is really nothing new. You asked for blowback; you got some.

• kelleynnn@mas.to replied (#127):

  @countablenewt @emilymbender Why tf am I wasting time on you? Bye

• countablenewt@mastodon.social replied (#128):

  @kelleynnn @emilymbender I'm being very specific with "non-deterministic language models for voice recognition."

  Here's an IEEE paper on exactly what I'm referencing, from *1995*:
  ieeexplore.ieee.org/document/479408

• jbluespruce@mstdn.social replied (#129):

  @emilymbender Excellent post. I worked for many years in healthcare. I know firsthand the incredible pressures on providers to find the time they need to give high-quality care while completing all of their administrative tasks. So I get why these AI tools are attractive. I have consented to have providers use them in my care. But I won't any longer. The problems you describe are serious and potentially dangerous. I appreciate the perspective that documenting is part of care.

• countablenewt@mastodon.social replied (#130):

  @kelleynnn @emilymbender Genuinely unsure of what you think I'm trying to "pull" here.

  But if you've ever used speech recognition, either on your phone or via a tool like Dragon (which is what most clinicians use), you've almost definitely used this tech before.

  And, yes, it would fall under the term "AI" and operates in a manner rather similar to something like an LLM, albeit at a much smaller scale and for a specialized purpose.