Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit?
-
@WhiteCatTamer @EverydayMoggie @emilymbender In California, if you refuse, they are legally obligated not to record. California is a two-party consent state. You cannot record anyone's voice for any reason without their consent.
@EverydayMoggie @starluna @WhiteCatTamer @emilymbender enter smart glasses disruption
-
@emilymbender My biggest concern is the potential for psychiatric violence. Inaccurate medical notes produced by these systems could very easily be used as evidence of psychosis or some other kind of psychopathology, leading to forced medical treatment. Having already experienced some of that system, it really worries me. I don’t let medical providers use these systems with me.
I just read (in a JAMA newsletter, I'll try to track it down -- it's not in my email or trash) about a doctor who has been an early adopter. He did it "right," going over the notes in the evening to clean up the errors in transcription.
He found:
1) He could just focus on the patient, rather than the screen.
2) He got off track and was less focused, and spent more time with patients without gathering better information.
3) Most importantly, when someone came back 6 months later for a follow-up, he realized that the notes were not that good. Accurate, but without insight -- they read like someone else had written them and did not help him recall what was going on.
-
@M3L155A @meltedcheese @emilymbender For clarification, when I say that insurers aren’t deriving benefits from AI directly, I mean specifically the AI that’s being used in doctors offices, learning from patient recordings.
It is very possible, however, that insurance companies are using AI in their own internal systems, but those AI systems are entirely separate from AI used in doctor’s office patient systems.
@randocity @M3L155A @emilymbender For now, they are separate. What could possibly go wrong if they become part of automating the workflow between providers and insurance companies? Quite a bit, from the patient's POV.
-
That's it! Thanks, and clearly you've already seen it. And I misremembered where I saw it.

-
I've just passed your paper and Dr. Gooch's along to the most recent doctor to ask me about using an AI scribe. It should give them some heads-up, as well as data, when responding to management.
-
I didn't notice any visible recording devices in the office. At the time I assumed this meant the recording capability was just software running on their existing medical computers, which would mean there was no simple way to see if it was recording you or not.
-
@EverydayMoggie @Kierkegaanks @WhiteCatTamer @emilymbender That is definitely what makes all of these tools problematic. I would like to believe that somebody in risk management has raised this flag, at least for the California medical facilities, but so many of those people are also captured by the AI hype that they've lost all of their critical faculties.
-
@LPhilpott @emilymbender "Now, six weeks later, I was reading someone else’s account of a consultation I had conducted — and I couldn’t recall the patient clearly enough to reconstruct what had been left out."
This part struck me because I hadn't even considered the problems of trying to use notes you didn't write. It's an extra chance for misunderstanding.
And the doc said earlier in the piece that the notes are accurate, but here he admits he can't be sure about that.
-
@emilymbender i wrote a note to my medical clinic addressing similar concerns when i saw the ai sign in the office but i have medical anxiety and didn't feel up to addressing it at the time. the passive sign assumed consent. the office assistant replied and said they could put a permanent note on my chart that i did not consent to the ai scribe.
then the next time my doctor called, he acted like his feelings were hurt and he had thought i would have told him to his face, and then made me feel guilty about refusing the ai assistant due to his workload. now i'm feeling hesitant to see him even though he's my new doctor that i liked
Guilting you is not a good sign. I clearly don't know all the facts, but trust your feelings and don't let someone pressure you.
You might send copies of Dr. Bender's and Dr Gooch's (elsewhere in this thread) essays to him and suggest you are trying to help him with his workload by not letting him get sucked into "AI" hype.
-
Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**
Why you should refuse to let your doctor record you
By: Emily M. Bender and Decca Muldowney At a recent appointment, Emily’s physical therapist (who knows some about her research) said, “Before we get started,...
(buttondown.com)
@emilymbender Yes, and I certainly decline. Fortunately, I have a good relationship with my GP, so it hasn't been an issue so far.
-
@P__X You are not restricted in space -- you wrote a whole thread.
My point is: if patients do not know what they are consenting to, it is not consent. If it is not possible in the context of the visit to convey the detail, then we shouldn't do the thing.
I encourage you to read the rest of the replies to my post, including the quotes, to see the lack of consent and how that is landing.
Frankly, I'm surprised & **disappointed** by your eagerness to jump to conclusions and make biased inferences. Eg: "an AI scribe will change how physicians speak", but *character limits don't impact how ppl write here*. That sets how seriously I should take this.
My inference: you've had minimal input from actual providers familiar w/ the app (point #4 and 7 were dead giveaways) or who spent >10,000 hours writing notes (even #9 seems to be from a non-provider).
No thank you.
-
@countablenewt @emilymbender For longer than the technology has actually existed, I'll bet

-
@kelleynnn @emilymbender Not exactly sure what you mean there
non-deterministic language models for voice recognition have existed at least since the 90s
-
@countablenewt @emilymbender What I "exactly" mean is that you're trying to troll and shame the OP, and you're probably distorting the actual technology and history to do it--for example, by insinuating that the problematic tech under discussion is really nothing new. You asked for blowback, you got some.
-
@countablenewt @emilymbender Why tf am I wasting time on you? Bye
-
@kelleynnn @emilymbender I'm being very specific with "non-deterministic language models for voice recognition"
Here's an IEEE paper on exactly what I'm referencing from *1995*
ieeexplore.ieee.org/document/479408
-
@emilymbender Excellent post. I worked for many years in healthcare. I know firsthand the incredible pressures on providers to find the time they need to give high-quality care while completing all of their administrative tasks. So I get why these AI tools are attractive. I have consented to have providers use them in my care. But I won’t any longer. The problems you describe are serious & potentially dangerous. I appreciate the perspective that documenting is part of care.
-
@kelleynnn @emilymbender genuinely unsure of what you think I'm trying to "pull" here
But like if you've ever used speech recognition either on your phone or via a tool like Dragon (which is what most clinicians use) you've almost definitely used this tech before
And, yes, it would fall under the term "AI" and operates in a manner rather similar to something like an LLM, albeit at a much smaller scale and for a specialized purpose
-
@emilymbender I appreciate this information. My PCP is part of Concord Hospital in NH and he asked me at my last visit if I was ok with it.
I asked if Elon Musk or any other fascist would have access to it and he laughed and said he wouldn’t use it if it did. He claimed it’s only in their internal system so I said ok.
But don’t the providers prefer the human scribe because they also serve as a chaperone/witness?