Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit?
-
@P__X You are not restricted in space -- you wrote a whole thread.
My point is: if patients do not know what they are consenting to, it is not consent. If it is not possible in the context of the visit to convey the detail, then we shouldn't do the thing.
I encourage you to read the rest of the replies to my post, including the quotes, to see the lack of consent and how that is landing.
Frankly, I'm surprised & **disappointed** by your eagerness to jump to conclusions and make biased inferences. E.g.: "an AI scribe will change how physicians speak", but *character limits don't impact how ppl write here*. That sets how seriously I should take this.
My inference: you've had minimal input from actual providers familiar w/ the app (points #4 and #7 were dead giveaways) or who spent >10,000 hours writing notes (even #9 seems to be from a non-provider).
No thank you.
-
@countablenewt @emilymbender For longer than the technology has actually existed, I'll bet

-
@kelleynnn @emilymbender Not exactly sure what you mean there
non-deterministic language models for voice recognition have existed at least since the 90s
-
@countablenewt @emilymbender What I "exactly" mean is that you're trying to troll and shame the OP, and you're probably distorting the actual technology and history to do it--for example, by insinuating that the problematic tech under discussion is really nothing new. You asked for blowback, you got some.
-
@countablenewt @emilymbender Why tf am I wasting time on you? Bye
-
@kelleynnn @emilymbender I'm being very specific with "non-deterministic language models for voice recognition"
Here's an IEEE paper on exactly what I'm referencing from *1995*
ieeexplore.ieee.org/document/479408
-
Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**
Why you should refuse to let your doctor record you
By: Emily M. Bender and Decca Muldowney
At a recent appointment, Emily’s physical therapist (who knows some about her research) said, “Before we get started,...
(buttondown.com)
@emilymbender Excellent post. I worked for many years in healthcare. I know firsthand the incredible pressures on providers to find the time they need to give high-quality care while completing all of their administrative tasks. So I get why these AI tools are attractive. I have consented to have providers use them in my care. But I won’t any longer. The problems you describe are serious & potentially dangerous. I appreciate the perspective that documenting is part of care.
-
@kelleynnn @emilymbender genuinely unsure of what you think I'm trying to "pull" here
But like if you've ever used speech recognition either on your phone or via a tool like Dragon (which is what most clinicians use) you've almost definitely used this tech before
And, yes, it would fall under the term "AI" and operates in a manner rather similar to something like an LLM, albeit at a much smaller scale and for a specialized purpose
-
@emilymbender I appreciate this information. My PCP is part of Concord Hospital in NH and he asked me at my last visit if I was ok with it.
I asked if Elon Musk or any other fascist would have access to it and he laughed and said he wouldn’t use it if it did. He claimed it’s only in their internal system so I said ok.
But don’t the providers prefer the human scribe because they also serve as a chaperone/witness?
-
So far, I've been able to politely decline. Not sure how long that will last.
@jrdepriest @emilymbender I actually suspect that Dartmouth Hitchcock in New Hampshire does it without asking. A couple years ago I looked at the notes after a gynecology visit and some of the things she said I said were accurate but the specific words she used were weird.
Like it said “patient states that these symptoms began when she was a little girl”. I never use “little girl” when describing myself as a child. I said “kid”. Maybe I’m being overly suspicious but it seems like a computer at a women’s health visit would be more likely to change “kid” to “little girl” than a grown ass gynecologist would.
-