Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit?
-
@Mikal @emilymbender
It's pinky promised by LLM vendors to the healthIT vendors selling to hospitals that data sent to their servers as LLM input is not retained or used outside of inference. HealthIT vendors are then happy to take this at face value so they can sell AI hype to hospital execs desperate to force their clinicians to shove more patients through at any cost and slash labor costs.
Docs are told it's fine by IT who was told it's fine by the CIO who was told...
@Mikal @emilymbender
Just to add a little more here: I listened to the R&D head of a large, allegedly monopolistic EHR vendor tell a room full of physicians and hospital CXOs (thousands of them) that they need to be less cautious and go full speed ahead in their adoption of AI. It's being pushed really, really hard in healthcare software.
-
@meltedcheese @emilymbender It’s very likely this feature was introduced into the medical office patient management software. It’s likely being pushed hard by the developers. It might even offer a kickback scenario for the doctors who record the most. Doctors are not going to argue with free money, but they will argue with patients if they stand to lose that kickback money.
This suggests a deeper journalistic dive into that patient mgmt. software might be justified.
@randocity @meltedcheese @emilymbender my medical defence insurer encourages its use because evidence shows lower litigation rates in consultations with scribes. The study was done using human scribes, predates the AI era.
-
Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**
Why you should refuse to let your doctor record you
By: Emily M. Bender and Decca Muldowney
At a recent appointment, Emily’s physical therapist (who knows some about her research) said, “Before we get started,...
(buttondown.com)
@emilymbender for a doctors perspective on the more profound side effects of “efficiency”
I Was an Enthusiastic Early Adopter of AI Scribes. Here’s Why I Stopped
A GP reflects on what eighteen months of ambient scribing taught them about the consultation they thought they already understood.
(benngooch.substack.com)
“I felt myself becoming a passive observer in encounters where I had previously been an active architect. I felt my clinical memory, my narrative identity, and my sense of connection to my patients beginning to erode at the edges.”
-
@emilymbender psychiatry did it without informed consent. I am livid
-
@randocity @jrdepriest @emilymbender
I think that kind of depends on things like state laws. California, for example, is a two-party consent state, so I think recording someone without asking might actually be a criminal offense. Plus they have to have some sort of device and that is likely to be visible. Either way, I think that's why we need to push back immediately and make sure they understand that this is not acceptable.
@Mikal @jrdepriest @emilymbender The problem is even knowing the doctor’s office recorded the interaction, other than via a whistleblower. If the recording is transcribed and then discarded by the doctor, how would a patient ever know? Once doctors realize they basically can’t get caught doing it, what or who will stop them?
Insurance companies aren’t going to care or even ask if the doctors collected their recordings illegally.
-
@randocity @meltedcheese @emilymbender my medical defence insurer encourages its use because evidence shows lower litigation rates in consultations with scribes. The study was done using human scribes, predates the AI era.
@M3L155A @meltedcheese @emilymbender AI hasn’t yet proven itself to be a reliable or useful witness in legal cases. I’m not even sure a lawyer has yet tried using an AI as a witness.
It is possible that having audio recordings of patient interactions could prove useful in courts, but that implies that doctors are being sued more now than in the past.
Insurers don’t derive benefits from AI directly, so I don’t understand this push.
-
@M3L155A @meltedcheese @emilymbender AI hasn’t yet proven itself to be a reliable or useful witness in legal cases. I’m not even sure a lawyer has yet tried using an AI as a witness.
It is possible that having audio recordings of patient interactions could prove useful in courts, but that implies that doctors are being sued more now than in the past.
Insurers don’t derive benefits from AI directly, so I don’t understand this push.
@M3L155A @meltedcheese @emilymbender For clarification, when I say that insurers aren’t deriving benefits from AI directly, I mean specifically the AI that’s being used in doctors offices, learning from patient recordings.
It is very possible, however, that insurance companies are using AI in their own internal systems, but those AI systems are entirely separate from AI used in doctor’s office patient systems.
-
@Mikal @jrdepriest @emilymbender The problem is even knowing the doctor’s office recorded the interaction, other than via a whistleblower. If the recording is transcribed and then discarded by the doctor, how would a patient ever know? Once doctors realize they basically can’t get caught doing it, what or who will stop them?
Insurance companies aren’t going to care or even ask if the doctors collected their recordings illegally.
@Mikal @jrdepriest @emilymbender The only place a doctor might get caught doing it is if they produce a recording in a court of law as part of a legal case. The problem is, patient recordings would be considered protected health information under HIPAA and may be inadmissible if proper procedures are not followed.
There are definitely procedures to follow when introducing HIPAA-covered data into a court case, such as giving the patient a chance to object.
-
@emilymbender Funnily enough, transcribing can preserve privacy without issues. Whisper.cpp runs decently well on phones, and can be run on servers that process patient records under the same security constraints. Could easily be run locally even.
Problem is, that’s extremely hard to prove in the current „just slap a gear on it and call it steampunk” climate. I would definitely not trust a random provider.
And if they do „summarization”, forget privacy.
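To make the local-only claim concrete, here is a minimal sketch of fully offline transcription with whisper.cpp. It assumes a built checkout and a downloaded ggml model; the binary name, build system, and file paths vary between versions of the project, so treat this as illustrative rather than a definitive recipe:

```shell
# Illustrative only: build whisper.cpp and transcribe a recording entirely
# on-device. No audio or text leaves the machine; there are no network
# calls at inference time, only the one-time model download.
git clone https://github.com/ggerganov/whisper.cpp
cd whisper.cpp
make                                     # builds the CLI (named main or whisper-cli, depending on version)
./models/download-ggml-model.sh base.en  # one-time fetch of a small English-only model
# Transcribe a local WAV file; -otxt writes the transcript next to the input:
./main -m models/ggml-base.en.bin -f consultation.wav -otxt
```

The point is not this exact toolchain, but that usable speech-to-text no longer requires shipping audio to a third-party server; the hard part, as noted above, is a patient being able to verify that's what their provider actually runs.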
-
@P__X Your experience is your experience, but I am **appalled** at what you're saying about consent here. The fullest version would be too long, so we're not actually doing informed consent? No thank you.
@emilymbender Agreed. For any points that were valid, none of them necessitate the use of LLMs. Never mind without consent. Disgusting.
-
@commonst @emilymbender Medical providers are hardly ones to point fingers at patients for being tech-naïve. Providers, and the medical industry in general, are notoriously the worst at being informed about tech; worse than any industry short of lawyers. That's actually why HIPAA exists.
@randocity @emilymbender I am in Canada. No HIPAA, but we do tend to go where the US goes on a lot of things.
-
@P__X Your experience is your experience, but I am **appalled** at what you're saying about consent here. The fullest version would be too long, so we're not actually doing informed consent? No thank you.
@emilymbender "The fullest version would be too long, so we're not actually doing informed consent?"
No, that is not what is being said there. Unlike a blog post, I am restricted in space. I explicitly said that it is a valid concern. A basic research consent form is 8+ pages of legalese, and I'm afraid the future solution will be to add it as a checkbox for 30 pages of text at check-in that nobody reads and that doesn't actually inform better. And again, my point #1.
-
@emilymbender Agreed. For any points that were valid, none of them necessitate the use of LLMs. Never mind without consent. Disgusting.
1) Consent is always obtained (and documented). The ideal way/length/detail to do it is up for debate. A 30 page EULA (if this is outsourced to the legal department) will not provide better informed consent, however. The Sutter lawsuit might propel better regulation and policies.
2) Nothing necessitates the use of LLMs. It doesn't mean that it can't be helpful in certain use cases, which I spent my time to point out hoping for a convo and not selective dismissal.
-
@netopwibby Oof -- so she asked if you were okay being recorded but did not provide info on what was going to happen to the recording?
@emilymbender Had a similar experience to @netopwibby's, with a cardiologist; I am in Canada. But at the last appointment she didn’t seem to use it? Will try to remember to ask her about it next time.
-
@emilymbender
I have -- and refused!
-
@emilymbender No, but the agreement they ask us to sign periodically said they might use AI. So I said I wasn't signing if they were going to. They asked the doc, and she said, "No, I don't use AI transcription at all, and I didn't know that was in there!"
-
@anne_twain @emilymbender I agree, but I imagine it will limit their liability if something happens to my data, intended or not.
I was too "invested" / tired to resist. I don't have an excuse. I will try to do better.
@BoydStephenSmithJr @anne_twain @emilymbender You do have an excuse. You are requesting care.
When I need care, and I am faced with an additional executive function burden, there are three drivers that will push me to accept:
1. They are in a position to refuse me something I need, so I have incentive to accommodate them.
2. I lack the energy to cope with the consequences of refusing. (The "too invested" problem - it takes a lot of energy to interact with medical systems, and when I'm sick, I have less energy to spare.)
3. My ability to cope with decisions is reduced when I need care - the sicker I am, the more I focus on just making it through the next step of the process to obtaining care, and the less externalities matter.
The problem isn't you not doing better. The problem is a system set up to make it as hard as possible for you to decline.
And the solution isn't you doing better when you're interacting with the system. The solution is sustained pressure by healthy people when they aren't trying to use the system.
-
@emilymbender
One of my first jobs was providing tech support to doctors in a hospital setting. They were some of the most tech-illiterate folks I've ever encountered. They have no concept of operational security.
No doctor has ever asked me for permission to store any information about me in whatever systems they're using. For all I know they store it in plain text on an insecure S3 bucket.
-
@emilymbender thankfully my therapist was like "yeah dude don't worry about it it's weird" but i still get an email alongside every 'upcoming appointment' email reminding me to sign the permission form
-
@EverydayMoggie @emilymbender That…would honestly scare me more than the initial request, I think. How are you a medical provider and you don’t know what happens when a patient refuses to consent??
@WhiteCatTamer @EverydayMoggie @emilymbender In California, if you refuse, they are legally obligated not to record. California is a two-party consent state. You cannot record anyone's voice for any reason without their consent.