Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit?
-
@M3L155A @meltedcheese @emilymbender AI hasn’t yet proven itself to be a reliable or useful witness in legal cases. I’m not even sure a lawyer has yet tried using an AI as a witness.
It is possible that having audio recordings of patient interactions could prove useful in court, but that would only matter if doctors were being sued more now than in the past.
Insurers don’t derive benefits from AI directly, so I don’t understand this push.
@M3L155A @meltedcheese @emilymbender For clarification: when I say that insurers aren’t deriving benefits from AI directly, I mean specifically the AI being used in doctors’ offices, learning from patient recordings.
It is very possible, however, that insurance companies are using AI in their own internal systems, but those systems are entirely separate from the AI used in doctors’ office patient systems.
-
@Mikal @jrdepriest @emilymbender The problem is even knowing whether the doctor’s office recorded the interaction, other than via a whistleblower. If the recording is transcribed and then discarded by the doctor, how would a patient ever know? Once doctors realize they basically can’t get caught doing it, what or who will stop them?
Insurance companies aren’t going to care or even ask if the doctors collected their recordings illegally.
@Mikal @jrdepriest @emilymbender The only place a doctor might get caught doing it is if they produce a recording in a court of law as part of a legal case. The problem is, patient recordings would fall under HIPAA compliance and may be inadmissible if proper procedures are not followed.
There are definitely procedures to follow when introducing HIPAA data into a court case, such as giving the patient a chance to object.
-
Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**
Why you should refuse to let your doctor record you
By: Emily M. Bender and Decca Muldowney At a recent appointment, Emily’s physical therapist (who knows some about her research) said, “Before we get started,...
(buttondown.com)
@emilymbender Funnily enough, transcribing can preserve privacy without issues. Whisper.cpp runs decently well on phones, and can be run on servers that process patient records under the same security constraints. Could easily be run locally even.
Problem is, that’s extremely hard to prove in the current "just slap a gear on it and call it steampunk" climate. I would definitely not trust a random provider.
And if they do "summarization", forget privacy.
-
@P__X Your experience is your experience, but I am **appalled** at what you're saying about consent here. The fullest version would be too long, so we're not actually doing informed consent? No thank you.
@emilymbender Agreed. For any points that were valid, none of them necessitate the use of LLMs. Never mind without consent. Disgusting.
-
@commonst @emilymbender Medical providers are the last ones who should point fingers at patients for being tech naïve. Medical providers, and the medical industry in general, are notoriously the worst at being informed about tech; worse than any industry short of lawyers. That’s actually why HIPAA exists.
@randocity @emilymbender I am in Canada. No HIPAA, but we do tend to go where the US goes on a lot of things.
-
@P__X Your experience is your experience, but I am **appalled** at what you're saying about consent here. The fullest version would be too long, so we're not actually doing informed consent? No thank you.
@emilymbender "The fullest version would be too long, so we're not actually doing informed consent?"
No, that is not what is being said there. Unlike a blog post, I am restricted in space. I explicitly said that it is a valid concern. A basic research consent form is 8+ pages of legalese, and I'm afraid the future solution will be to add it as a checkbox over 30 pages of text at check-in that nobody reads and doesn't actually inform better. And again, my point #1.
-
@emilymbender Agreed. For any points that were valid, none of them necessitate the use of LLMs. Never mind without consent. Disgusting.
1) Consent is always obtained (and documented). The ideal way/length/detail to do it is up for debate. A 30-page EULA (if this is outsourced to the legal department) will not provide better informed consent, however. The Sutter lawsuit might propel better regulation and policies.
2) Nothing necessitates the use of LLMs. That doesn't mean it can't be helpful in certain use cases, which I spent my time pointing out hoping for a conversation and not selective dismissal.
-
@netopwibby Oof -- so she asked if you were okay being recorded but did not provide info on what was going to happen to the recording?
@emilymbender Had a similar experience to @netopwibby's, with a cardiologist; I am in Canada. But at the last appointment she didn’t seem to use it? I will try to remember to ask her about it next time.
-
@emilymbender I have -- and refused!
-
@emilymbender No, but the agreement they ask us to sign periodically said that they might use AI. So I said I wasn’t signing if they were going to. They asked the doc, and she said: no, I don’t use AI transcription at all, and I didn’t know that was in there!
-
@anne_twain @emilymbender I agree, but I imagine it will limit their liability if something happens to my data, intended or not.
I was too "invested" / tired to resist. I don't have an excuse. I will try to do better.
@BoydStephenSmithJr @anne_twain @emilymbender You do have an excuse. You are requesting care.
When I need care, and I am faced with an additional executive function burden, there are three drivers that will push me to accept:
1. They are in a position to refuse me something I need, so I have incentive to accommodate them.
2. I lack the energy to cope with the consequences of refusing. (The "too invested" problem - it takes a lot of energy to interact with medical systems, and when I'm sick, I have less energy to spare.)
3. My ability to cope with decisions is reduced when I need care - the sicker I am, the more I focus on just making it through the next step of the process to obtaining care, and the less externalities matter.
The problem isn't you not doing better. The problem is a system set up to make it as hard as possible for you to decline.
And the solution isn't you doing better when you're interacting with the system. The solution is sustained pressure by healthy people when they aren't trying to use the system.
-
@emilymbender
One of my first jobs was providing tech support to doctors in a hospital setting. They were some of the most tech-illiterate folks I've ever encountered. They have no concept of operational security.
No doctor has ever asked me for permission to store any information about me in whatever systems they're using. For all I know they store it in plain text on an insecure S3 bucket.
-
@emilymbender thankfully my therapist was like "yeah dude don't worry about it it's weird" but i still get an email alongside every 'upcoming appointment' email reminding me to sign the permission form
-
@EverydayMoggie @emilymbender That…would honestly scare me more than the initial request, I think. How are you a medical provider and you don’t know what happens when a patient refuses to consent??
@WhiteCatTamer @EverydayMoggie @emilymbender In California, if you refuse, they are legally obligated not to record. California is a two-party consent state. You cannot record anyone's voice for any reason without their consent.
-
Yes, and of course I said no.
But then I discovered they had used AI transcription when adding notes to my record after the appointments, as the notes were full of obvious errors. So I needed to lecture them again about my right not to have it used on my medical record.
What makes this even worse is that they all know how badly it works: complaints from the medical community about horrific errors, as well as the inefficiency this overhyped piece of crap creates, are frequently reported in the media.
-
@emilymbender Agree. GPs are in low availability now; saying no to this means being viewed as difficult, maybe being ejected from the patient roster. So you can't really say no.
Also, in two visits where reports were prepared by specialists, there were errors from AI transcription mishearing things that I think a human would not have made (my age cited quite differently in different paragraphs, an operation recorded as having happened which was spoken as DID NOT happen, etc.). Correcting them required my time, my effort, and the doctor's disfavor 🫤
-
@emilymbender "The fullest version would be too long, so we're not actually doing informed consent?"
No, that is not what is being said there. Unlike a blog post, I am restricted in space. I explicitly said that it is a valid concern. A basic research consent form is 8+ pages of legalese, and I'm afraid the future solution will be to add it as a checkbox over 30 pages of text at check-in that nobody reads and doesn't actually inform better. And again, my point #1.
@P__X You are not restricted in space -- you wrote a whole thread.
My point is: if patients do not know what they are consenting to, it is not consent. If it is not possible in the context of the visit to convey the detail, then we shouldn't do the thing.
I encourage you to read the rest of the replies to my post, including the quotes, to see the lack of consent and how that is landing.
-
1. At my last vet visit, there was a small typed notice across the exam room from the person/animal seating area that said AI is now being used by the practice for all visits, and to assume that if staff are in the room, recording is happening.
2. At my last primary provider visit, I asked the medical assistant if AI was being used, and if she could opt me out. She agreed.
When the PA came into the exam room, her first words were that I needed to prioritize my questions/issues, as she would only be able to deal with two, since she would have to manually chart the whole visit.
(I had come hoping for a prescription refill and two referrals for specialist care.)
-
@emilymbender I've noticed a lot of this use in veterinary medicine recently as well, just FYI.
@rbmath @emilymbender I have noticed this too, at my local vet. They have signs at the front desk and in all the rooms about it and letting people know they can opt out. One of the problems is how people tend to react when you say you want to opt out.
-
@BoydStephenSmithJr That ... isn't really consent.
@emilymbender @BoydStephenSmithJr I actively refuse to go to a local provider chain because of the terms of their EHR software. It basically kept the right to do whatever it wanted with the data.
I always think of how few people would care enough to do that level of reading, have the education to catch the nuance, and have the resources to choose another practice.
That is not to shame anyone! It is asking a lot of an individual to do that.