https://www.reuters.com/investigations/ai-enters-operating-room-reports-arise-botched-surgeries-misidentified-body-2026-02-09/
-
Most allegedly involved errors in which the TruDi Navigation System misinformed surgeons about the location of their instruments while they were using them inside patients’ heads during operations.
I AM FUCKING SCREAMING I AM FUCKING SCREAMING I AM FUCKING SCREAMING
@atax1a gonna have to put "NO AI" in my medical record at some point
there are only a very few fields where AI makes sense in medicine (protein folding and similar very computation-heavy protein/gene related things, where the alternatives would eat orders of magnitude more processing power) and this is nowhere close to that
-
@atax1a they had me at "As AI enters the operating room, reports arise of botched surgeries [...]"
-
@atax1a And I made a joke post about "vibe surgery" just the other day...
-
@atax1a this has all the energy of https://hackaday.com/2015/10/26/killed-by-a-machine-the-therac-25/
-
@atax1a can I, like, make an advance directive saying that if I'm incapacitated and require surgery, no AI-assisted instruments can be used
like a DNR, but DNAI
-
@atax1a "Are you going to use AI tools?" feels like something you shouldn't have to ask your brain surgeon before they go to town on your noggin.
-
@atax1a Maybe they should have used radium-coated surgical instruments for better precision.
-
@atax1a
Isn't Surgeon Simulator a couple years old? This isn't exactly "news".
-
@atax1a one rather important distinction that is often lost on reporters (as part of the general public) is whether we're dealing with ML or with LLMs. I've seen my share of absolutely bonkers implementations of, well, anything, but I have a hard time believing f'ing LLMs entered the operating theatre.
I'm not decided on whether I'd prefer to die because of an ML model going off the rails or because of an old-fashioned coding error like the infamous Therac-25. I've seen code for medical software and I'm not optimistic either way.
Frankly, I prefer doctors who don't Google my symptoms during a GP visit, but I'm afraid that is an art that's dying out.
-
@atax1a We live in the stupidest timeline.
-
Thank you! If I'm correct, it's like this:
- I support algorithms. Vital automation.
- Neural nets are, if well trained, tested and efficient algorithms.
- Machine learning is a neural net training itself. Now I'm getting sceptical, demand testing, and would hope it's not left unattended to "continue learning". But all of the above is targeted at maximizing correctness of answers!!
- Then there are the LLMs: targeted at maximizing plausibility! SOUNDING correct is the goal (see the sketch below).
Oh and here's reddit, learn that. -_-
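To make that last point concrete, here is a minimal sketch with invented numbers (none of this comes from the article or the TruDi system): a next-token objective scores a continuation purely by how likely it sounds, so a fluent-but-wrong answer can outscore a correct-but-awkward one.

```python
# Toy illustration only: hypothetical probabilities a language model
# might assign to two continuations of the same prompt.
import math

p_plausible_but_wrong = 0.30   # phrasing the model has seen constantly
p_correct_but_awkward = 0.02   # the factually right answer, oddly worded

# The training objective is the negative log-likelihood of the continuation,
# so lower is "better" -- and the fluent-but-wrong answer wins:
print(-math.log(p_plausible_but_wrong))  # ~1.20
print(-math.log(p_correct_but_awkward))  # ~3.91
```

Nothing in that objective checks the answer against reality, which is the distinction the list above is drawing.
-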
@atax1a Yikes.

-
@atax1a
Oh... oh fuck.
-
@bertdriehuis @atax1a Oh, there are absolutely terrible ML implementations out there. Social workers in Denmark got a tool that was supposed to assist them in deciding whether a child should be removed from the home. The strongest feature by far was age of the child (because removing a child is the last thing you try). It was a less than useless linear model.
-
@atax1a
Yes. You *are* screaming. There is a good reason that, after opening the skull under anaesthetic, the patient is normally awake and talking to the surgical team as they hack away inside the brain. And you've just, uh, put your scalpel on it.
Generally the surgeon will be prodding and poking a particular place to cut, *before* cutting, so they can evaluate the effects on the *person* in the wet electric fat. If something produces odd effects, they look for a way around it.
-
@drgroftehauge @atax1a there are tons of bad models out there; that's a fact. ML is an opaque tool. But an ML model is easier to validate independently: biases can be shown, and results are reproducible within statistical limits. It is as much a science as statistics is, and statistics is equally abused in the domain you refer to.
The Netherlands, by the way, also has its fair share of problematic ML-based algorithms in the social domain. The biggest issue is not ML itself, but the lack of openness and independent validation. If the algorithm were written in a traditional programming language the result would not have been different (and we also have failed examples of those in our governments' recent past).
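As a rough illustration of that validation point (synthetic data and invented feature names, not the Danish tool or any Dutch system), a linear model at least lets you read the learned weights directly and see when a single feature, such as the child's age, is doing nearly all the work:

```python
# Illustrative sketch only: synthetic data standing in for the kind of
# decision-support tool described above, to show how coefficient
# inspection exposes a dominant feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
age = rng.uniform(0, 18, n)          # child age in years (made up)
other = rng.normal(size=(n, 3))      # three other case features (pure noise)

# Synthetic labels that mostly track age, mimicking the reported flaw:
y = (age + rng.normal(scale=2.0, size=n) > 12).astype(int)

X = np.column_stack([age / 18.0, other])   # scale age so weights are comparable
model = LogisticRegression(max_iter=1000).fit(X, y)

# The independent-validation step: just look at the weights.
for name, coef in zip(["age", "f1", "f2", "f3"], model.coef_[0]):
    print(f"{name:>3}: {coef:+.3f}")
# "age" carries nearly all the weight; the other features sit near zero.
```

That kind of inspection is exactly what gets lost when the model and its training data are kept closed, which is the openness problem described above.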
-
@atax1a move fast, break people.
-