https://www.reuters.com/investigations/ai-enters-operating-room-reports-arise-botched-surgeries-misidentified-body-2026-02-09/

Uncategorized · 21 Posts · 18 Posters
#1 atax1a@infosec.exchange wrote:

reuters.com (www.reuters.com)

Most allegedly involved errors in which the TruDi Navigation System misinformed surgeons about the location of their instruments while they were using them inside patients’ heads during operations.

I AM FUCKING SCREAMING I AM FUCKING SCREAMING I AM FUCKING SCREAMING
#2 chaos@gts.schizofucked.monster wrote:

      @atax1a gonna have to put "NO AI" in my medical record at some point
there are only a very few fields where AI makes sense in medicine (protein folding and similar computation-heavy protein/gene work, where the alternatives would eat an order of magnitude more processing power), and this is nowhere close to that

#3 rakslice@mastodon.social wrote:

        @atax1a they had me at "As AI enters the operating room, reports arise of botched surgeries [...]"

#4 fentiger@mastodon.social wrote:

          @atax1a And I made a joke post about "vibe surgery" just the other day...

#5 wbftw@hachyderm.io wrote:

            @atax1a this has all the energy of https://hackaday.com/2015/10/26/killed-by-a-machine-the-therac-25/

#6 aburka@hachyderm.io wrote:

              @atax1a can I, like, make an advance directive that if I'm incapacitated and require surgery that no AI-assisted instruments can be used

              like a DNR, but DNAI

#7 aburka@hachyderm.io wrote:

                @wbftw @atax1a I feel like that was back when there were consequences for doing bad things

#8 chrisp@cyberplace.social wrote:

                  @atax1a "Are you going to use AI tools?" feels like something you shouldn't have to ask your brain surgeon before they go to town on your noggin.

#9 wbftw@hachyderm.io wrote:

@aburka @atax1a indeed, although it took a while for the “FO” part of “FAFO” in that particular case.

#10 njsg@mementomori.social wrote:

                      @atax1a Maybe they should have used radium-coated surgical instruments for better precision.

relay@relay.an.exchange shared this topic.

#11 sharkattak@masto.ai wrote:

                        @atax1a
                        Isn't Surgeon Simulator a couple years old? This isn't exactly "news".

#12 bertdriehuis@infosec.exchange wrote:

@atax1a one rather important distinction that is often lost on reporters (as part of the general public) is whether we're dealing with ML or with LLMs. I've seen my share of absolutely bonkers implementations of, well, anything, but I have a hard time believing f'ing LLMs entered the operating theatre.

                          I'm not decided on whether I prefer to die because of an ML model going off the rails, or an old fashioned coding error like the infamous Therac-25. I've seen code for medical software and I'm not optimistic either way.

                          Frankly, I prefer doctors who don't Google my symptoms during a GP visit, but I'm afraid that is an art that's dying out.

#13 tsturm@famichiki.jp wrote:

                            @atax1a We live in the stupidest timeline.

#14 energisch_@troet.cafe wrote:

                              @atax1a 🤯

#15 christianriegel@digitalcourage.social wrote:

                                @atax1a @bertdriehuis

                                Thank you! If I'm correct, it's like this:

                                - I support algorithms. Vital automation.
                                - Neural nets are - if well trained - tested and efficient algorithms.
                                - Machine learning is a neural net training itself. Now I'm getting sceptical, demand testing and would hope it's not left unattended to "continue learning".

                                But all of the above is targeted at maximizing correctness of answers!!

- Then there are the LLMs: targeted at maximizing plausibility! SOUNDING correct is the goal.
Oh and here's reddit, learn that. - _-

#16 ashmire@pagan.plus wrote:

                                  @atax1a Yikes. 😬

#17 ozzelot@mstdn.social wrote:

                                    @atax1a
                                    Oh... oh fuck.

#18 drgroftehauge@sigmoid.social wrote:

                                      @bertdriehuis @atax1a Oh, there are absolutely terrible ML implementations out there. Social workers in Denmark got a tool that was supposed to assist them in deciding whether a child should be removed from the home. The strongest feature by far was age of the child (because removing a child is the last thing you try). It was a less than useless linear model.

#19 wellsitegeo@masto.ai wrote:

                                        @atax1a
                                        Yes. You *are* screaming.

                                        There is a good reason that, after opening the skull under anaesthetic, the patient is normally awake and talking to the surgical team as they hack away inside the brain. And you've just, uh, put your scalpel on it.

                                        Generally the surgeon will be prodding and poking a particular place to cut, *before* cutting, so they can evaluate the effects on the *person* in the wet electric fat. If something produces odd effects, they look for a way around it.

#20 bertdriehuis@infosec.exchange wrote:

                                          @drgroftehauge @atax1a there are tons of bad models out there, that's a fact. ML is an opaque tool. But an ML model is easier to validate independently. Biases can be shown, and results are reproducible within statistical limits. It is as much a science as statistics are, and those are equally abused in the domain you refer to.

The Netherlands by the way also has its fair share of problematic algorithms based on ML in the social domain. The biggest issue is not ML itself, but the lack of openness and independent validation. If the algorithm were written in a traditional programming language the result would not have been different (and we also have failed examples of those in our government's recent past).
