I was only made aware of this (frankly awesome) case of LLM poisoning today: https://www.nature.com/articles/d41586-026-01100-y.

Uncategorized
Tags: #LLM #GenAI #academia #research #ResearchIntegrity
37 Posts 32 Posters 0 Views
This topic has been deleted. Only users with topic management privileges can see it.
elenlefoll@fediscience.org

    I was only made aware of this (frankly awesome) case of LLM poisoning today: https://www.nature.com/articles/d41586-026-01100-y. A researcher made up a disease and published two evidently fake preprints about it (including sentences such as “this entire paper is made up” and “Fifty made-up individuals aged between 20 and 50 years were recruited for the exposure group”), which were almost immediately picked up by LLMs and documented in their output. Worse, actual – supposedly serious – medical papers also started citing the preprints, demonstrating that academics relying on LLMs to do their work is a genuine problem! Not that I had my doubts but, if anyone did, this seems like the perfect demonstration of the problem. Article immediately added to the syllabus of the class I am co-teaching with Iris Ferrazzo on LLMs for Romance Studies/Humanities!

    #LLM #GenAI #academia #research #ResearchIntegrity #humanities

j12i@weirder.earth
#20

boost with CN: "AI"


savvyhomestead@mastodon.social
#21

@ElenLeFoll

The obvious end goal of AI is centralized control of information that can be used to bend public opinion, win elections for pedophiles and criminals, and set trends.

(link: cybernews.com)


villavelius@mastodon.online
#22

@ElenLeFoll LLMs, like seagulls, swallow anything that's thrown at them. It's known as 'gullibility' for a good reason.


grimalkina@mastodon.social
#23

@ElenLeFoll I'm sorry, they put up fake preprints and then said other researchers citing these preprints are the problem? Standing up a fake preprint is absurdly unethical.


djl@mastodon.mit.edu
#24

@ElenLeFoll

This is a rerun of the Sokal hoax.

Sokal affair - Wikipedia (en.wikipedia.org)

Like some here, some folks thought the Sokal hoax was ethically problematic, but the postmodernists needed a wake-up call, as do the LLM fans.

Really: the LLM idea is the stupidest* thing to come out of Computer Science ever. We need to be embarrassed.

*: Unnecessary explanation: the idea that random text generation has something to do with intelligence is really, really stupid.


mage_of_chaos@mastodon.social
#25

@ElenLeFoll

That's something the government does every few years already. Nothing new.


agreeable_landfall@mastodon.social
#26

@ElenLeFoll I'm convinced one of the main reasons we die is that we get too old or too sick to manage our own healthcare. We can't do the research to find the right studies, we can't read or understand those studies, and we can't question our providers to be sure they actually understand what's wrong with us.

Once we are dependent on mere employees, the quality of care goes way down, and mistakes get made, or our treatment is just ineffective.


failedlyndonlarouchite@mas.to
#27

@ElenLeFoll

As a scientist, you are surely aware of bias, such as the bias on social media, where if AI does something bad, it gets widely disseminated (goes viral), whereas if AI does something good, no one talks about it.

Also, a PSA: when I was a baby PhD student in 1985, my teachers warned me over and over not to trust something just because it is published in a peer-reviewed journal. Be careful of all that you read!


mhoye@cosocial.ca
#28

@ElenLeFoll You might enjoy my recent experience with raclette maximalism.

Lies, Damned Lies And Stochastics | blarg (exple.tive.org)

pineywoozle@masto.ai

@ElenLeFoll I love this quote about the original article. “Acknowledgments also thanked ‘Professor Sideshow Bob’ and a professor from the Starfleet Academy for access to a lab aboard the USS Enterprise” 🤣 🤣 🤣 #AI #AiSlop

naturemc@mastodon.online
#29

@Pineywoozle Yes, big cinema! 🤣🍿 @ElenLeFoll

mycotropic@beige.party

@bms48 @ElenLeFoll

Jonathan Swift described The Machine (for writing) in 1726: https://en.wikipedia.org/wiki/The_Engine

nev@flipping.rocks
#30

@mycotropic @bms48 @ElenLeFoll shout-out to Ramon Llull: http://www.computer-timeline.com/timeline/ramon-llull/

meneerdebruin@mastodon.nl

@ElenLeFoll Good. Poison as much as possible, let it eat its own crap and dance while it burns to the ground.

naturemc@mastodon.online
#31

@MeneerDeBruin As a professional 100% human writer, I'm indeed interested in how we could use our creativity for that goal! 😁 @ElenLeFoll


the_turtle@mastodon.sdf.org
#32

@ElenLeFoll i *like it a lot* when people teach AI... wrong stuff. Piss in the well at every opportunity, folks.


number6@fosstodon.org
#33

@ElenLeFoll

It seems like the real underlying problem is the "publish or perish" syndrome, where the value of a researcher is based on how many papers they write, or how often they're referenced.

So there's a proliferation of papers, many of which are meaningless, and which no one has time to actually read, being referenced in other papers by other researchers who don't have time to read and evaluate all these other papers.


christianschwaegerl@mastodon.social
#34

@ElenLeFoll The whole thing is like an oil spill.


mycotropic@beige.party
#35

@nev @bms48 @ElenLeFoll

That's an amazing history and I'll include Llull in my "History of AI/LLM" lecture! Thanks!


markiejiang@mastodon.social
#36

@Pineywoozle @ElenLeFoll what a fun piece of work really, to write made-up scientific articles with all the silly things you want!


com@mastodon.social
#37

@ElenLeFoll Unfortunately it’s impossible to test this now that the veracity of the study has been revealed.
