<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y.]]></title><description><![CDATA[<p>I was only made aware of this (frankly awesome) case of LLM poisoning today: <a href="https://www.nature.com/articles/d41586-026-01100-y" rel="nofollow noopener"><span>https://www.</span><span>nature.com/articles/d41586-026</span><span>-01100-y</span></a>. A researcher made up a disease and published two evidently fake preprints about it (including sentences such as “this entire paper is made up” and “Fifty made-up individuals aged between 20 and 50 years were recruited for the exposure group”), which were almost immediately picked up by LLMs and documented in their output. Worse, actual – supposedly serious – medical papers also started citing the preprints, demonstrating that academics relying on LLMs to do their work is a genuine problem! Not that I had any doubts but, if anyone did, this seems like the perfect demonstration of the problem. 
Article immediately added to the syllabus of the class I am co-teaching with Iris Ferrazzo on LLMs for Romance Studies/Humanities!</p><p><a href="https://fediscience.org/tags/LLM" rel="tag">#<span>LLM</span></a> <a href="https://fediscience.org/tags/GenAI" rel="tag">#<span>GenAI</span></a> <a href="https://fediscience.org/tags/academia" rel="tag">#<span>academia</span></a> <a href="https://fediscience.org/tags/research" rel="tag">#<span>research</span></a> <a href="https://fediscience.org/tags/ResearchIntegrity" rel="tag">#<span>ResearchIntegrity</span></a> <a href="https://fediscience.org/tags/humanities" rel="tag">#<span>humanities</span></a></p>]]></description><link>https://board.circlewithadot.net/topic/174e9258-6d87-414a-9e53-e6173be9415e/i-was-only-made-aware-of-this-frankly-awesome-case-of-llm-poisoning-today-https-www.nature.com-articles-d41586-026-01100-y.</link><generator>RSS for Node</generator><lastBuildDate>Fri, 15 May 2026 01:48:46 GMT</lastBuildDate><atom:link href="https://board.circlewithadot.net/topic/174e9258-6d87-414a-9e53-e6173be9415e.rss" rel="self" type="application/rss+xml"/><pubDate>Wed, 13 May 2026 06:24:08 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 16:27:48 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> Unfortunately it’s impossible to test this now that the veracity of the study has been revealed.</p>

<div class="row mt-3"><div class="col-12 mt-3"><img class="img-thumbnail" src="https://files.mastodon.social/media_attachments/files/116/568/206/032/435/223/original/48512c4ce01ac46f.jpeg" alt="Link Preview Image" /></div></div>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/com/statuses/116568206116314496</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/com/statuses/116568206116314496</guid><dc:creator><![CDATA[com@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 16:27:48 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 16:24:14 GMT]]></title><description><![CDATA[<p><span><a href="/user/pineywoozle%40masto.ai">@<span>Pineywoozle</span></a></span> <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> what a fun piece of work really, to write made-up scientific articles with all the silly things you want!</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/markiejiang/statuses/116568192087418124</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/markiejiang/statuses/116568192087418124</guid><dc:creator><![CDATA[markiejiang@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 16:24:14 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 16:14:30 GMT]]></title><description><![CDATA[<p><span><a href="/user/nev%40flipping.rocks">@<span>nev</span></a></span> <span><a href="/user/bms48%40mastodon.social">@<span>bms48</span></a></span> <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> </p><p>That's an amazing story and I'll include Llull in my "History of AI/LLM" lecture! Thanks!</p>]]></description><link>https://board.circlewithadot.net/post/https://beige.party/users/mycotropic/statuses/116568153821081881</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://beige.party/users/mycotropic/statuses/116568153821081881</guid><dc:creator><![CDATA[mycotropic@beige.party]]></dc:creator><pubDate>Wed, 13 May 2026 16:14:30 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 15:32:39 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> The whole thing is like an oil spill.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/christianschwaegerl/statuses/116567989261207712</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/christianschwaegerl/statuses/116567989261207712</guid><dc:creator><![CDATA[christianschwaegerl@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 15:32:39 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 15:07:24 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> </p><p>It seems like the real underlying problem is the "publish or perish" syndrome, where the value of a researcher is based on how many papers they write, or how often they're referenced.</p><p>So there's a proliferation of papers, many of which are meaningless, and which no one has time to actually read, being referenced in other papers by other researchers who don't have time to read and evaluate all these other papers.</p>]]></description><link>https://board.circlewithadot.net/post/https://fosstodon.org/users/number6/statuses/116567889951636532</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://fosstodon.org/users/number6/statuses/116567889951636532</guid><dc:creator><![CDATA[number6@fosstodon.org]]></dc:creator><pubDate>Wed, 13 May 2026 15:07:24 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 15:03:12 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> i *like it a lot* when people teach AI... wrong stuff. Piss in the well at every opportunity, folks.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.sdf.org/users/the_turtle/statuses/116567873483129178</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.sdf.org/users/the_turtle/statuses/116567873483129178</guid><dc:creator><![CDATA[the_turtle@mastodon.sdf.org]]></dc:creator><pubDate>Wed, 13 May 2026 15:03:12 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 14:53:07 GMT]]></title><description><![CDATA[<p><span><a href="/user/meneerdebruin%40mastodon.nl">@<span>MeneerDeBruin</span></a></span> As a professional 100% human writer, I'm indeed interested in how we could use our creativity for that goal! <img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f601.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--grin" style="height:23px;width:auto;vertical-align:middle" title="😁" alt="😁" />  <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span></p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/NatureMC/statuses/116567833816450131</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/NatureMC/statuses/116567833816450131</guid><dc:creator><![CDATA[naturemc@mastodon.online]]></dc:creator><pubDate>Wed, 13 May 2026 14:53:07 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 14:52:50 GMT]]></title><description><![CDATA[<p><span><a href="/user/mycotropic%40beige.party">@<span>mycotropic</span></a></span> <span><a href="/user/bms48%40mastodon.social">@<span>bms48</span></a></span> <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> shout-out to Ramon Llull: <a href="http://www.computer-timeline.com/timeline/ramon-llull/" rel="nofollow noopener"><span>http://www.</span><span>computer-timeline.com/timeline</span><span>/ramon-llull/</span></a></p>]]></description><link>https://board.circlewithadot.net/post/https://flipping.rocks/users/nev/statuses/116567832709906763</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://flipping.rocks/users/nev/statuses/116567832709906763</guid><dc:creator><![CDATA[nev@flipping.rocks]]></dc:creator><pubDate>Wed, 13 May 2026 14:52:50 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 14:49:39 GMT]]></title><description><![CDATA[<p><span><a href="/user/pineywoozle%40masto.ai">@<span>Pineywoozle</span></a></span> Yes, big cinema ! 
<img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f923.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--rolling_on_the_floor_laughing" style="height:23px;width:auto;vertical-align:middle" title="🤣" alt="🤣" /><img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f37f.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--popcorn" style="height:23px;width:auto;vertical-align:middle" title="🍿" alt="🍿" />  <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span></p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/NatureMC/statuses/116567820145045856</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/NatureMC/statuses/116567820145045856</guid><dc:creator><![CDATA[naturemc@mastodon.online]]></dc:creator><pubDate>Wed, 13 May 2026 14:49:39 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 14:38:24 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> You might enjoy my recent experience with raclette maximalism.</p><p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">
<a href="https://exple.tive.org/blarg/2026/05/02/lies-damned-lies-and-stochastics/" title="Lies, Damned Lies And Stochastics | blarg">
<img src="https://live.staticflickr.com/65535/55244712641_da5983bfd9_o.png" class="card-img-top not-responsive" style="max-height: 15rem;" alt="Link Preview Image" />
</a>
<div class="card-body">
<h5 class="card-title">
<a href="https://exple.tive.org/blarg/2026/05/02/lies-damned-lies-and-stochastics/">
Lies, Damned Lies And Stochastics | blarg
</a>
</h5>
<p class="card-text line-clamp-3"></p>
</div>
<a href="https://exple.tive.org/blarg/2026/05/02/lies-damned-lies-and-stochastics/" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">
<img src="https://exple.tive.org/favicon.ico" alt="favicon" class="not-responsive overflow-hiddden" style="max-width: 21px; max-height: 21px;" />
<p class="d-inline-block text-truncate mb-0"> <span class="text-secondary">(exple.tive.org)</span></p>
</a>
</div></p>]]></description><link>https://board.circlewithadot.net/post/https://cosocial.ca/users/mhoye/statuses/116567775927334612</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://cosocial.ca/users/mhoye/statuses/116567775927334612</guid><dc:creator><![CDATA[mhoye@cosocial.ca]]></dc:creator><pubDate>Wed, 13 May 2026 14:38:24 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 14:33:56 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> </p><p>as a scientist, you are surely aware of bias</p><p>such as the bias on social media, where if AI does something bad, it gets widely disseminated (goes viral) </p><p>whereas if AI does something good, no one talks about it</p><p>also,<br />PSA<br />when I was a baby PhD student in 1985, my teachers warned me over and over, don't trust something just cause it is published in a peer reviewed journal<br />be careful of all that you read !!!!!</p>]]></description><link>https://board.circlewithadot.net/post/https://mas.to/users/failedLyndonLaRouchite/statuses/116567758340046320</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mas.to/users/failedLyndonLaRouchite/statuses/116567758340046320</guid><dc:creator><![CDATA[failedlyndonlarouchite@mas.to]]></dc:creator><pubDate>Wed, 13 May 2026 14:33:56 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 14:28:31 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> I'm convinced one of the main reasons we die is that we get too old or too sick to manage our own healthcare. We can't do the research to find the right studies, we can't read or understand those studies, and we can't question our providers to be sure they actually understand what's wrong with us.</p><p>Once we are dependent on mere employees, the quality of care goes way down, and mistakes get made, or our treatment is just ineffective.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/agreeable_landfall/statuses/116567737096766785</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/agreeable_landfall/statuses/116567737096766785</guid><dc:creator><![CDATA[agreeable_landfall@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 14:28:31 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 14:11:59 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> </p><p>That's something the government does every few years already. 
Nothing new.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/Mage_of_Chaos/statuses/116567672041324292</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/Mage_of_Chaos/statuses/116567672041324292</guid><dc:creator><![CDATA[mage_of_chaos@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 14:11:59 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 14:10:36 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> </p><p>This is a rerun of the Sokal Hoax.</p><p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">
<a href="https://en.wikipedia.org/wiki/Sokal_affair" title="Sokal affair - Wikipedia">
<img src="https://en.wikipedia.org/static/images/icons/enwiki-25.svg" class="card-img-top not-responsive" style="max-height: 15rem;" alt="Link Preview Image" />
</a>
<div class="card-body">
<h5 class="card-title">
<a href="https://en.wikipedia.org/wiki/Sokal_affair">
Sokal affair - Wikipedia
</a>
</h5>
<p class="card-text line-clamp-3"></p>
</div>
<a href="https://en.wikipedia.org/wiki/Sokal_affair" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">
<p class="d-inline-block text-truncate mb-0"> <span class="text-secondary">(en.wikipedia.org)</span></p>
</a>
</div></p><p>Like some here, some folks thought the Sokal hoax was ethically problematic, but the postmodernists needed a wake up call, as do the LLM fans.</p><p>Really: the LLM idea is the stupidest* thing to come out of Computer Science ever. We need to be embarrassed.</p><p>*: Unnecessary explanation: the idea that random text generation has something to do with intelligence is really really stupid.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.mit.edu/users/djl/statuses/116567666642025351</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.mit.edu/users/djl/statuses/116567666642025351</guid><dc:creator><![CDATA[djl@mastodon.mit.edu]]></dc:creator><pubDate>Wed, 13 May 2026 14:10:36 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 13:44:17 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> I'm sorry, they put up fake preprints and then said other researchers citing these preprints are the problem? Standing up a fake preprint is absurdly unethical</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/grimalkina/statuses/116567563144276971</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/grimalkina/statuses/116567563144276971</guid><dc:creator><![CDATA[grimalkina@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 13:44:17 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 13:40:17 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> LLMs, like seagulls, swallow anything that's thrown at them. It's known as 'gullibility' for a good reason.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/villavelius/statuses/116567547427193392</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/villavelius/statuses/116567547427193392</guid><dc:creator><![CDATA[villavelius@mastodon.online]]></dc:creator><pubDate>Wed, 13 May 2026 13:40:17 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 13:12:48 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span>  </p><p>The obvious end goal of AI is centralized control of information that can be used to bend public opinion, win elections for pedophiles, criminals and set trends.</p><p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">
<div class="card-body">
<h5 class="card-title">
<a href="https://cybernews.com/ai-news/gpt-grok-push-paid-results-flights/">
Attention Required! | Cloudflare
</a>
</h5>
<p class="card-text line-clamp-3"></p>
</div>
<a href="https://cybernews.com/ai-news/gpt-grok-push-paid-results-flights/" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">
<img src="https://cybernews.com/favicon.ico" alt="favicon" class="not-responsive overflow-hiddden" style="max-width: 21px; max-height: 21px;" />
<p class="d-inline-block text-truncate mb-0"> <span class="text-secondary">(cybernews.com)</span></p>
</a>
</div></p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/Savvyhomestead/statuses/116567439333633234</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/Savvyhomestead/statuses/116567439333633234</guid><dc:creator><![CDATA[savvyhomestead@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 13:12:48 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 13:10:19 GMT]]></title><description><![CDATA[<p>boost with CN: "AI"</p>]]></description><link>https://board.circlewithadot.net/post/https://weirder.earth/users/j12i/statuses/116567429576806100</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://weirder.earth/users/j12i/statuses/116567429576806100</guid><dc:creator><![CDATA[j12i@weirder.earth]]></dc:creator><pubDate>Wed, 13 May 2026 13:10:19 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 12:59:50 GMT]]></title><description><![CDATA[<p><span><a href="https://aus.social/@bencourtice">@<span>bencourtice</span></a></span> <span><a href="https://social.chinwag.org/@FediThing">@<span>FediThing</span></a></span> <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> </p><p>We have a whole module on how to cite and also how AI works (ethics, errors, foundational knowledge, all that) for our freshman public health students. 
We STILL had over 10% academic affairs referrals for using hidden prompts and hallucinated citations and that's with a requirement that they give us annotated PDF copies of every cited paper.</p>]]></description><link>https://board.circlewithadot.net/post/https://beige.party/users/mycotropic/statuses/116567388360000054</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://beige.party/users/mycotropic/statuses/116567388360000054</guid><dc:creator><![CDATA[mycotropic@beige.party]]></dc:creator><pubDate>Wed, 13 May 2026 12:59:50 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 12:54:53 GMT]]></title><description><![CDATA[<p><span><a href="/user/bms48%40mastodon.social">@<span>bms48</span></a></span> <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> </p><p>Jonathan Swift described The Machine (for writing) in 1726; <a href="https://en.wikipedia.org/wiki/The_Engine" rel="nofollow noopener"><span>https://</span><span>en.wikipedia.org/wiki/The_Engi</span><span>ne</span></a></p>]]></description><link>https://board.circlewithadot.net/post/https://beige.party/users/mycotropic/statuses/116567368878141503</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://beige.party/users/mycotropic/statuses/116567368878141503</guid><dc:creator><![CDATA[mycotropic@beige.party]]></dc:creator><pubDate>Wed, 13 May 2026 12:54:53 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 12:51:09 GMT]]></title><description><![CDATA[<p><span><a href="/user/mkljczk%40pl.fediverse.pl">@<span>mkljczk</span></a></span> <span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> <br />Good point.</p><p>Google, if Gemini is as useful as you hope it will be, it is inevitable that it will just come to be known as "Google" and AI answers to direct questions is just a feature of Google search.</p><p>Google, if Gemini is unreliable and can not be reliable, why are you letting it tarnish your brand?</p>]]></description><link>https://board.circlewithadot.net/post/https://mstdn.social/users/Urban_Hermit/statuses/116567354237961358</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mstdn.social/users/Urban_Hermit/statuses/116567354237961358</guid><dc:creator><![CDATA[urban_hermit@mstdn.social]]></dc:creator><pubDate>Wed, 13 May 2026 12:51:09 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 12:45:33 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span>   </p><p>Some do it, DELIBERATELY with a lot of other subjects too.  
AI is amoral especially if the material it is fed is amoral and we know a lot of those tech guys and politicians ARE amoral.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/Savvyhomestead/statuses/116567332207219645</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/Savvyhomestead/statuses/116567332207219645</guid><dc:creator><![CDATA[savvyhomestead@mastodon.social]]></dc:creator><pubDate>Wed, 13 May 2026 12:45:33 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. on Wed, 13 May 2026 12:45:32 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> <span><a href="/user/kunev%40blewsky.social">@<span>kunev</span></a></span> <br /><img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f602.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--joy" style="height:23px;width:auto;vertical-align:middle" title="😂" alt="😂" /><br /><img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f3af.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--dart" style="height:23px;width:auto;vertical-align:middle" title="🎯" alt="🎯" /></p>]]></description><link>https://board.circlewithadot.net/post/https://piaille.fr/users/OrangeR/statuses/116567332110571620</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://piaille.fr/users/OrangeR/statuses/116567332110571620</guid><dc:creator><![CDATA[oranger@piaille.fr]]></dc:creator><pubDate>Wed, 13 May 2026 12:45:32 GMT</pubDate></item><item><title><![CDATA[Reply to I was only made aware of this (frankly awesome) case of LLM poisoning today: https:&#x2F;&#x2F;www.nature.com&#x2F;articles&#x2F;d41586-026-01100-y. 
on Wed, 13 May 2026 12:20:08 GMT]]></title><description><![CDATA[<p><span><a href="/user/elenlefoll%40fediscience.org">@<span>ElenLeFoll</span></a></span> that iMac is approaching his 30s. he's earned the right to be a hypochondriac sometimes.</p>]]></description><link>https://board.circlewithadot.net/post/https://social.cologne/users/kolya/statuses/116567232223900149</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://social.cologne/users/kolya/statuses/116567232223900149</guid><dc:creator><![CDATA[kolya@social.cologne]]></dc:creator><pubDate>Wed, 13 May 2026 12:20:08 GMT</pubDate></item></channel></rss>