<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[I take substantial issue with this.]]></title><description><![CDATA[<p class="quote-inline">RE: <a href="https://c.im/@cdarwin/116327241614183410" rel="nofollow noopener"><span>https://</span><span>c.im/@cdarwin/1163272416141834</span><span>10</span></a></p><p>I take substantial issue with this.</p><p>"Psychosis" is something humans develop.</p><p>LLMs cannot develop "psychosis" in any way.</p><p>They are incapable of "hallucinating" either.</p><p>That is improper anthropomorphisation.</p><p>The reality is that LLMs generate inaccuracies, synthesise incorrect data, etc., but these are not "hallucinations."</p><p>People hallucinate. Machines do not.</p>]]></description><link>https://board.circlewithadot.net/topic/a219e4e8-b439-4ee9-95c7-4976b706b5ed/i-take-substantial-issue-with-this.</link><generator>RSS for Node</generator><lastBuildDate>Wed, 08 Apr 2026 13:22:05 GMT</lastBuildDate><atom:link href="https://board.circlewithadot.net/topic/a219e4e8-b439-4ee9-95c7-4976b706b5ed.rss" rel="self" type="application/rss+xml"/><pubDate>Wed, 01 Apr 2026 08:45:07 GMT</pubDate><ttl>60</ttl></channel></rss>