<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs.]]></title><description><![CDATA[<p>I completely understand the position of people who don't want to use LLMs or consume any content produced with LLMs. I do not understand the position of "NO ONE should use LLMs at all" because how are you planning to make that happen? no one should be *forced* to use them, but plenty of people are using them now. it's not something you can wish away or achieve via moral condemnation.</p>]]></description><link>https://board.circlewithadot.net/topic/1346e26d-03f7-4cc6-8bc2-b7fb3275550b/i-completely-understand-the-position-of-people-who-don-t-want-to-use-llms-or-consume-any-content-produced-with-llms.</link><generator>RSS for Node</generator><lastBuildDate>Fri, 17 Apr 2026 12:21:10 GMT</lastBuildDate><atom:link href="https://board.circlewithadot.net/topic/1346e26d-03f7-4cc6-8bc2-b7fb3275550b.rss" rel="self" type="application/rss+xml"/><pubDate>Thu, 26 Mar 2026 21:36:48 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs. on Thu, 26 Mar 2026 21:44:23 GMT]]></title><description><![CDATA[<p><span><a href="/user/lzg%40mastodon.social">@<span>lzg</span></a></span> </p><p>It supposedly causes brain rot, so isn't that a good thing? 
</p><p>I am morally superior and also very intelligent.</p><p>Frankly what I don't like about it is the people that use them confronting you with the slop that you then have to untangle; they are not using it on a remote atoll but right in your face, so it can be difficult to ignore.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/hnapel/statuses/116297660047359269</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/hnapel/statuses/116297660047359269</guid><dc:creator><![CDATA[hnapel@mastodon.social]]></dc:creator><pubDate>Thu, 26 Mar 2026 21:44:23 GMT</pubDate></item><item><title><![CDATA[Reply to I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs. on Thu, 26 Mar 2026 21:42:15 GMT]]></title><description><![CDATA[<p><span><a href="/user/lzg%40mastodon.social">@<span>lzg</span></a></span> that is: I do believe in that position, but I also accept that I live in the real world where it cannot be enforced in general and I've gotta pick my battles blah blah</p>]]></description><link>https://board.circlewithadot.net/post/https://hachyderm.io/users/SnoopJ/statuses/116297651668310615</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://hachyderm.io/users/SnoopJ/statuses/116297651668310615</guid><dc:creator><![CDATA[snoopj@hachyderm.io]]></dc:creator><pubDate>Thu, 26 Mar 2026 21:42:15 GMT</pubDate></item><item><title><![CDATA[Reply to I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs. on Thu, 26 Mar 2026 21:42:03 GMT]]></title><description><![CDATA[<p><span><a href="/user/anildash%40me.dm">@<span>anildash</span></a></span> <span><a href="/user/lzg%40mastodon.social">@<span>lzg</span></a></span> so what does harm reduction look like? 
What's a needle exchange or methadone for LLMs?</p>]]></description><link>https://board.circlewithadot.net/post/https://orbital.horse/users/emma/statuses/116297650889652242</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://orbital.horse/users/emma/statuses/116297650889652242</guid><dc:creator><![CDATA[emma@orbital.horse]]></dc:creator><pubDate>Thu, 26 Mar 2026 21:42:03 GMT</pubDate></item><item><title><![CDATA[Reply to I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs. on Thu, 26 Mar 2026 21:41:18 GMT]]></title><description><![CDATA[<p><span><a href="/user/lzg%40mastodon.social">@<span>lzg</span></a></span> I do distinguish between "nobody should use LLMs [in general]" and "nobody should use LLMs [in this specific context]" because in the latter case, there's usually room for "we'll make that policy explicit and enforce it if someone violates the trust it requires"</p><p>the former is understandable but I just route around it. I'm angry enough on my own, I don't really need the feckless ravings of others piling on</p>]]></description><link>https://board.circlewithadot.net/post/https://hachyderm.io/users/SnoopJ/statuses/116297647957541599</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://hachyderm.io/users/SnoopJ/statuses/116297647957541599</guid><dc:creator><![CDATA[snoopj@hachyderm.io]]></dc:creator><pubDate>Thu, 26 Mar 2026 21:41:18 GMT</pubDate></item><item><title><![CDATA[Reply to I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs. on Thu, 26 Mar 2026 21:40:22 GMT]]></title><description><![CDATA[<p><span><a href="/user/lzg%40mastodon.social">@<span>lzg</span></a></span> my issue is, even if you feel that way… what’s the plan? This is the stance that failed with social media, failed with ride-sharing apps, failed with crypto. 
Even if critics were morally right to say “nobody should ever use this”, they didn’t succeed in harm reduction. And that has to matter more than smugly being “right” when the stakes are this high.</p>]]></description><link>https://board.circlewithadot.net/post/https://me.dm/users/anildash/statuses/116297644261228637</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://me.dm/users/anildash/statuses/116297644261228637</guid><dc:creator><![CDATA[anildash@me.dm]]></dc:creator><pubDate>Thu, 26 Mar 2026 21:40:22 GMT</pubDate></item><item><title><![CDATA[Reply to I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs. on Thu, 26 Mar 2026 21:39:25 GMT]]></title><description><![CDATA[<p><span><a href="/user/lzg%40mastodon.social">@<span>lzg</span></a></span> Sabotage!</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/rasterweb/statuses/116297640553055208</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/rasterweb/statuses/116297640553055208</guid><dc:creator><![CDATA[rasterweb@mastodon.social]]></dc:creator><pubDate>Thu, 26 Mar 2026 21:39:25 GMT</pubDate></item><item><title><![CDATA[Reply to I completely understand the position of people who don&#x27;t want to use LLMs or consume any content produced with LLMs. on Thu, 26 Mar 2026 21:38:48 GMT]]></title><description><![CDATA[<p>i just want to know what the theory of change is, beyond being really angry at the whole thing</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/lzg/statuses/116297638112417884</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/lzg/statuses/116297638112417884</guid><dc:creator><![CDATA[lzg@mastodon.social]]></dc:creator><pubDate>Thu, 26 Mar 2026 21:38:48 GMT</pubDate></item></channel></rss>