<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[My manager is going to open a team discussion about #AI &#x2F; #LLM use for software development.]]></title><description><![CDATA[<p>My manager is going to open a team discussion about <a href="https://floss.social/tags/AI" rel="tag">#<span>AI</span></a> / <a href="https://floss.social/tags/LLM" rel="tag">#<span>LLM</span></a> use for software development. Much of the company uses <a href="https://floss.social/tags/GitHubCopilot" rel="tag">#<span>GitHubCopilot</span></a> or <a href="https://floss.social/tags/Claude" rel="tag">#<span>Claude</span></a> and <a href="https://floss.social/tags/ClaudeCode" rel="tag">#<span>ClaudeCode</span></a>, while AFAIK I am the only one refusing to use these tools.</p><p>What are some recent studies I can reference about the harms of LLM use? 
I know there's stuff about deskilling, environmental harms, and code quality, but I won't have a chance to research specific studies until the end of my work day.</p><p><a href="https://floss.social/tags/askFedi" rel="tag">#<span>askFedi</span></a> <a href="https://floss.social/tags/NoAI" rel="tag">#<span>NoAI</span></a> <a href="https://floss.social/tags/pleaseBoost" rel="tag">#<span>pleaseBoost</span></a></p>]]></description><link>https://board.circlewithadot.net/topic/0f9f8050-7f30-4de8-8ff8-80a6cac6ca62/my-manager-is-going-to-open-a-team-discussion-about-ai-llm-use-for-software-development.</link><generator>RSS for Node</generator><lastBuildDate>Thu, 14 May 2026 23:25:05 GMT</lastBuildDate><atom:link href="https://board.circlewithadot.net/topic/0f9f8050-7f30-4de8-8ff8-80a6cac6ca62.rss" rel="self" type="application/rss+xml"/><pubDate>Thu, 14 May 2026 19:07:17 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to My manager is going to open a team discussion about #AI &#x2F; #LLM use for software development. on Thu, 14 May 2026 19:43:37 GMT]]></title><description><![CDATA[<p><span><a href="/user/shadow53%40floss.social">@<span>shadow53</span></a></span> </p><p></p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">



<a href="https://ai-project-website.github.io/AI-assistance-reduces-persistence/" title="AI Assistance Reduces Persistence and Hurts Independent Performance">
<img src="https://ai-project-website.github.io/AI-assistance-reduces-persistence/static/images/exp_design_figure.png" class="card-img-top not-responsive" style="max-height:15rem" alt="Link Preview Image" />
</a>



<div class="card-body">
<h5 class="card-title">
<a href="https://ai-project-website.github.io/AI-assistance-reduces-persistence/">
AI Assistance Reduces Persistence and Hurts Independent Performance
</a>
</h5>
<p class="card-text line-clamp-3">AI Assistance Reduces Persistence and Hurts Independent Performance</p>
</div>
<a href="https://ai-project-website.github.io/AI-assistance-reduces-persistence/" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">



<img src="https://ai-project-website.github.io/AI-assistance-reduces-persistence/static/images/icon.png" alt="favicon" class="not-responsive overflow-hiddden" style="max-width:21px;max-height:21px" />



<p class="d-inline-block text-truncate mb-0"> <span class="text-secondary">(ai-project-website.github.io)</span></p>
</a>
</div><p></p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/mastodonmigration/statuses/116574638390114969</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/mastodonmigration/statuses/116574638390114969</guid><dc:creator><![CDATA[mastodonmigration@mastodon.online]]></dc:creator><pubDate>Thu, 14 May 2026 19:43:37 GMT</pubDate></item><item><title><![CDATA[Reply to My manager is going to open a team discussion about #AI &#x2F; #LLM use for software development. on Thu, 14 May 2026 19:43:03 GMT]]></title><description><![CDATA[<p><span><a href="/user/shadow53%40floss.social">@<span>shadow53</span></a></span> <a href="https://www.anthropic.com/research/AI-assistance-coding-skills" rel="nofollow noopener"><span>https://www.</span><span>anthropic.com/research/AI-assi</span><span>stance-coding-skills</span></a></p>]]></description><link>https://board.circlewithadot.net/post/https://social.treehouse.systems/users/fundamental/statuses/116574636203371979</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://social.treehouse.systems/users/fundamental/statuses/116574636203371979</guid><dc:creator><![CDATA[fundamental@social.treehouse.systems]]></dc:creator><pubDate>Thu, 14 May 2026 19:43:03 GMT</pubDate></item><item><title><![CDATA[Reply to My manager is going to open a team discussion about #AI &#x2F; #LLM use for software development. 
on Thu, 14 May 2026 19:36:03 GMT]]></title><description><![CDATA[<p><span><a href="/user/shadow53%40floss.social">@<span>shadow53</span></a></span> cc: <span><a href="/user/baldur%40toot.cafe">@<span>baldur</span></a></span> </p><p>He has <a href="https://needtoknow.fyi/" rel="nofollow noopener"><span>https://</span><span>needtoknow.fyi/</span><span></span></a> which might be handy.</p>]]></description><link>https://board.circlewithadot.net/post/https://floss.social/users/alcinnz/statuses/116574608640237106</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://floss.social/users/alcinnz/statuses/116574608640237106</guid><dc:creator><![CDATA[alcinnz@floss.social]]></dc:creator><pubDate>Thu, 14 May 2026 19:36:03 GMT</pubDate></item><item><title><![CDATA[Reply to My manager is going to open a team discussion about #AI &#x2F; #LLM use for software development. on Thu, 14 May 2026 19:26:11 GMT]]></title><description><![CDATA[<p><span><a href="/user/shadow53%40floss.social">@<span>shadow53</span></a></span> Then there is the cost risk: all of these providers are currently operating at a large loss (<a href="https://www.wheresyoured.at/premium-ais-circular-psychosis/" rel="nofollow noopener"><span>https://www.</span><span>wheresyoured.at/premium-ais-ci</span><span>rcular-psychosis/</span></a>). This means that in the very near future you will likely pay 100-1000x more.
On top of that, you will spend even more money, because the more code you generate, the more tokens you will have to spend in the future for AI to go through that code.<br />The cost risk is real: <a href="https://finance.yahoo.com/sectors/technology/articles/ubers-anthropic-ai-push-hits-223109852.html" rel="nofollow noopener"><span>https://</span><span>finance.yahoo.com/sectors/tech</span><span>nology/articles/ubers-anthropic-ai-push-hits-223109852.html</span></a></p>]]></description><link>https://board.circlewithadot.net/post/https://social.anoxinon.de/ap/users/115989705188194340/statuses/116574569844151470</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://social.anoxinon.de/ap/users/115989705188194340/statuses/116574569844151470</guid><dc:creator><![CDATA[jornfranke@social.anoxinon.de]]></dc:creator><pubDate>Thu, 14 May 2026 19:26:11 GMT</pubDate></item><item><title><![CDATA[Reply to My manager is going to open a team discussion about #AI &#x2F; #LLM use for software development. on Thu, 14 May 2026 19:20:41 GMT]]></title><description><![CDATA[<p><span><a href="/user/shadow53%40floss.social">@<span>shadow53</span></a></span> I would ask them to measure whether it actually brings improvement, and to add further validation (unit testing, static code analysis, fuzzing, etc.) to ensure it does not decrease quality. Static code analyzers also give you statistics on duplicated code and code complexity; these can be indicators that things are going wrong.
Additionally, you can track the number of incidents (if the LLMs actually work, that number should go down).</p>]]></description><link>https://board.circlewithadot.net/post/https://social.anoxinon.de/ap/users/115989705188194340/statuses/116574548262134082</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://social.anoxinon.de/ap/users/115989705188194340/statuses/116574548262134082</guid><dc:creator><![CDATA[jornfranke@social.anoxinon.de]]></dc:creator><pubDate>Thu, 14 May 2026 19:20:41 GMT</pubDate></item></channel></rss>