<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[We knew, but the proof is nice.]]></title><description><![CDATA[<p>We knew, but the proof is nice. </p><p>"Apple just proved that AI models cannot do math. Not advanced math. Grade school math. The kind a 10-year-old solves" </p><p>The guess-the-next-words machines don’t actually understand anything. </p><p></p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">

<div class="card-body">
<h5 class="card-title">
<a href="https://nitter.poast.org/heynavtoor/status/2041243558833987600">
Verifying your browser | Nitter
</a>
</h5>
<p class="card-text line-clamp-3"></p>
</div>
<a href="https://nitter.poast.org/heynavtoor/status/2041243558833987600" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">
<img src="https://nitter.poast.org/favicon.ico" alt="favicon" class="not-responsive overflow-hiddden" style="max-width:21px;max-height:21px" />
<p class="d-inline-block text-truncate mb-0"> <span class="text-secondary">(nitter.poast.org)</span></p>
</a>
</div><p></p><p><a href="https://mastodon.online/tags/math" rel="tag">#<span>math</span></a> <a href="https://mastodon.online/tags/ai" rel="tag">#<span>ai</span></a></p>]]></description><link>https://board.circlewithadot.net/topic/6729f050-8780-48ad-a7a9-f189ba5bb6b8/we-knew-but-the-proof-is-nice.</link><generator>RSS for Node</generator><lastBuildDate>Thu, 09 Apr 2026 09:59:46 GMT</lastBuildDate><atom:link href="https://board.circlewithadot.net/topic/6729f050-8780-48ad-a7a9-f189ba5bb6b8.rss" rel="self" type="application/rss+xml"/><pubDate>Tue, 07 Apr 2026 20:03:11 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 20:33:03 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> When did they do this test? I tried it with the following LLMs: Sonnet 4.6, Codex 5.3, GPT-5.4, GPT-5-Mini and Kimi-K2.5. They all answer the kiwi question correctly.</p>]]></description><link>https://board.circlewithadot.net/post/https://mas.to/users/erwinrossen/statuses/116370989629985070</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mas.to/users/erwinrossen/statuses/116370989629985070</guid><dc:creator><![CDATA[erwinrossen@mas.to]]></dc:creator><pubDate>Wed, 08 Apr 2026 20:33:03 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 20:29:06 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> Of course an LLM cannot do math, but to be honest, that is also not what they're designed for. An LLM these days like Claude knows that it should take a calculator and type the equation in there, instead of hallucinating an answer. Complaining that an LLM can't do math is like complaining a screwdriver can't drill a hole. 
</p><p>You can counter that there are plenty of people who are using the screwdriver to drill the hole, but that is not on the tool, that is on the user.</p>]]></description><link>https://board.circlewithadot.net/post/https://mas.to/users/erwinrossen/statuses/116370974080787749</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mas.to/users/erwinrossen/statuses/116370974080787749</guid><dc:creator><![CDATA[erwinrossen@mas.to]]></dc:creator><pubDate>Wed, 08 Apr 2026 20:29:06 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 19:20:19 GMT]]></title><description><![CDATA[<p><span><a href="https://aus.social/@drifthood">@<span>drifthood</span></a></span> <span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> This makes me think of "Clever Hans", the horse that appeared to do arithmetic but actually just responded to involuntary human cues:<br /><a href="https://en.wikipedia.org/wiki/Clever_Hans" rel="nofollow noopener"><span>https://</span><span>en.wikipedia.org/wiki/Clever_H</span><span>ans</span></a></p>]]></description><link>https://board.circlewithadot.net/post/https://androiddev.social/users/bladecoder/statuses/116370703633340298</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://androiddev.social/users/bladecoder/statuses/116370703633340298</guid><dc:creator><![CDATA[bladecoder@androiddev.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 19:20:19 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 18:11:10 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online" rel="nofollow noopener">@<span>davidaugust</span></a></span> </p><p>October 2024</p><p></p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">
<a href="https://www.wired.com/story/apple-ai-llm-reasoning-research/" title="Apple Engineers Show How Flimsy AI ‘Reasoning’ Can Be">
<img src="https://media.wired.com/photos/670eb29ca88ac76b34d258c1/191:100/w_1280,c_limit/apple-llm-fail-biz.jpg" class="card-img-top not-responsive" style="max-height:15rem" alt="Link Preview Image" />
</a>
<div class="card-body">
<h5 class="card-title">
<a href="https://www.wired.com/story/apple-ai-llm-reasoning-research/">
Apple Engineers Show How Flimsy AI ‘Reasoning’ Can Be
</a>
</h5>
<p class="card-text line-clamp-3">The new frontier in large language models is the ability to “reason” their way through problems. New research from Apple says it's not quite what it's cracked up to be.</p>
</div>
<a href="https://www.wired.com/story/apple-ai-llm-reasoning-research/" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">
<img src="https://www.wired.com/verso/static/wired-us/assets/favicon.ico" alt="favicon" class="not-responsive overflow-hiddden" style="max-width:21px;max-height:21px" />
<p class="d-inline-block text-truncate mb-0">WIRED <span class="text-secondary">(www.wired.com)</span></p>
</a>
</div><p></p>]]></description><link>https://board.circlewithadot.net/post/https://infosec.exchange/users/joriki/statuses/116370431723039208</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://infosec.exchange/users/joriki/statuses/116370431723039208</guid><dc:creator><![CDATA[joriki@infosec.exchange]]></dc:creator><pubDate>Wed, 08 Apr 2026 18:11:10 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 17:52:59 GMT]]></title><description><![CDATA[<p><span><a href="/user/flq%40freiburg.social">@<span>flq</span></a></span> yes, many systems have tools and/or abilities built in to take over basic math operations that simpler LLMs failed at.</p><p>The salient and enduring issue, I think, is that the spin and marketing of LLMs as "understanding," "thinking" or "intelligent" (as those words typical meanings suggest) remains largely fictional.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116370360217321541</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116370360217321541</guid><dc:creator><![CDATA[davidaugust@mastodon.online]]></dc:creator><pubDate>Wed, 08 Apr 2026 17:52:59 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 17:50:51 GMT]]></title><description><![CDATA[<p><span><a href="/user/pascal_le_merrer%40mastodon.social">@<span>pascal_le_merrer</span></a></span> any day now. 
I hear potus say in two weeks.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116370351813348209</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116370351813348209</guid><dc:creator><![CDATA[davidaugust@mastodon.online]]></dc:creator><pubDate>Wed, 08 Apr 2026 17:50:51 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 17:49:38 GMT]]></title><description><![CDATA[<p><span><a href="/user/audioflyer79%40mstdn.social">@<span>audioflyer79</span></a></span> <span><a href="/user/alisynthesis%40io.waxandleather.com">@<span>alisynthesis</span></a></span> <span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> how does it do if you swap the colors of the fruit?</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.gamedev.place/users/morten_skaaning/statuses/116370347049501908</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.gamedev.place/users/morten_skaaning/statuses/116370347049501908</guid><dc:creator><![CDATA[morten_skaaning@mastodon.gamedev.place]]></dc:creator><pubDate>Wed, 08 Apr 2026 17:49:38 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. 
on Wed, 08 Apr 2026 15:58:22 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> </p><p>Shortcut to paper: <a href="https://arxiv.org/pdf/2410.05229" rel="nofollow noopener"><span>https://</span><span>arxiv.org/pdf/2410.05229</span><span></span></a></p>]]></description><link>https://board.circlewithadot.net/post/https://fed.qaz.red/users/elithebearded/statuses/116369909531688068</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://fed.qaz.red/users/elithebearded/statuses/116369909531688068</guid><dc:creator><![CDATA[elithebearded@fed.qaz.red]]></dc:creator><pubDate>Wed, 08 Apr 2026 15:58:22 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 15:34:25 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> <span><a href="/user/scottjenson%40social.coop">@<span>scottjenson</span></a></span> <span><a href="/user/xdydx%40mastodon.social">@<span>xdydx</span></a></span> </p><p>True. See <span><a href="/user/xdydx%40mastodon.social">@<span>xdydx</span></a></span> 's reply.</p>]]></description><link>https://board.circlewithadot.net/post/https://hachyderm.io/users/glitzersachen/statuses/116369815356257268</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://hachyderm.io/users/glitzersachen/statuses/116369815356257268</guid><dc:creator><![CDATA[glitzersachen@hachyderm.io]]></dc:creator><pubDate>Wed, 08 Apr 2026 15:34:25 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 14:56:04 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> interesting. Had to ask. Already fixed?</p>
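A one-off "already fixed?" check like this can be scripted into the kind of test the linked paper (GSM-Symbolic, arXiv:2410.05229) actually runs: re-instantiate the same grade-school problem with varied names and numbers, optionally append an irrelevant "no-op" clause, and score answers against computed ground truth. A minimal standard-library sketch of that setup; the template paraphrases the paper's kiwi example, and wiring it to an actual model client is deliberately left out:

```python
import random

# Template in the spirit of GSM-Symbolic: names and numbers vary per
# instance; the "no-op" clause is irrelevant to the correct answer.
TEMPLATE = (
    "{name} picks {fri} kiwis on Friday and {sat} kiwis on Saturday. "
    "On Sunday, {name} picks double the number of kiwis picked on Friday"
    "{noop}. How many kiwis does {name} have?"
)
NOOP = ", but five of them were a bit smaller than average"

def make_variant(with_noop: bool, rng: random.Random) -> tuple[str, int]:
    """Return (question, ground-truth answer) for one random instance."""
    name = rng.choice(["Oliver", "Sam", "Priya"])
    fri, sat = rng.randint(30, 60), rng.randint(30, 60)
    question = TEMPLATE.format(
        name=name, fri=fri, sat=sat, noop=NOOP if with_noop else "")
    return question, fri + sat + 2 * fri  # fruit size never changes the count

def accuracy(answers: list[int], truths: list[int]) -> float:
    """Fraction of model answers that match ground truth."""
    return sum(a == t for a, t in zip(answers, truths)) / len(truths)
```

The paper's headline result is the *gap* between `accuracy` on plain variants and on no-op variants, not any single wrong answer, which is why a single "it answers the kiwi question correctly now" check doesn't settle the question.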

<div class="row mt-3"><div class="col-12 mt-3"><img class="img-thumbnail" src="https://freiburg.social/system/media_attachments/files/116/369/664/267/248/878/original/890bef74834df294.png" alt="Link Preview Image" /></div></div>]]></description><link>https://board.circlewithadot.net/post/https://freiburg.social/users/flq/statuses/116369664569446078</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://freiburg.social/users/flq/statuses/116369664569446078</guid><dc:creator><![CDATA[flq@freiburg.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 14:56:04 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 14:29:00 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> AGI is coming son 🤭</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/pascal_le_merrer/statuses/116369558094072452</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/pascal_le_merrer/statuses/116369558094072452</guid><dc:creator><![CDATA[pascal_le_merrer@mastodon.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 14:29:00 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 13:48:58 GMT]]></title><description><![CDATA[<p><span><a href="/user/karen5lund%40mastodon.social">@<span>Karen5Lund</span></a></span> Maybe because people stopped writing efficient code about 20 years ago?</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/bouriquet/statuses/116369400672671524</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/bouriquet/statuses/116369400672671524</guid><dc:creator><![CDATA[bouriquet@mastodon.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 13:48:58 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. 
on Wed, 08 Apr 2026 13:17:17 GMT]]></title><description><![CDATA[<p><span><a href="/user/lemgandi%40mastodon.social">@<span>lemgandi</span></a></span><br />The wetness of water has been hotly debated, as to some wet means "covered with or soaked in water", and it's questioned whether water is covered with itself.<br /><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span></p>]]></description><link>https://board.circlewithadot.net/post/https://mstdn.social/users/ozzelot/statuses/116369276105517171</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mstdn.social/users/ozzelot/statuses/116369276105517171</guid><dc:creator><![CDATA[ozzelot@mstdn.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 13:17:17 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 13:15:04 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> </p><p></p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">

<div class="card-body">
<h5 class="card-title">
<a href="https://mastodon.me.uk/@pikesley/114705903661841870">
Amo Bishop Rodent (@pikesley@mastodon.me.uk)
</a>
</h5>
<p class="card-text line-clamp-3">"We made the computers, the notoriously accurate calculating machines, worse at arithmetic. This is surely progress along the path to creating Computer God"</p>
</div>
<a href="https://mastodon.me.uk/@pikesley/114705903661841870" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">
<img src="https://mastodon.me.uk/packs/assets/favicon-16x16-74JBPGmr.png" alt="favicon" class="not-responsive overflow-hiddden" style="max-width:21px;max-height:21px" />
<p class="d-inline-block text-truncate mb-0">mastodon.me.uk <span class="text-secondary">(mastodon.me.uk)</span></p>
</a>
</div><p></p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.me.uk/users/pikesley/statuses/116369267373253824</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.me.uk/users/pikesley/statuses/116369267373253824</guid><dc:creator><![CDATA[pikesley@mastodon.me.uk]]></dc:creator><pubDate>Wed, 08 Apr 2026 13:15:04 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 12:55:12 GMT]]></title><description><![CDATA[<p><span><a href="/user/alisynthesis%40io.waxandleather.com">@<span>alisynthesis</span></a></span> <span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> fair enough. I changed up the problem completely and added some reasoning and it did pretty well. It appears to be generating code to solve the math. The only thing it missed is that very unripe bananas are green, not yellow. </p><p>James picks 40 apples on Monday. Then he picks 35 lemons on Tuesday. On Wednesday, he picks half as many bananas as he did apples, but five of them were very unripe. How many yellow fruits does James have?</p>
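The arithmetic in this variant can be checked directly. A short sketch under one defensible reading (my assumption: apples are not yellow, lemons are, and the five very unripe bananas are green):

```python
# Worked answer to the fruit problem above, under one reading:
# apples are not yellow, lemons are, and very unripe bananas are green.
apples = 40                 # picked Monday
lemons = 35                 # picked Tuesday; yellow
bananas = apples // 2       # Wednesday: half as many bananas as apples -> 20
very_unripe = 5             # green, not yellow
yellow_fruits = lemons + (bananas - very_unripe)
print(yellow_fruits)        # -> 50
```

Counting all twenty bananas as yellow, the miss described here, gives 55 instead.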

<div class="row mt-3"><div class="col-12 mt-3"><img class="img-thumbnail" src="https://media.mstdn.social/media_attachments/files/116/369/187/556/691/564/original/7557490c64462f15.png" alt="Link Preview Image" /><img class="img-thumbnail" src="https://media.mstdn.social/media_attachments/files/116/369/188/931/400/281/original/df2678a8ab74c059.png" alt="Link Preview Image" /></div></div>]]></description><link>https://board.circlewithadot.net/post/https://mstdn.social/users/audioflyer79/statuses/116369189293853840</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mstdn.social/users/audioflyer79/statuses/116369189293853840</guid><dc:creator><![CDATA[audioflyer79@mstdn.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 12:55:12 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 12:43:34 GMT]]></title><description><![CDATA[<p><span><a href="/user/audioflyer79%40mstdn.social">@<span>audioflyer79</span></a></span> <span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> I mean, it's worth noting that the LLMs have ingested that paper by now. : /</p>]]></description><link>https://board.circlewithadot.net/post/https://io.waxandleather.com/users/alisynthesis/statuses/116369143527661164</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://io.waxandleather.com/users/alisynthesis/statuses/116369143527661164</guid><dc:creator><![CDATA[alisynthesis@io.waxandleather.com]]></dc:creator><pubDate>Wed, 08 Apr 2026 12:43:34 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 12:31:51 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> Ecosia AI gets it right. It looks like the paper referenced was published in 2025, so the research conducted prior. The models are all much better now. 
I’m no AI apologist, but I think any argument of the form “AI sucks because it’s not good at _____” is on tenuous ground and will be proven wrong as the models continue to improve. @Ecosia</p>

<div class="row mt-3"><div class="col-12 mt-3"><img class="img-thumbnail" src="https://media.mstdn.social/media_attachments/files/116/369/095/249/817/591/original/f34ae6501b3e5f02.png" alt="Link Preview Image" /></div></div>]]></description><link>https://board.circlewithadot.net/post/https://mstdn.social/users/audioflyer79/statuses/116369097458043474</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mstdn.social/users/audioflyer79/statuses/116369097458043474</guid><dc:creator><![CDATA[audioflyer79@mstdn.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 12:31:51 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 12:20:47 GMT]]></title><description><![CDATA[<p><span><a href="/user/joriki%40infosec.exchange">@<span>joriki</span></a></span> it’s from August.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116369053962251543</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116369053962251543</guid><dc:creator><![CDATA[davidaugust@mastodon.online]]></dc:creator><pubDate>Wed, 08 Apr 2026 12:20:47 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 12:16:57 GMT]]></title><description><![CDATA[<p><span><a href="https://aus.social/@drifthood">@<span>drifthood</span></a></span> yes, there does seem to be a threshold over which in some respects only humans cross over to one side. </p><p>I see that sort of begging in a dog. He wants the treat, so instead of just doing the desired behavior the human command is asking for, he tries every response that has ever gotten him a treat until he “unlocks” the treat. 
Humans can and do do this too from time to time, but humans _also_ actually communicate and understand.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116369038868828654</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116369038868828654</guid><dc:creator><![CDATA[davidaugust@mastodon.online]]></dc:creator><pubDate>Wed, 08 Apr 2026 12:16:57 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 12:12:06 GMT]]></title><description><![CDATA[<p><span><a href="/user/sobex%40social.sciences.re">@<span>Sobex</span></a></span> it’s from August.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116369019819477369</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.online/users/davidaugust/statuses/116369019819477369</guid><dc:creator><![CDATA[davidaugust@mastodon.online]]></dc:creator><pubDate>Wed, 08 Apr 2026 12:12:06 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 09:22:07 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> In about 80 years we've gone from a room full of computers the size of refrigerators that were good at crunching numbers but not much else to computers the size of corporate office parks that can draw almost-convincing pictures of people with five fingers (and thumbs, too!) but can't do elementary school math. 
</p><p>And some people call this progress.</p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/Karen5Lund/statuses/116368351371775536</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/Karen5Lund/statuses/116368351371775536</guid><dc:creator><![CDATA[karen5lund@mastodon.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 09:22:07 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 08:22:55 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> Direct link to the paper <a href="https://arxiv.org/pdf/2410.05229" rel="nofollow noopener"><span>https://</span><span>arxiv.org/pdf/2410.05229</span><span></span></a> (presented at ICLR 2025).</p><p>Seems not to be very recent news, then.</p>]]></description><link>https://board.circlewithadot.net/post/https://social.sciences.re/users/Sobex/statuses/116368118601029127</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://social.sciences.re/users/Sobex/statuses/116368118601029127</guid><dc:creator><![CDATA[sobex@social.sciences.re]]></dc:creator><pubDate>Wed, 08 Apr 2026 08:22:55 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 07:53:11 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> Well, there have actually been successes by connecting LLMs to proof assistants and computer algebra programs. 
As this post rightly puts it, the LLM is not capable by itself of performing computations reliably, but it can write commands sent to the computer algebra programs, or proof candidates sent to the proof assistant, which can answer that the proof is incorrect; the process goes on until a correct proof is produced.</p><p>See also uses by pro mathematicians:<br /><a href="https://bsky.app/profile/wildverzweigt.bsky.social/post/3miua4ulxhk2f" rel="nofollow noopener"><span>https://</span><span>bsky.app/profile/wildverzweigt</span><span>.bsky.social/post/3miua4ulxhk2f</span></a></p><p>Also see Terence Tao</p>]]></description><link>https://board.circlewithadot.net/post/https://social.sciences.re/users/MonniauxD/statuses/116368001725262914</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://social.sciences.re/users/MonniauxD/statuses/116368001725262914</guid><dc:creator><![CDATA[monniauxd@social.sciences.re]]></dc:creator><pubDate>Wed, 08 Apr 2026 07:53:11 GMT</pubDate></item><item><title><![CDATA[Reply to We knew, but the proof is nice. on Wed, 08 Apr 2026 02:36:05 GMT]]></title><description><![CDATA[<p><span><a href="/user/davidaugust%40mastodon.online">@<span>davidaugust</span></a></span> <span><a href="/user/glitzersachen%40hachyderm.io">@<span>glitzersachen</span></a></span></p><p>Actually, this particular joke has the attention of quite a few people..</p><p><div class="card col-md-9 col-lg-6 position-relative link-preview p-0">

<div class="card-body">
<h5 class="card-title">
<a href="https://social.coop/@scottjenson/116358195717244835">
Scott Jenson (@scottjenson@social.coop)
</a>
</h5>
<p class="card-text line-clamp-3">OK, this is going even MORE sideways so I need to make a few things clear:
1. I took a complex point and made it poorly
2. My goal was to ask for more inclusiveness
3. I am sickened by what happened to BlackTwitter and I don't want it to recur
4. But I can't speak for BlackTwitter nor should I
5. I apologize to black mastodon users for making such a poor comparison
6. I'm not endorsing "AI Slop"; they were a foil to make my point
7. I'm certainly NOT trying to compare AI bros to Black twitter (but, as I said, I can see how people made that connection. I'm trying to correct that here)</p>
</div>
<a href="https://social.coop/@scottjenson/116358195717244835" class="card-footer text-body-secondary small d-flex gap-2 align-items-center lh-2">
<img src="https://social-coop-media.ams3.cdn.digitaloceanspaces.com/site_uploads/files/000/000/003/16/b7b85a87a7301812.png" alt="favicon" class="not-responsive overflow-hiddden" style="max-width: 21px; max-height: 21px;" />
<p class="d-inline-block text-truncate mb-0">social.coop <span class="text-secondary">(social.coop)</span></p>
</a>
</div></p><p> <span><a href="/user/scottjenson%40social.coop">@<span>scottjenson</span></a></span></p>]]></description><link>https://board.circlewithadot.net/post/https://mastodon.social/users/xdydx/statuses/116366754793331832</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://mastodon.social/users/xdydx/statuses/116366754793331832</guid><dc:creator><![CDATA[xdydx@mastodon.social]]></dc:creator><pubDate>Wed, 08 Apr 2026 02:36:05 GMT</pubDate></item></channel></rss>