<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Going into the rabbithole of testing local LLMs right now.]]></title><description><![CDATA[<p>Going into the rabbithole of testing local LLMs right now. I don't have a dedicated GPU, but 32 GiB of RAM should be enough for anyone.</p><p><a href="https://infosec.exchange/tags/ai" rel="tag">#<span>ai</span></a> <a href="https://infosec.exchange/tags/huggingface" rel="tag">#<span>huggingface</span></a> <a href="https://infosec.exchange/tags/selfhost" rel="tag">#<span>selfhost</span></a> <a href="https://infosec.exchange/tags/localai" rel="tag">#<span>localai</span></a> <a href="https://infosec.exchange/tags/ollama" rel="tag">#<span>ollama</span></a> <a href="https://infosec.exchange/tags/heretic" rel="tag">#<span>heretic</span></a> <a href="https://infosec.exchange/tags/qwen" rel="tag">#<span>qwen</span></a> <a href="https://infosec.exchange/tags/mistral" rel="tag">#<span>mistral</span></a></p>]]></description><link>https://board.circlewithadot.net/topic/fab6c79a-2d1c-4def-ad61-47946437277a/going-into-the-rabbithole-of-testing-local-llms-right-now.</link><generator>RSS for Node</generator><lastBuildDate>Thu, 16 Apr 2026 20:24:54 GMT</lastBuildDate><atom:link href="https://board.circlewithadot.net/topic/fab6c79a-2d1c-4def-ad61-47946437277a.rss" rel="self" type="application/rss+xml"/><pubDate>Thu, 26 Feb 2026 09:52:19 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. on Fri, 27 Feb 2026 18:22:38 GMT]]></title><description><![CDATA[<p><span><a href="/user/tomgag%40infosec.exchange">@<span>tomgag</span></a></span> <br />Good question! 
Why is <a href="https://dresden.network/tags/infomaniak" rel="tag">#<span>infomaniak</span></a> not part of the fediverse?!</p>]]></description><link>https://board.circlewithadot.net/post/https://dresden.network/users/blingblingmk/statuses/116143984398848269</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://dresden.network/users/blingblingmk/statuses/116143984398848269</guid><dc:creator><![CDATA[blingblingmk@dresden.network]]></dc:creator><pubDate>Fri, 27 Feb 2026 18:22:38 GMT</pubDate></item><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. on Thu, 26 Feb 2026 11:01:10 GMT]]></title><description><![CDATA[<p>Interesting, it seems that Qwen 2.5 Coder is actually less aggressive than Qwen 3.5 in rejecting sensitive topics.</p>

<div class="row mt-3"><div class="col-12 mt-3"><img class="img-thumbnail" src="https://media.infosec.exchange/infosec.exchange/media_attachments/files/116/136/585/450/107/128/original/921b30cb557870a9.jpg" alt="Link Preview Image" /></div></div>]]></description><link>https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136586147803776</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136586147803776</guid><dc:creator><![CDATA[tomgag@infosec.exchange]]></dc:creator><pubDate>Thu, 26 Feb 2026 11:01:10 GMT</pubDate></item><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. on Thu, 26 Feb 2026 10:57:55 GMT]]></title><description><![CDATA[<p><span><a href="https://mostr.pub/users/1ad6e959c292f74de615d4c6e5ec43d0b7ec4908a55de93aa2527c46a8bd1d5b" rel="nofollow noopener">@<span>1ad6e959c292f74de615d4c6e5ec43d0b7ec4908a55de93aa2527c46a8bd1d5b</span></a></span> I'm not sure, I don't have any beefy GPU <img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f605.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--sweat_smile" style="height:23px;width:auto;vertical-align:middle" title="😅" alt="😅" /> You should ask this in the Ollama Reddit community (or similar).</p>]]></description><link>https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136573383282289</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136573383282289</guid><dc:creator><![CDATA[tomgag@infosec.exchange]]></dc:creator><pubDate>Thu, 26 Feb 2026 10:57:55 GMT</pubDate></item><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. 
on Thu, 26 Feb 2026 10:51:57 GMT]]></title><description><![CDATA[<p><span><a href="/user/tomgag%40infosec.exchange">@<span>tomgag</span></a></span> maybe I’ll check that I’m running on renewable energy before I leave a machine running over the weekend then <img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f923.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--rolling_on_the_floor_laughing" style="height:23px;width:auto;vertical-align:middle" title="🤣" alt="🤣" /></p>]]></description><link>https://board.circlewithadot.net/post/https://fosstodon.org/users/sealjay/statuses/116136549924150944</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://fosstodon.org/users/sealjay/statuses/116136549924150944</guid><dc:creator><![CDATA[sealjay@fosstodon.org]]></dc:creator><pubDate>Thu, 26 Feb 2026 10:51:57 GMT</pubDate></item><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. on Thu, 26 Feb 2026 10:50:55 GMT]]></title><description><![CDATA[<p><span><a href="/user/sealjay%40fosstodon.org" rel="nofollow noopener">@<span>sealjay</span></a></span> well, I'm running on a local CPU with 32 GiB of RAM, so I wouldn't call it "fast". 3-5 tokens per second maybe? 
I guess it's OK if you give it a task and then go to grab a coffee <img src="https://board.circlewithadot.net/assets/plugins/nodebb-plugin-emoji/emoji/android/1f605.png?v=28325c671da" class="not-responsive emoji emoji-android emoji--sweat_smile" style="height:23px;width:auto;vertical-align:middle" title="😅" alt="😅" /></p>]]></description><link>https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136545845564113</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136545845564113</guid><dc:creator><![CDATA[tomgag@infosec.exchange]]></dc:creator><pubDate>Thu, 26 Feb 2026 10:50:55 GMT</pubDate></item><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. on Thu, 26 Feb 2026 10:44:08 GMT]]></title><description><![CDATA[<p><span><a href="/user/tomgag%40infosec.exchange">@<span>tomgag</span></a></span> how fast does it feel? I tried using Foundry Local and Ollama but at the time I felt slowed down. I’d be keen to swap back to a local model given how the large providers are slowly tightening the subscription token limits.</p>]]></description><link>https://board.circlewithadot.net/post/https://fosstodon.org/users/sealjay/statuses/116136519182812009</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://fosstodon.org/users/sealjay/statuses/116136519182812009</guid><dc:creator><![CDATA[sealjay@fosstodon.org]]></dc:creator><pubDate>Thu, 26 Feb 2026 10:44:08 GMT</pubDate></item><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. 
on Thu, 26 Feb 2026 10:18:17 GMT]]></title><description><![CDATA[<p>First impressions of Mistral Small 3.2: seems pretty solid, it answers "uncomfortable" political questions quite neutrally.</p><p>I don't understand why <a href="https://infosec.exchange/tags/confer" rel="tag">#<span>confer</span></a> and <a href="https://infosec.exchange/tags/euria" rel="tag">#<span>euria</span></a> by <a href="https://infosec.exchange/tags/infomaniak" rel="tag">#<span>infomaniak</span></a> are not based on this.</p>]]></description><link>https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136417545167228</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136417545167228</guid><dc:creator><![CDATA[tomgag@infosec.exchange]]></dc:creator><pubDate>Thu, 26 Feb 2026 10:18:17 GMT</pubDate></item><item><title><![CDATA[Reply to Going into the rabbithole of testing local LLMs right now. on Thu, 26 Feb 2026 09:53:49 GMT]]></title><description><![CDATA[<p>Heretic-quantized versions of Qwen 3.5 have just been released, but even the base Qwen 3.5 model seems to have issues with Ollama currently, and I don't have the bandwidth to do a manual patch now. Trying Mistral 3.2.</p>]]></description><link>https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136321299578974</link><guid isPermaLink="true">https://board.circlewithadot.net/post/https://infosec.exchange/users/tomgag/statuses/116136321299578974</guid><dc:creator><![CDATA[tomgag@infosec.exchange]]></dc:creator><pubDate>Thu, 26 Feb 2026 09:53:49 GMT</pubDate></item></channel></rss>