tomgag@infosec.exchange (@tomgag@infosec.exchange)
About
  • Posts: 5
  • Topics: 2
  • Shares: 0
  • Groups: 0
  • Followers: 0
  • Following: 0

Posts

  • I'm no particular fan of Anthropic, but seeing some spine in this timeline is... refreshing for once.

    Link: Statement from Dario Amodei on our discussions with the Department of War — "A statement from our CEO on national security uses of AI" (www.anthropic.com)

    #anthropic #ai #darioamodei #hegseth #pentagon #trump #usa #politics

  • Going into the rabbithole of testing local LLMs right now.

    First impressions of Mistral Small 3.2: seems pretty solid; it answers "uncomfortable" political questions quite neutrally.

    I don't understand why #confer and #euria by #infomaniak are not based on this.

  • Going into the rabbithole of testing local LLMs right now.

    Heretic-quantized versions of Qwen 3.5 have just been released, but even the base Qwen 3.5 model seems to have issues with ollama currently, and I don't have the bandwidth to do a manual patch now. Trying Mistral 3.2.

  • Going into the rabbithole of testing local LLMs right now. I don't have a dedicated GPU, but 32 GiB of RAM should be enough for anyone.

    #ai #huggingface #selfhost #localai #ollama #heretic #qwen #mistral

  • You know us: at Dyne, we've never been the type to spy on our own people.

    @dyne nooo, don't go to Matrix! https://gagliardoni.net/#im_battle_2025