  • 0 Votes
    1 Posts
    0 Views
arint@arint.info
RT @AMDRyzen: Run the Hermes Agent from @NousResearch locally on AMD Ryzen AI Max processors and Radeon GPUs with @LMStudio. Autonomous workflows, on-device, full control. Get started: https://www.amd.com/en/blogs/2026/run-hermes-agent-locally-on-amd-ryzen-ai-max-processors-and-radeon-gpus.html more at Arint.info #AMD #HermesAgent #LMStudio #OnDeviceAI #Radeon #RyzenAI #arint_info https://x.com/AMDRyzen/status/2047067299312476351#m
  • 0 Votes
    1 Posts
    1 Views
arint@arint.info
RT @Teknium: So much in this release, but the one thing many have been waiting for most is the GUI dashboard! Manage and monitor your Hermes Agent with a local web dashboard; start it with the command hermes dashboard! Nous Research (@NousResearch) Hermes Agent v0.9.0 - "The Everywhere Release" Full changelog below ↓ Video — https://nitter.net/NousResearch/status/2043770365369876979#m More at Arint.info #AI #GUI #HermesAgent #NousResearch #SoftwareUpdate #arint_info https://x.com/Teknium/status/2043771509123232230#m
  • Openclaw vs Hermes Agent

    Uncategorized openclaw hermesagent
    2
    0 Votes
    2 Posts
    4 Views
mamba@mstdn.ca
@arielf I've been working across the systems and trying to see where each is strongest. So far, I like Hermes the best. To sum it up? Less friction in using it and in having it make changes to its own configuration. Cron jobs are a good example: it just knew how to make more of them. OpenClaw would really struggle to troubleshoot on the same LLM models. Auto skill creation is also something I've found genuinely useful. If it burns a bunch of tokens figuring something out, I feel more confident it will remember the next time I need to do it. OpenClaw was a coin flip.
  • 0 Votes
    3 Posts
    2 Views
mamba@mstdn.ca
@maurice Great tips, Magnus! I've just begun experimenting with Obsidian integration and playing around with workflows. I'd love to give it access to all my notes for RAG, but I can't stomach the idea of the data going out to a public LLM.
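For what it's worth, the retrieval half of RAG over an Obsidian vault (plain Markdown files) doesn't need any external service at all. The sketch below is a minimal, fully local illustration using stdlib-only term-frequency cosine similarity; it is not part of Hermes or Obsidian, and the function names are made up for this example:

```python
# Minimal fully-local retrieval over a folder of Markdown notes.
# Pure stdlib; no note text ever leaves the machine. Illustrative
# sketch only -- a real setup would likely use local embeddings.
import math
import re
from collections import Counter
from pathlib import Path

def tokenize(text):
    """Lowercase alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(notes_dir):
    """Map each .md note path to a term-frequency Counter."""
    return {p: Counter(tokenize(p.read_text(encoding="utf-8")))
            for p in Path(notes_dir).rglob("*.md")}

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def top_notes(index, query, k=3):
    """Return the k most similar notes as (path, score) pairs."""
    q = Counter(tokenize(query))
    scored = sorted(((cosine(q, tf), p) for p, tf in index.items()),
                    reverse=True)
    return [(str(p), round(s, 3)) for s, p in scored[:k]]
```

The retrieved note text could then be fed to a locally hosted model (e.g. one running in LM Studio), so nothing leaves the box at either the retrieval or the generation step.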