From frustration to fascination.
Uncategorized · 3 posts · 3 posters · 0 views
-
From frustration to fascination. llama.cpp makes local AI fun again!
I Switched From Ollama And LM Studio To llama.cpp And Absolutely Loving It
As with Ollama, I get a feature-rich CLI; llama.cpp also adds Vulkan support and takes a lot less disk space, too.
It's FOSS (itsfoss.com)
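For anyone curious what the llama.cpp workflow mentioned above looks like, here is a minimal sketch. The model path and parameter values are illustrative placeholders, not taken from the thread:

```shell
# Build llama.cpp with the Vulkan backend (the feature the article highlights)
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release

# Run the bundled CLI against a local GGUF model (placeholder path):
#   -m    path to the model file
#   -p    prompt text
#   -n    number of tokens to generate
#   -ngl  number of layers to offload to the GPU
./build/bin/llama-cli -m ./models/model.gguf -p "Hello" -n 64 -ngl 99
```

Since llama.cpp works directly with single GGUF files rather than a managed model store, this is also where the disk-space savings the post mentions come from.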
-
@itsfoss When was AI ever fun?
-
@itsfoss pushing AI slop once again, wonderful *not*
-
pixelate@tweesecake.social shared this topic