@foolishowl I disagree with much of his analysis, but he mentions Ollama for running local models. I’ve found it easy to set up as well, though any models you can run locally currently aren’t as “capable” as the cloud-hosted ones. But the gap is narrowing for sure
basetwojesus@mstdn.social
Posts
- Submitted without commentary: "AI Might Be Our Best Shot At Taking Back The Open Web" by Mike Masnick https://www.techdirt.com/2026/03/25/ai-might-be-our-best-shot-at-taking-back-the-open-web/
- @rickpelletier I’m unfamiliar with how Cursor works. Do they provide their own LLM, or do you have to supply e.g. a Claude API key or something? Or is it using a local model?