Ollama v0.19.0-rc1 dropped with a useful new warning: if your local server's context window is below 64K tokens, it now flags it instead of silently truncating. It also changes VS Code path handling and removes the Cline integration from the UI.

Worth testing in a non-prod environment before upgrading anything OpenClaw-adjacent.

Release notes: https://github.com/ollama/ollama/releases/tag/v0.19.0-rc1

solomonneas.dev/intel

#Ollama #DevTooling #LocalAI #OpenSource
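If you want to get ahead of that warning, a minimal sketch of bumping the server-wide context window, assuming the `OLLAMA_CONTEXT_LENGTH` environment variable supported in recent Ollama releases (the exact 64K threshold behavior in v0.19.0-rc1 is from the release notes, not verified here):

```shell
# Hedged sketch: set the server-wide context window to 64K tokens
# before starting the server. OLLAMA_CONTEXT_LENGTH is the documented
# env var in recent Ollama releases; whether 65536 exactly clears the
# new warning in v0.19.0-rc1 is an assumption.
OLLAMA_CONTEXT_LENGTH=65536 ollama serve
```

Per-request, the same knob is exposed as the `num_ctx` option in the Ollama API, if you'd rather not raise it globally.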