Okay, there's an experimental llama.cpp flag, `--webui-mcp-proxy` (not to be used in production), which puts a proxy in front of the MCP server so the CORS headers the browser needs are in place. You also have to check "use llama-server proxy" in the webui, and then it connects properly. #s0up
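For anyone wanting to try it, the invocation would look roughly like this, a sketch only: the `--webui-mcp-proxy` flag is experimental per the note above, and the model path and port here are placeholders, not values from the original message:

```shell
# Experimental: not for production use.
# model.gguf and the port are placeholders -- adjust for your setup.
llama-server -m model.gguf --port 8080 --webui-mcp-proxy

# Then open http://localhost:8080 in a browser and, in the webui's
# settings, check "use llama-server proxy" so MCP requests are routed
# through the proxy instead of hitting the MCP server directly
# (which the browser's CORS policy would otherwise block).
```

The proxy sidesteps the usual problem that a browser page can't call an arbitrary local MCP server unless that server sends the right CORS headers; routing through llama-server keeps everything same-origin.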