New project: GitLab MCP Server
-
Plug any AI assistant (Claude, Copilot, Cursor…) into the entire GitLab API via Model Context Protocol.
1006 tools – full REST v4 + GraphQL
32 meta-tools to cut LLM token use
Read-only & safe (dry-run) modes
Single static binary – zero deps
Win/Linux/macOS · amd64 & arm64
Self-hosted GitLab + TLS
Written in Go from scratch.
https://github.com/jmrplens/gitlab-mcp-server
https://jmrplens.github.io/gitlab-mcp-server/
-
1006 tools to poison AI prompt?

@dr41d45 Fair point! That's exactly why meta-tools are the default mode.

The LLM doesn't see 1006 tools – it sees 32 domain dispatchers (gitlab_issue, gitlab_mr, gitlab_pipeline…), each routing to the actual operation via an 'action' parameter.
Full API coverage at a context cost of only ~32 tool definitions. Individual mode is opt-in.
Check docs: https://jmrplens.github.io/gitlab-mcp-server/tools/meta-tools/
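The dispatcher pattern described above can be sketched in Go (the project's language). This is a hypothetical illustration of the idea, not the project's actual code; the names metaTool, newIssueTool, and the toy actions are mine:

```go
package main

import (
	"errors"
	"fmt"
)

// handler is one concrete operation hidden behind a meta-tool.
type handler func(args map[string]string) (string, error)

// metaTool groups many operations under a single tool definition;
// the "action" argument selects which operation actually runs, so
// the LLM only ever sees one tool schema per domain.
type metaTool struct {
	name    string
	actions map[string]handler
}

// dispatch routes a call to the handler named by args["action"].
func (m *metaTool) dispatch(args map[string]string) (string, error) {
	h, ok := m.actions[args["action"]]
	if !ok {
		return "", errors.New(m.name + ": unknown action " + args["action"])
	}
	return h(args)
}

// newIssueTool builds a toy gitlab_issue dispatcher with two actions
// (illustrative only; the real server maps actions to REST/GraphQL ops).
func newIssueTool() *metaTool {
	return &metaTool{
		name: "gitlab_issue",
		actions: map[string]handler{
			"get": func(a map[string]string) (string, error) {
				return "GET /issues/" + a["iid"], nil
			},
			"list": func(a map[string]string) (string, error) {
				return "GET /issues", nil
			},
		},
	}
}

func main() {
	issue := newIssueTool()
	out, _ := issue.dispatch(map[string]string{"action": "get", "iid": "42"})
	fmt.Println(out) // GET /issues/42
}
```

The win is that adding a new operation only grows the action table, not the number of tool schemas the model must keep in context.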