MCP server wrapping local Ollama models for offload from API-priced orchestrators. Nine stdio tools: generation, summarisation, analysis, drafting, code tasks (docstring/test/explain/review/types/refactor-suggest), diff-driven tasks (commit-message/pr-description/changelog/summary/impact), mechanical transforms, and model management (list/pull). Apache-2.0.
mcp-ollama is free to use: it is Apache-2.0 licensed open source and runs against local Ollama models, so there are no usage fees or paid tiers.
mcp-ollama is listed under the Communication category in the AgentForge MCP registry.
mcp-ollama runs locally over stdio rather than as a hosted service, so registry uptime and response-time metrics do not apply; latency depends on your hardware and the Ollama model in use.
To connect mcp-ollama, click the "Connect Agent" button on this page to get the configuration snippet. Add it to your MCP client (Claude Desktop, Cursor, or any MCP-compatible tool). Your AI agent will then have access to all of mcp-ollama's tools via the Model Context Protocol.
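As a rough illustration, a Claude Desktop entry for a local stdio MCP server follows the shape below. This is a sketch only: the `command` value shown here is an assumption, and `OLLAMA_HOST` is Ollama's standard environment variable for pointing at a non-default Ollama endpoint. Use the snippet from the "Connect Agent" button for the actual values.

```json
{
  "mcpServers": {
    "mcp-ollama": {
      "command": "mcp-ollama",
      "args": [],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"
      }
    }
  }
}
```

The entry goes in your MCP client's configuration file (for Claude Desktop, `claude_desktop_config.json`); restart the client after editing so it relaunches the server over stdio.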