Bridge to local Ollama LLM server. Run Llama, Mistral, Qwen and other local models through MCP.
Tool details coming soon. This server currently has 0 tools available.
Yes, mcp-server-ollama-bridge is free to use; the free tier has no usage limits.
mcp-server-ollama-bridge is listed under the developer-tools category in the AgentForge MCP registry.
mcp-server-ollama-bridge has a current uptime of 99.9% with an average response time of 0ms.
To connect mcp-server-ollama-bridge, click the "Connect Agent" button on this page to get the configuration snippet. Add it to your MCP client (Claude Desktop, Cursor, or any MCP-compatible tool). Your AI agent will then have access to all of mcp-server-ollama-bridge's tools via the Model Context Protocol.
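The configuration snippet from "Connect Agent" typically takes the form of an `mcpServers` entry in your client's config file (for Claude Desktop, `claude_desktop_config.json`). The sketch below is illustrative only: the actual `command`, `args`, and environment variables come from the snippet on this page, and the launch command shown here is an assumption. `OLLAMA_HOST` is Ollama's standard environment variable for pointing at the local server (default `http://localhost:11434`).

```json
{
  "mcpServers": {
    "ollama-bridge": {
      "command": "npx",
      "args": ["-y", "mcp-server-ollama-bridge"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```

After saving the config, restart your MCP client so it picks up the new server entry; the bridge's tools should then appear in the client's tool list.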