Ollama MCP Server

Community-maintained Model Context Protocol bridge that exposes Ollama's local HTTP API—model listing, pulls, chat, and OpenAI-compatible completions—to MCP clients such as Claude Desktop and Cursor. Published on npm as `ollama-mcp-server` (maintained fork of NightTrek/Ollama-mcp); requires a running Ollama daemon reachable at `OLLAMA_HOST` (default `http://127.0.0.1:11434`).
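
Registering the server with a client is a JSON config entry. The sketch below targets Claude Desktop's `claude_desktop_config.json`; the `npx` invocation assumes the package ships a CLI entry point under its package name, so check the package README for the exact command and arguments:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp-server"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"
      }
    }
  }
}
```

Cursor's MCP settings accept a similar `mcpServers` entry; set `OLLAMA_HOST` only if your daemon is not at the default address.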

Category: Developer Tools
Install: npm
Runtime: Node.js
Tags: ollama, local-llm, agents

Use cases

  • Run coding agents entirely against local models without cloud inference
  • Pull and manage Ollama models from within an MCP-enabled assistant
  • Prototype multimodal prompts when vision-capable models are installed locally
  • Swap between cloud and local models during development with the same MCP surface (see the sketch after this list)
  • Keep sensitive prompts on hardware you control while still using agent UX
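
Because Ollama also exposes an OpenAI-compatible endpoint, switching an agent between cloud and local inference can be as small as changing the base URL and model name. A minimal TypeScript sketch (Node 18+ for built-in `fetch`), assuming a model such as `llama3` has already been pulled:

```typescript
// Sketch: call Ollama's OpenAI-compatible endpoint directly.
// Assumes the daemon is running and "llama3" is installed locally.
const OLLAMA_HOST = process.env.OLLAMA_HOST ?? "http://127.0.0.1:11434";

async function localCompletion(prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_HOST}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // any locally installed model tag
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

localCompletion("Explain MCP in one sentence.").then(console.log);
```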

Works with

  • Claude Desktop
  • Cursor
  • Codex

Frequently Asked Questions

Is this an official Ollama project?
No—the npm package and repository are community-maintained; they wrap Ollama's documented HTTP API. Always review upstream release notes before upgrading.
What prerequisites are required?
Install Ollama from ollama.com and ensure the daemon is running. Configure `OLLAMA_HOST` if you use a non-default address or remote endpoint.
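
One quick way to confirm the daemon is reachable, and to see which models are installed, is to hit the same listing endpoint the bridge wraps. A small TypeScript sketch, assuming Node 18+ for the built-in `fetch`:

```typescript
// Sketch: verify the Ollama daemon is reachable and list installed models.
// GET /api/tags is Ollama's documented model-listing endpoint.
const OLLAMA_HOST = process.env.OLLAMA_HOST ?? "http://127.0.0.1:11434";

async function listModels(): Promise<void> {
  const res = await fetch(`${OLLAMA_HOST}/api/tags`);
  if (!res.ok) throw new Error(`Daemon not reachable: HTTP ${res.status}`);
  const { models } = await res.json();
  for (const m of models) console.log(m.name); // e.g. "llama3:latest"
}

listModels().catch(console.error);
```
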
Does chat streaming work over MCP stdio?
The upstream documentation notes that in stdio mode responses return after completion rather than token streaming; plan UX accordingly.
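
In practice a call blocks until the full reply is available. The sketch below goes straight to Ollama's documented `/api/chat` endpoint with `stream: false` to mirror the behavior a stdio-mode MCP client will observe; the model name is an assumption:

```typescript
// Sketch: non-streaming chat, mirroring what stdio-mode MCP clients see:
// the whole response arrives at once rather than token by token.
const OLLAMA_HOST = process.env.OLLAMA_HOST ?? "http://127.0.0.1:11434";

async function chatOnce(prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_HOST}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // assumes this model is already pulled
      messages: [{ role: "user", content: prompt }],
      stream: false,   // one complete JSON response, no token stream
    }),
  });
  const data = await res.json();
  return data.message.content; // /api/chat returns { message: { role, content }, ... }
}

chatOnce("Summarize this repo in one line.").then(console.log);
```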
