
Ollama MCP Server

An MCP Server for Ollama

Catalog only · STDIO

Overview

Ollama MCP Server is a Model Context Protocol (MCP) server designed for Ollama, facilitating seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.

To use the Ollama MCP Server, install it via npm or Smithery, start the server, and configure it in your MCP-compatible application settings.

Key features:

  • List available Ollama models
  • Pull new models from Ollama
  • Chat with models using Ollama's chat API
  • Get detailed model information
  • Automatic port management
  • Environment variable configuration

Use cases:

  1. Integrating local LLM models with applications like Claude Desktop.
  2. Managing and interacting with multiple Ollama models.
  3. Facilitating communication between different MCP-compatible applications.
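Under the hood, a chat request like the ones this server brokers is forwarded to Ollama's local `/api/chat` HTTP endpoint. A minimal sketch of such a request body (the model name and endpoint URL are illustrative assumptions, not taken from this server's source):

```python
import json

# Ollama's default local chat endpoint (illustrative; adjust host/port as needed)
OLLAMA_URL = "http://localhost:11434/api/chat"

# The JSON body Ollama's chat API expects: a model name plus a message history
payload = {
    "model": "llama3.2",  # hypothetical model name; use one you have pulled
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

An MCP client such as Claude Desktop never constructs this payload itself; the server translates MCP tool calls into requests of this shape and relays the response back.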

Add to your AI client

Use these steps to connect Ollama MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "ollama-mcp-rawveg": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-rawveg"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "ollama-mcp-rawveg": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-rawveg"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "ollama-mcp-rawveg": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-rawveg"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "ollama-mcp-rawveg": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-rawveg"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "ollama-mcp-rawveg": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-rawveg"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "ollama-mcp-rawveg": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-rawveg"
      ]
    }
  }
}

FAQ

What are the prerequisites for using Ollama MCP Server?

You need Node.js (v16 or higher), npm, and Ollama installed and running locally.

How do I start the server?

Run the command `ollama-mcp` in your terminal.

Can I change the default port?

Yes, you can specify a different port using the `PORT` environment variable.
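Clients that support per-server environment variables can pass the port through the same config block used above. A sketch, assuming your client honors the standard `env` field in its MCP server config (the port value here is illustrative):

```json
{
  "mcpServers": {
    "ollama-mcp-rawveg": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-ollama-mcp-rawveg"],
      "env": { "PORT": "3456" }
    }
  }
}
```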