
MCP Server for Ollama


MCP server for connecting Claude Desktop to Ollama LLM server

Catalog only · STDIO

Overview

MCP Server for Ollama is a Model Context Protocol (MCP) server that facilitates communication between Claude Desktop and the Ollama LLM server.

To use the MCP Server, clone the repository, configure the environment variables, install the necessary dependencies, and run the server using Python or Docker.
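The steps above can be sketched as a shell session. The repository URL, environment-variable names, and entry point below are assumptions for illustration; check the project's README for the actual values.

```shell
# Hypothetical setup sketch — URL, file names, and module name are assumed,
# not taken from the project's documentation.
git clone https://github.com/vincentf305/mcp-server-ollama.git
cd mcp-server-ollama

# Configure environment variables (variable names are assumptions)
cp .env.example .env

# Install dependencies and run with Python
pip install -r requirements.txt
python -m mcp_server_ollama
```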

  • Enables seamless communication between Claude Desktop and the Ollama LLM server.
  • Supports both Python and Docker setups for flexibility.
  • Easy configuration through environment variables and JSON files.

Use cases:

  1. Integrating AI models with desktop applications.
  2. Facilitating real-time communication between different AI systems.
  3. Enhancing the functionality of Claude Desktop with Ollama's capabilities.

Add to your AI client

Use these steps to connect MCP Server for Ollama in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-server-ollama-vincentf305": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-vincentf305"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-server-ollama-vincentf305": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-vincentf305"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-server-ollama-vincentf305": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-vincentf305"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-server-ollama-vincentf305": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-vincentf305"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-server-ollama-vincentf305": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-vincentf305"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-server-ollama-vincentf305": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-vincentf305"
      ]
    }
  }
}

FAQ

What is the purpose of the MCP Server?

The MCP Server allows Claude Desktop to communicate effectively with the Ollama LLM server, enabling enhanced AI functionalities.

Is there a Docker setup available?

Yes! The MCP Server can be built and run using Docker for easier deployment.
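A minimal Docker workflow might look like the following. The image name and flags are assumptions, not values from the project's docs; `-i` is included because STDIO-based MCP servers communicate over standard input/output.

```shell
# Hypothetical Docker sketch — image name and env file are assumed.
docker build -t mcp-server-ollama .
docker run --rm -i --env-file .env mcp-server-ollama
```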

How do I configure the server?

Configuration is done through the `.env` file and the `claude_desktop_config.json` file.
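For illustration, a `.env` file for an Ollama-backed server might carry values like these. The variable names here are assumptions (Ollama's default API port is 11434); consult the repository for the names the server actually reads.

```
# Illustrative .env — variable names are assumed, not confirmed by the project.
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=llama3
```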