# ollama-MCP-server

## Overview
The ollama-MCP-server is a Model Context Protocol (MCP) server that facilitates seamless integration between local Ollama LLM instances and MCP-compatible applications, providing advanced task decomposition, evaluation, and workflow management.
To use the ollama-MCP-server, install it via pip, configure your environment variables, and run the server. You can then interact with it using various tools to manage tasks and evaluate results.
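As a sketch, installation and startup might look like the following. The `pip` package name comes from the FAQ below, but the `OLLAMA_HOST` variable name and the `ollama-mcp-server` entry-point command are assumptions based on common conventions, so check the project's documentation for the exact names:

```shell
# Install the server from PyPI (package name per the FAQ below).
pip install ollama-mcp-server

# Point the server at a local Ollama instance; the variable name
# OLLAMA_HOST is an assumption -- verify against the project docs.
export OLLAMA_HOST=http://localhost:11434

# Start the server (entry-point name is an assumption).
ollama-mcp-server
```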
- Task decomposition for complex problems
- Result evaluation and validation
- Management and execution of Ollama models
- Standardized communication via MCP protocol
- Advanced error handling with detailed messages
- Performance optimizations like connection pooling and LRU caching
- Decomposing complex tasks into manageable subtasks.
- Evaluating task results against specified criteria.
- Running queries on Ollama models for various applications.
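For illustration, an MCP client invokes capabilities like these through the protocol's standard `tools/call` request. The tool name `decompose-task` and its arguments below are hypothetical, shown only to indicate the JSON-RPC message shape defined by MCP:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "decompose-task",
    "arguments": {
      "task": "Build a REST API with authentication",
      "granularity": "medium"
    }
  }
}
```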
## Add to your AI client
Use these steps to connect ollama-MCP-server in Cursor, Claude, VS Code, and other MCP-compatible apps.
### Cursor

Add this to your `.cursor/mcp.json` file in your project root, then restart Cursor.

`.cursor/mcp.json`

```json
{
  "mcpServers": {
    "ollama-mcp-server-newaitees": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-server-newaitees"
      ]
    }
  }
}
```

### Claude Desktop
Add this server entry to the `mcpServers` object in your Claude Desktop config, then restart the app.

`~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows)

```json
{
  "mcpServers": {
    "ollama-mcp-server-newaitees": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-server-newaitees"
      ]
    }
  }
}
```

### Claude Code
Add this to your project's `.mcp.json` file. Claude Code will detect it automatically.

`.mcp.json` (project root)

```json
{
  "mcpServers": {
    "ollama-mcp-server-newaitees": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-server-newaitees"
      ]
    }
  }
}
```

### VS Code (Copilot)
Add this to your `.vscode/mcp.json` file. Requires the GitHub Copilot extension with MCP support enabled.

`.vscode/mcp.json`

```json
{
  "servers": {
    "ollama-mcp-server-newaitees": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-server-newaitees"
      ]
    }
  }
}
```

### Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.

`~/.codeium/windsurf/mcp_config.json`

```json
{
  "mcpServers": {
    "ollama-mcp-server-newaitees": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-server-newaitees"
      ]
    }
  }
}
```

### Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

```json
{
  "mcpServers": {
    "ollama-mcp-server-newaitees": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-server-newaitees"
      ]
    }
  }
}
```

## FAQ
**What is the purpose of the ollama-MCP-server?**
It serves as a bridge between local Ollama LLM instances and applications using the MCP protocol, enhancing task management and evaluation.

**How do I install the ollama-MCP-server?**
You can install it using pip with the command `pip install ollama-mcp-server`.

**Can I customize the server settings?**
Yes, you can adjust performance-related settings in the `config.py` file.
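The exact option names in `config.py` are not documented here; as a hypothetical sketch, the performance settings mentioned above (connection pooling and LRU caching) might be exposed as module-level constants like this:

```python
# config.py -- hypothetical settings; the names below are assumptions
# for illustration, not the project's documented options.

# Maximum number of pooled HTTP connections to the Ollama instance.
CONNECTION_POOL_SIZE = 10

# Maximum number of entries kept in the LRU response cache.
LRU_CACHE_SIZE = 128

# Base URL of the local Ollama server (Ollama's default port is 11434).
OLLAMA_BASE_URL = "http://localhost:11434"
```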