MCP-LLM Bridge
Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools
Overview
MCP-LLM Bridge is a TypeScript implementation that connects local Large Language Models (LLMs) via Ollama to Model Context Protocol (MCP) servers, enabling the use of advanced tools similar to those used by Claude.
To use the MCP-LLM Bridge, install the Ollama and MCP servers, configure the necessary credentials, and start the bridge. You can then send prompts or commands to interact with your local LLM and leverage MCP capabilities.
- Multi-MCP support with dynamic tool routing
- Structured output validation for tool calls
- Automatic tool detection based on user input
- Comprehensive logging and error handling
- Full integration with local models for various tasks including web search and email management
- Managing files and directories through local commands
- Conducting web searches with Brave Search
- Sending and managing emails via Gmail integration
- Image generation through Flux
- Interacting with GitHub repositories
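The "automatic tool detection" and "dynamic tool routing" features above can be sketched as a keyword-based dispatcher. This is an illustrative simplification, not the bridge's actual implementation; the tool names and handlers below are hypothetical.

```typescript
// Hypothetical sketch of keyword-based tool detection and routing.
// Tool names, keywords, and handlers are illustrative only.

type ToolHandler = (input: string) => string;

interface ToolRoute {
  name: string;
  keywords: string[];
  handler: ToolHandler;
}

// Each MCP server advertises triggers; the bridge picks one per prompt.
const tools: ToolRoute[] = [
  { name: "brave_search", keywords: ["search", "look up"], handler: (q) => `searching: ${q}` },
  { name: "gmail", keywords: ["email", "send mail"], handler: (q) => `emailing: ${q}` },
  { name: "filesystem", keywords: ["file", "directory"], handler: (q) => `fs op: ${q}` },
];

// Return the first matching tool's result, or null to fall back
// to a plain LLM response with no tool call.
function routeTool(prompt: string): string | null {
  const lower = prompt.toLowerCase();
  for (const tool of tools) {
    if (tool.keywords.some((k) => lower.includes(k))) {
      return tool.handler(prompt);
    }
  }
  return null;
}
```

In the real bridge, the handler would forward a structured tool call to the matching MCP server and validate the model's output against the tool's schema before dispatching.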
Add to your AI client
Use these steps to connect MCP-LLM Bridge in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "ollama-mcp-bridge-patruff": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-bridge-patruff"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "ollama-mcp-bridge-patruff": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-bridge-patruff"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "ollama-mcp-bridge-patruff": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-bridge-patruff"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "ollama-mcp-bridge-patruff": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-bridge-patruff"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "ollama-mcp-bridge-patruff": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-bridge-patruff"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "ollama-mcp-bridge-patruff": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-ollama-mcp-bridge-patruff"
      ]
    }
  }
}
FAQ
How do I set up the MCP-LLM Bridge?
Install Ollama and the required MCP servers, set the appropriate credentials, and configure the bridge using `bridge_config.json`.
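A hypothetical `bridge_config.json`, assuming the file pairs MCP server launch commands with Ollama model settings; all field values below are illustrative and the exact schema should be checked against the repository:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"]
    }
  },
  "llm": {
    "model": "qwen2.5-coder",
    "baseUrl": "http://localhost:11434"
  }
}
```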
Can this bridge work with any local LLM?
Yes, as long as the LLM is compatible with the Ollama framework.
Is it necessary to have an internet connection?
No, once set up, the bridge operates entirely locally, utilizing open-source models.