
MCP LLM Bridge

Source

A simple bridge from Ollama to a fetch URL MCP server

Catalog only · STDIO

Overview

MCP LLM Bridge is a tool that connects Model Context Protocol (MCP) servers to OpenAI-compatible large language models (LLMs) like Ollama, facilitating seamless communication between them.
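"OpenAI-compatible" here means the endpoint accepts the standard chat-completions request shape; Ollama exposes such an endpoint at `/v1`. A minimal sketch of that payload (the model name and URL are illustrative assumptions, not values from this project):

```python
import json

# The JSON body an OpenAI-compatible chat-completions endpoint expects.
# Model name and endpoint below are illustrative, not project defaults.
OLLAMA_BASE_URL = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible API

request_body = {
    "model": "llama3.2",  # any model the endpoint actually serves
    "messages": [
        {"role": "user", "content": "Hello from the bridge"},
    ],
}

payload = json.dumps(request_body)
```

Because the bridge only assumes this request/response shape, any server speaking the OpenAI API specification can sit on the LLM side.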

To use MCP LLM Bridge, follow these steps:

  1. Install the necessary components using the provided installation script.
  2. Clone the repository and navigate to the project directory.
  3. Set up a virtual environment and install the required packages.
  4. Configure the bridge parameters in the main Python file to connect to your MCP server and LLM.
Features

  • Connects MCP servers to OpenAI-compatible LLMs.
  • Supports any endpoint implementing the OpenAI API specification.
  • Easy installation and setup process.

Use cases

  1. Integrating custom LLMs with MCP servers for enhanced functionality.
  2. Facilitating communication between different AI models and protocols.
  3. Enabling developers to test and deploy LLMs in various environments.
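The bridge parameters mentioned in step 4 might be bundled roughly as in the sketch below. The class and field names here are illustrative assumptions, not the project's actual API; check the main Python file for the real names.

```python
from dataclasses import dataclass


@dataclass
class BridgeConfig:
    """Hypothetical bundle of the bridge parameters from step 4."""

    mcp_server_command: str  # how to launch the MCP server over STDIO
    llm_base_url: str        # any OpenAI-compatible endpoint
    llm_model: str           # a model name served by that endpoint


# Example values: a fetch MCP server bridged to a local Ollama instance.
config = BridgeConfig(
    mcp_server_command="uvx mcp-server-fetch",
    llm_base_url="http://localhost:11434/v1",
    llm_model="llama3.2",
)
```

Keeping these three pieces together makes it easy to swap either side of the bridge: point `llm_base_url` at a different OpenAI-compatible endpoint, or change `mcp_server_command` to launch a different MCP server.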

Add to your AI client

Use these steps to connect MCP LLM Bridge in Cursor, Claude, VS Code, and other MCP-compatible apps. The same JSON is used in each client; only the config file location differs.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "simple-mcp-ollama-bridge-virajsharma2000": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-simple-mcp-ollama-bridge-virajsharma2000"
      ]
    }
  }
}
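Config typos are a common cause of silently failing MCP servers. As a quick sanity check, the JSON above can be validated with Python's standard `json` module (the server key matches the one used throughout this page):

```python
import json

# Paste the contents of .cursor/mcp.json and verify it parses
# and contains the expected server entry.
raw = """
{
  "mcpServers": {
    "simple-mcp-ollama-bridge-virajsharma2000": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-simple-mcp-ollama-bridge-virajsharma2000"
      ]
    }
  }
}
"""

config = json.loads(raw)  # raises ValueError on malformed JSON
server = config["mcpServers"]["simple-mcp-ollama-bridge-virajsharma2000"]
```

The same check applies to every config block below, since they all share this structure (VS Code uses a top-level `"servers"` key instead of `"mcpServers"`).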

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "simple-mcp-ollama-bridge-virajsharma2000": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-simple-mcp-ollama-bridge-virajsharma2000"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "simple-mcp-ollama-bridge-virajsharma2000": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-simple-mcp-ollama-bridge-virajsharma2000"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "simple-mcp-ollama-bridge-virajsharma2000": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-simple-mcp-ollama-bridge-virajsharma2000"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "simple-mcp-ollama-bridge-virajsharma2000": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-simple-mcp-ollama-bridge-virajsharma2000"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "simple-mcp-ollama-bridge-virajsharma2000": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-simple-mcp-ollama-bridge-virajsharma2000"
      ]
    }
  }
}

FAQ

What is the license for MCP LLM Bridge?

MCP LLM Bridge is licensed under the MIT license.

Can I contribute to the project?

Yes! Contributions are welcome through pull requests.

What programming language is used for MCP LLM Bridge?

The project is developed in Python.