
MCP LLM Bridge


MCP implementation that enables communication between MCP servers and OpenAI-compatible LLMs

Catalog only · STDIO

Overview

MCP LLM Bridge is a tool that facilitates communication between Model Context Protocol (MCP) servers and OpenAI-compatible language models. It serves as a bidirectional protocol translation layer that enables the integration of MCP-compliant tools with OpenAI's function-calling interface.

To use MCP LLM Bridge, install it by following the quick start guide, configure your OpenAI API key, and then run the bridge to connect to your chosen models.

Key features:

  • Support for the OpenAI API and for local endpoints that implement the OpenAI API specification.
  • Automatic translation of MCP tool specifications into OpenAI function schemas.
  • Seamless handling of requests and responses between MCP tools and OpenAI models.

Typical use cases:

  1. Enabling applications to use OpenAI models with specialized MCP tools.
  2. Providing a standardized interface for developers working with both MCP and OpenAI technologies.
  3. Running local model implementations alongside cloud-based solutions.
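The core of the translation layer is mapping an MCP tool definition onto OpenAI's function-calling schema. The sketch below is illustrative, not the project's actual code: it assumes a hypothetical helper `mcp_tool_to_openai_function`, but the field names follow the MCP spec (`name`, `description`, `inputSchema`) and OpenAI's `tools` format (`type`, `function`, `parameters`).

```python
def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Convert one MCP tool definition into an OpenAI `tools` entry."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's `inputSchema` is already JSON Schema, which is exactly
            # what OpenAI's `parameters` field expects.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Example MCP tool definition, as an MCP server might advertise it.
mcp_tool = {
    "name": "read_file",
    "description": "Read a file from disk",
    "inputSchema": {
        "type": "object",
        "properties": {"path": {"type": "string"}},
        "required": ["path"],
    },
}

openai_tool = mcp_tool_to_openai_function(mcp_tool)
print(openai_tool["function"]["name"])  # read_file
```

The reverse direction works the same way: when the model emits a function call, the bridge forwards the call's arguments to the matching MCP tool and returns the tool's result as the function response.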

Add to your AI client

Use these steps to connect MCP LLM Bridge in Cursor, Claude, VS Code, and other MCP-compatible apps. The same JSON appears in the Use with menu above for one-click copy.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-llm-bridge-bartolli": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-bridge-bartolli"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-llm-bridge-bartolli": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-bridge-bartolli"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-llm-bridge-bartolli": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-bridge-bartolli"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-llm-bridge-bartolli": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-bridge-bartolli"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-llm-bridge-bartolli": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-bridge-bartolli"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-llm-bridge-bartolli": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-bridge-bartolli"
      ]
    }
  }
}

FAQ

Can I use MCP LLM Bridge with other models besides OpenAI?

Yes! The bridge supports any endpoint that adheres to the OpenAI API specification, enabling broad compatibility.
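In practice, switching backends means swapping the endpoint settings. The sketch below is a hedged illustration: the field names (`api_key`, `base_url`, `model`) are assumptions, not the project's actual config class, and the local example uses Ollama's OpenAI-compatible `/v1` routes; check the project's README for the exact configuration it expects.

```python
# Cloud: the standard OpenAI endpoint.
cloud_config = {
    "api_key": "sk-...",                      # real key, loaded from your .env
    "base_url": "https://api.openai.com/v1",
    "model": "gpt-4o",
}

# Local: any server exposing the OpenAI API surface, e.g. Ollama's /v1 routes.
local_config = {
    "api_key": "not-needed",                  # local servers typically ignore it
    "base_url": "http://localhost:11434/v1",
    "model": "llama3.2",
}
```

Only `base_url`, `model`, and `api_key` differ between the two; the bridge's MCP-to-OpenAI translation is identical either way.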

Is there a demo available for MCP LLM Bridge?

Yes! A demonstration GIF is included in the project documentation to showcase the functionality.

How do I configure my OpenAI credentials?

You can set your OpenAI API credentials in a `.env` file as specified in the installation instructions.
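A minimal `.env` sketch is shown below. The variable names are assumptions based on common convention; consult the project's README for the exact keys it reads.

```
OPENAI_API_KEY=sk-your-key-here
```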