
LLM Bridge MCP


A model-agnostic Model Context Protocol (MCP) server that enables seamless integration with various Large Language Models (LLMs) such as GPT, DeepSeek, and Claude.

Catalog only · STDIO

Overview

LLM Bridge MCP is a model-agnostic Model Context Protocol (MCP) server that provides a single interface to multiple Large Language Models (LLMs), including GPT, DeepSeek, and Claude.

To use LLM Bridge MCP, clone the repository, install the necessary dependencies, and configure your API keys in a .env file. You can then run the server and connect it to your applications.
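For a from-source setup, those steps might look like the following sketch. The repository URL, the `uv` package manager, the entry point, and the environment variable names are assumptions based on common conventions; this page does not confirm them, so check the repository's own README.

```shell
# Clone the repository (URL assumed; substitute the actual repo)
git clone https://github.com/sjquant/llm-bridge-mcp.git
cd llm-bridge-mcp

# Install dependencies (package manager assumed)
uv sync

# Configure API keys in a .env file (variable names are assumptions)
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...
EOF

# Run the server (entry point assumed)
uv run main.py
```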

  • Unified interface for multiple LLM providers, including OpenAI, Anthropic, and Google.
  • Built with Pydantic AI for type safety and validation.
  • Customizable parameters such as temperature and max tokens.
  • Usage tracking and metrics.

Common use cases:

  1. Integrating multiple LLMs into a single application.
  2. Switching between LLM providers without code changes.
  3. Customizing model parameters for specific tasks.
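Once connected, an MCP client talks to the server over JSON-RPC on STDIO. A request to invoke a tool might look like the fragment below; the envelope (`tools/call`) follows the standard MCP schema, but the tool name `run_llm` and its argument names are illustrative assumptions, not confirmed by this page. The `temperature` and `max_tokens` parameters correspond to the customizable parameters listed above.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_llm",
    "arguments": {
      "prompt": "Summarize this text in one sentence.",
      "model": "openai:gpt-4o",
      "temperature": 0.7,
      "max_tokens": 256
    }
  }
}
```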

Add to your AI client

Use these steps to connect LLM Bridge MCP in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "llm-bridge-mcp-sjquant": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-bridge-mcp-sjquant"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "llm-bridge-mcp-sjquant": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-bridge-mcp-sjquant"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "llm-bridge-mcp-sjquant": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-bridge-mcp-sjquant"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "llm-bridge-mcp-sjquant": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-bridge-mcp-sjquant"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "llm-bridge-mcp-sjquant": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-bridge-mcp-sjquant"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "llm-bridge-mcp-sjquant": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-bridge-mcp-sjquant"
      ]
    }
  }
}

FAQ

Can I use LLM Bridge MCP with any LLM?

Yes! LLM Bridge MCP is designed to work with various LLMs through a standardized interface.

Is LLM Bridge MCP free to use?

Yes! LLM Bridge MCP is open-source and free to use.

How do I troubleshoot common issues?

Most issues come down to configuration: check that your MCP config file is valid JSON, that your API keys are set in the .env file, and that all dependencies are installed. Restart your client after any configuration change.
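As a first diagnostic step, you can launch the server directly with the same command your client uses; if it fails to start in a terminal, it will also fail inside the client. The package name below is taken from the configs above, and the config path is just one example (it varies by client).

```shell
# Run the server manually to surface startup errors
npx -y @modelcontextprotocol/server-llm-bridge-mcp-sjquant

# Check that your client config is well-formed JSON (path varies by client)
python -m json.tool .cursor/mcp.json
```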