Claude-LMStudio-Bridge

A bridge MCP server that allows Claude to communicate with locally running LLM models via LM Studio

Catalog only · STDIO

Overview

Claude-LMStudio-Bridge is a Model Context Protocol (MCP) server that facilitates communication between Claude and locally running LLM models via LM Studio.

To use the bridge, clone the repository, set up a virtual environment, install the required packages, and run the bridge server while ensuring LM Studio is running locally with your preferred model loaded.
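The steps above can be sketched as follows. The repository URL comes from the project listing; the requirements file name is an assumption, so check the repository's README for the exact commands.

```shell
# Sketch of the setup steps; verify file names against the repo's README.
git clone https://github.com/infinitimeless/Claude-LMStudio-Bridge_V2
cd Claude-LMStudio-Bridge_V2

python -m venv venv
source venv/bin/activate          # Windows: venv\Scripts\activate

pip install -r requirements.txt   # assumed dependency file name

# With LM Studio running locally and a model loaded:
python lmstudio_bridge.py
```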

  • Enables Claude to send prompts to local models and receive responses.
  • Allows comparison of Claude's responses with other models.
  • Facilitates access to specialized local models for specific tasks.
  • Supports running queries locally, keeping sensitive data secure.
  1. Comparing responses from Claude and other LLM models.
  2. Utilizing local models for specific tasks without API limitations.
  3. Keeping sensitive queries local to ensure privacy.
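Under the hood, LM Studio's local server exposes an OpenAI-compatible chat completions API, which is what a bridge like this talks to. The sketch below builds such a request without sending it; `localhost:1234` is LM Studio's usual default port, but verify it in the app's Local Server tab, and note this is an illustration, not the bridge's own code.

```python
import json
import urllib.request

# LM Studio's usual default for its OpenAI-compatible local server;
# confirm the port in LM Studio's "Local Server" tab.
LMSTUDIO_API_BASE = "http://localhost:1234/v1"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Build the HTTP request a bridge would send for one prompt."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return urllib.request.Request(
        f"{LMSTUDIO_API_BASE}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarize MCP in one sentence.")
```

Sending the request with `urllib.request.urlopen(req)` returns a JSON response whose generated text lives at `choices[0].message.content`, which the bridge relays back to Claude.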

Add to your AI client

Use these steps to connect Claude-LMStudio-Bridge in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "claude-lmstudio-bridge-v2-infinitimeless": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-claude-lmstudio-bridge-v2-infinitimeless"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "claude-lmstudio-bridge-v2-infinitimeless": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-claude-lmstudio-bridge-v2-infinitimeless"
      ]
    }
  }
}
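If you already have other MCP servers configured, the new entry has to be merged into the existing `mcpServers` object rather than replacing the file. This hypothetical helper (not part of the bridge) shows one safe way to do that; it is demonstrated against a throwaway path rather than the real config file.

```python
import json
import os
import tempfile
from pathlib import Path

# The server entry from the config snippet above.
SERVER_NAME = "claude-lmstudio-bridge-v2-infinitimeless"
SERVER_ENTRY = {
    "command": "npx",
    "args": [
        "-y",
        "@modelcontextprotocol/server-claude-lmstudio-bridge-v2-infinitimeless",
    ],
}

def add_server(config_path):
    """Merge SERVER_ENTRY into the config, preserving existing servers."""
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})[SERVER_NAME] = SERVER_ENTRY
    path.write_text(json.dumps(config, indent=2))
    return config

# Demo against a temp file, not the real Claude Desktop config.
demo_path = os.path.join(tempfile.mkdtemp(), "claude_desktop_config.json")
demo = add_server(demo_path)
```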

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "claude-lmstudio-bridge-v2-infinitimeless": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-claude-lmstudio-bridge-v2-infinitimeless"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "claude-lmstudio-bridge-v2-infinitimeless": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-claude-lmstudio-bridge-v2-infinitimeless"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "claude-lmstudio-bridge-v2-infinitimeless": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-claude-lmstudio-bridge-v2-infinitimeless"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "claude-lmstudio-bridge-v2-infinitimeless": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-claude-lmstudio-bridge-v2-infinitimeless"
      ]
    }
  }
}

FAQ

What are the prerequisites for using the bridge?

You need Python 3.8+, Anthropic Claude with MCP capability, and LM Studio running locally with loaded models.

How do I start the bridge server?

Run the command `python lmstudio_bridge.py` after setting up your environment and ensuring LM Studio is running.

Can I customize the connection settings?

Yes, you can modify the `LMSTUDIO_API_BASE` variable in `lmstudio_bridge.py` to change the connection port.
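A common pattern for a setting like this (shown as a sketch, not necessarily how `lmstudio_bridge.py` implements it) is to fall back to LM Studio's usual default unless an environment variable overrides it:

```python
import os

# Assumed default; LM Studio's local server typically listens on port 1234.
DEFAULT_BASE = "http://localhost:1234/v1"

def get_api_base():
    """Read the API base from the environment, falling back to the default."""
    return os.environ.get("LMSTUDIO_API_BASE", DEFAULT_BASE)

os.environ.pop("LMSTUDIO_API_BASE", None)
print(get_api_base())  # the default base URL

os.environ["LMSTUDIO_API_BASE"] = "http://localhost:5000/v1"
print(get_api_base())  # the overridden base URL
```

This keeps the port change out of the source file, so the same script works against a non-default LM Studio port without editing code.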