MCP Web UI

MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.

Catalog only · STDIO

Overview

To use MCP Web UI, clone the repository, configure your environment with API keys for the desired LLM providers, and run the application either locally or via Docker.
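As a sketch, configuring the environment before launch might look like the following. The variable names below are illustrative assumptions following common provider conventions, not confirmed configuration keys from the project; check its README for the actual names.

```shell
# Hypothetical environment setup (variable names are assumptions, not
# confirmed by the project): export API keys for the providers you plan
# to use, then start the app locally or via Docker.
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export OPENAI_API_KEY="sk-placeholder"
export OLLAMA_HOST="http://localhost:11434"  # only if using a local Ollama
```

Only the providers you actually enable need a key; unused variables can be left unset.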

Key features:

  • Multi-provider LLM integration (Anthropic, OpenAI, Ollama, OpenRouter)
  • Intuitive chat interface with real-time response streaming
  • Dynamic configuration management and advanced context aggregation
  • Persistent chat history using BoltDB

Common use cases:

  1. Interacting with various AI language models through a unified interface.
  2. Real-time chat experiences for customer support or educational purposes.
  3. Managing context and configurations for different LLMs seamlessly.

Add to your AI client

Use these steps to connect MCP Web UI in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-web-ui-megagrindstone": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-web-ui-megagrindstone"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-web-ui-megagrindstone": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-web-ui-megagrindstone"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-web-ui-megagrindstone": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-web-ui-megagrindstone"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-web-ui-megagrindstone": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-web-ui-megagrindstone"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-web-ui-megagrindstone": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-web-ui-megagrindstone"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-web-ui-megagrindstone": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-web-ui-megagrindstone"
      ]
    }
  }
}

FAQ

Can I use multiple LLM providers?

Yes. MCP Web UI supports multiple providers (Anthropic, OpenAI, Ollama, and OpenRouter), so you can switch between them as needed.

Is there a demo available?

Yes. A demo video showcasing the features is available on YouTube.

What are the prerequisites for installation?

You need Go 1.23+, Docker (optional), and API keys for the LLM providers.
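The prerequisite check above can be sketched as a small shell helper. The tool names come from the FAQ; the helper itself is illustrative and not part of the project.

```shell
# Sketch: report whether a prerequisite command is available, and print
# its version if so. Always exits 0 so a missing optional tool (Docker)
# does not abort the check.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    "$1" "$2"
  else
    echo "$1 not found"
  fi
}

check_tool go version        # Go 1.23+ is required to build from source
check_tool docker --version  # Docker is optional, for containerized runs
```

Running this prints either a version string or a "not found" notice per tool, which is enough to decide between a local build and the Docker path.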