
Uber Eats MCP Server


Catalog only · STDIO

Overview

The Uber Eats MCP Server is a proof of concept (POC) demonstrating how to build Model Context Protocol (MCP) servers on top of the Uber Eats platform, enabling seamless integration between large language model (LLM) applications and external tools.

To use the Uber Eats MCP Server, set up a Python environment, install the required packages, and configure your API key in the .env file. Then, you can run the MCP inspector tool to start the server.
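Concretely, the setup might look like the following. The clone URL, requirements file, and environment variable name are assumptions; check the project's README for the exact steps:

```shell
# Hypothetical walkthrough — adjust the clone URL and file names to the actual repo.
git clone https://github.com/ericzakariasson/uber-eats-mcp-server.git
cd uber-eats-mcp-server

# Create and activate a virtual environment (Python 3.12+ required)
uv venv && source .venv/bin/activate
uv pip install -r requirements.txt

# Add your LLM provider API key to .env, then launch the inspector
echo "ANTHROPIC_API_KEY=your-key-here" > .env
uv run mcp dev server.py
```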

Key features:

  • Integration with the Model Context Protocol (MCP) for LLM applications.
  • Easy setup with Python and the required packages.
  • Debugging capabilities with the MCP inspector tool.

Use cases:

  1. Building applications that require integration with LLMs and external tools.
  2. Developing custom solutions for Uber Eats using the MCP framework.
  3. Experimenting with LLM capabilities in a controlled environment.

Add to your AI client

Use these steps to connect the Uber Eats MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "uber-eats-mcp-server-ericzakariasson": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server-ericzakariasson"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "uber-eats-mcp-server-ericzakariasson": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server-ericzakariasson"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "uber-eats-mcp-server-ericzakariasson": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server-ericzakariasson"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "uber-eats-mcp-server-ericzakariasson": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server-ericzakariasson"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "uber-eats-mcp-server-ericzakariasson": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server-ericzakariasson"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "uber-eats-mcp-server-ericzakariasson": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server-ericzakariasson"
      ]
    }
  }
}

FAQ

What is MCP?

The Model Context Protocol (MCP) is an open protocol that facilitates integration between LLM applications and external tools.

What are the prerequisites for using the server?

You need Python 3.12 or higher and an API key from a supported LLM provider.
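For example, the `.env` file might contain a single key. The variable name below is an assumption; use whichever provider key the project's `.env` template expects:

```shell
# .env — hypothetical variable name; check the project's example env file
ANTHROPIC_API_KEY=sk-ant-...
```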

How do I run the server?

After setting up your environment and installing the required packages, you can run the MCP inspector tool with the command `uv run mcp dev server.py`.