
OpenAI MCP Server

Catalog only · STDIO

Overview

OpenAI MCP Server is a tool that lets you query OpenAI models directly from Claude over the Model Context Protocol (MCP).

To use the OpenAI MCP Server, you need to add the server configuration to your claude_desktop_config.json file and set up the necessary environment variables, including your OpenAI API key.
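The client snippets below show the server entry without environment variables. As a minimal sketch of the setup described above (the `env` block follows the standard MCP client config shape; the placeholder key value is hypothetical), a Claude Desktop entry that also passes your OpenAI API key might look like:

```json
{
  "mcpServers": {
    "pierrebrunelle-mcp-server-openai-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror"
      ],
      "env": {
        "OPENAI_API_KEY": "<your-openai-api-key>"
      }
    }
  }
}
```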

Key features:

  • Direct querying of OpenAI models via MCP
  • Easy setup with a configuration file
  • Supports testing with pytest

Use cases:

  1. Integrating OpenAI's language models into Claude-based applications.
  2. Automating responses in chat applications.
  3. Conducting research and development with AI models.
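As an optional sketch, the "add the server configuration" step can be scripted. The server name and entry below mirror the client snippets in the next section; the merge helper itself is an illustration, not part of this project:

```python
import json
from pathlib import Path

SERVER_NAME = "pierrebrunelle-mcp-server-openai-mcp-mirror"
SERVER_ENTRY = {
    "command": "npx",
    "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror",
    ],
}


def add_server(config_path: Path) -> dict:
    """Merge the server entry into an existing (or new) MCP config file."""
    config = {}
    if config_path.exists():
        config = json.loads(config_path.read_text())
    # Preserve any servers already configured; only add/overwrite this entry.
    config.setdefault("mcpServers", {})[SERVER_NAME] = SERVER_ENTRY
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

Point `add_server` at the config path for your client, e.g. `~/Library/Application Support/Claude/claude_desktop_config.json` on macOS.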

Add to your AI client

Use these steps to connect the OpenAI MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "pierrebrunelle-mcp-server-openai-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "pierrebrunelle-mcp-server-openai-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "pierrebrunelle-mcp-server-openai-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "pierrebrunelle-mcp-server-openai-mcp-mirror": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "pierrebrunelle-mcp-server-openai-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "pierrebrunelle-mcp-server-openai-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-pierrebrunelle-mcp-server-openai-mcp-mirror"
      ]
    }
  }
}

FAQ

What is the MCP protocol?

The Model Context Protocol (MCP) is an open standard that lets AI assistants such as Claude connect to external tools and data sources through a common interface. This server uses MCP to expose OpenAI models as a tool Claude can call.

Is there a cost associated with using OpenAI MCP Server?

The server itself is free to use, but you will need an OpenAI API key, which may have associated costs depending on usage.

How do I run tests for the OpenAI MCP Server?

You can run tests using pytest by executing the command `pytest -v test_openai.py -s` from the project root.