OpenAI MCP Server
Query OpenAI models directly from Claude using MCP protocol.
Overview
OpenAI MCP Server allows users to query OpenAI models directly from Claude using the MCP protocol.
To use the OpenAI MCP Server, you need to add the server configuration to your claude_desktop_config.json file and set up the necessary environment variables, including your OpenAI API key.
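The client-specific snippets below show the basic server entry; none of them include the environment variables mentioned above. One common way to supply the API key is an `env` block in the same entry — a sketch, assuming the server reads the standard `OPENAI_API_KEY` variable:

```json
{
  "mcpServers": {
    "mcp-server-openai-pierrebrunelle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-pierrebrunelle"
      ],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Replace the placeholder value with your own key; keeping it in the config file means it should not be committed to version control.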
Features
- Direct querying of OpenAI models from Claude.
- Easy setup with configuration in JSON format.
- Supports Python for development and testing.

Use cases
- Integrating OpenAI's language models into desktop applications.
- Automating responses in chat applications using OpenAI's capabilities.
- Testing and developing new features with OpenAI's API.
Add to your AI client
Use these steps to connect the OpenAI MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "mcp-server-openai-pierrebrunelle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-pierrebrunelle"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "mcp-server-openai-pierrebrunelle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-pierrebrunelle"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "mcp-server-openai-pierrebrunelle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-pierrebrunelle"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "mcp-server-openai-pierrebrunelle": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-pierrebrunelle"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "mcp-server-openai-pierrebrunelle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-pierrebrunelle"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "mcp-server-openai-pierrebrunelle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-openai-pierrebrunelle"
      ]
    }
  }
}

FAQ
What is the MCP protocol?
MCP (Model Context Protocol) is an open standard that lets AI assistants such as Claude connect to external tools and data sources through a common interface. This server uses MCP to expose OpenAI model queries as a tool Claude can call.
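At the wire level, MCP clients and servers typically exchange JSON-RPC 2.0 messages over stdio. The sketch below shows the shape of a `tools/call` request a client might send; the tool name "ask-openai" and its argument are illustrative, not taken from this server's actual schema:

```python
import json

# Build a JSON-RPC 2.0 request of the kind an MCP client sends over
# stdio to invoke a server-provided tool. "ask-openai" is a
# hypothetical tool name used for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ask-openai",
        "arguments": {"query": "What is MCP?"},
    },
}

# Messages are serialized as JSON, one per line, on the transport.
line = json.dumps(request)
print(line)
```

The server replies with a JSON-RPC response carrying the same `id` and the tool's result, which the client then surfaces to the model.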
Is there a license for using OpenAI MCP Server?
Yes, it is licensed under the MIT License, allowing for free use and modification.
How do I run tests for the OpenAI MCP Server?
You can run tests using pytest by executing `pytest -v test_openai.py -s` from the project root.
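As a rough illustration of what such a pytest file can look like, here is a minimal sketch; the helper `ask_openai` and its injectable client are hypothetical, chosen so the test runs without a live API key, and the repo's real `test_openai.py` may be structured differently:

```python
# test_openai.py -- minimal pytest-style sketch. The injectable
# `client` lets the test run without calling the real OpenAI API.

def ask_openai(prompt: str, client=None) -> str:
    """Hypothetical helper: send `prompt` to OpenAI and return the reply.

    `client` is any callable taking the prompt and returning a string;
    a stub is used by default so the function works offline.
    """
    client = client or (lambda p: "stub reply to: " + p)
    return client(prompt)


def test_ask_openai_returns_text():
    # Inject a fake client so the assertion is deterministic.
    reply = ask_openai("Hello", client=lambda p: "Hi: " + p)
    assert isinstance(reply, str)
    assert reply == "Hi: Hello"
```

With a file like this in the project root, `pytest -v test_openai.py -s` discovers and runs `test_ask_openai_returns_text`.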