MCP Coding Assistant with support for OpenAI + other LLM Providers
OpenAI Code Assistant Model Context Protocol (MCP) Server
Overview
Claude Code Python Edition is a Python recreation of Claude Code with enhanced real-time visualization, cost management, and Model Context Protocol (MCP) server capabilities. It provides a natural language interface for software development tasks and supports multiple LLM providers.
To use Claude Code, clone the repository, install the dependencies, and set up your API keys in a .env file. You can run it in CLI mode or as an MCP server.
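The setup steps above can be sketched as follows. The repository URL comes from the project page; the requirements file name and the environment variable names are assumptions based on common conventions, and the `claude.py` entry point is taken from the FAQ's `python claude.py serve` command:

```shell
# Clone the repository and enter it
git clone https://github.com/arthurcolle/openai-mcp.git
cd openai-mcp

# Install dependencies (file name assumed; check the repo for the actual one)
pip install -r requirements.txt

# Provide API keys via a .env file (variable names are assumptions)
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
EOF

# Run as an MCP server (see the FAQ below), or consult the repo's
# README for the CLI invocation
python claude.py serve
```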
- Multi-provider support for OpenAI, Anthropic, and other LLM providers.
- Model Context Protocol integration for running as an MCP server.
- Real-time tool visualization to see execution progress.
- Cost management features to track token usage and expenses.
- Comprehensive tool suite for file operations, command execution, and more.
- Enhanced UI with rich terminal interface and syntax highlighting.
- Context optimization for smart conversation management.
- Multi-agent coordination for complex problem solving.
- Assisting in software development tasks with natural language queries.
- Running multiple agents to collaborate on complex projects.
- Visualizing tool execution in real-time for better understanding.
Add to your AI client
Use these steps to connect the MCP Coding Assistant to Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "openai-mcp-arthurcolle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-openai-mcp-arthurcolle"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "openai-mcp-arthurcolle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-openai-mcp-arthurcolle"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "openai-mcp-arthurcolle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-openai-mcp-arthurcolle"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "openai-mcp-arthurcolle": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-openai-mcp-arthurcolle"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "openai-mcp-arthurcolle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-openai-mcp-arthurcolle"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "openai-mcp-arthurcolle": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-openai-mcp-arthurcolle"
      ]
    }
  }
}
FAQ
Can Claude Code work with different LLM providers?
Yes! It supports multiple providers including OpenAI and Anthropic.
Is there a cost associated with using Claude Code?
While the tool itself is free, usage costs may apply based on the LLM provider's pricing.
How can I run Claude Code as an MCP server?
You can run it by executing `python claude.py serve` after setting up your environment.
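If you cloned the repository instead of using the npx package, your MCP client can launch the local server directly. This is a minimal sketch, assuming your client supports stdio servers and that `claude.py serve` speaks MCP over stdio; the server name `openai-mcp-local` is arbitrary, and the path should point to your checkout:

```json
{
  "mcpServers": {
    "openai-mcp-local": {
      "command": "python",
      "args": ["/path/to/openai-mcp/claude.py", "serve"]
    }
  }
}
```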