Overview
OpenAI Complete MCP Server is a Model Context Protocol (MCP) server that exposes text completion capabilities to LLM clients over the MCP protocol, acting as a bridge between an MCP-compatible client and any OpenAI-compatible API.
To use the server, clone the repository, install dependencies, and start the server. You can also run it using Docker with the required environment variables.
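The steps above might look like the following; the package manager, start script, and Docker image name are assumptions, so check the repository's own README before relying on them:

```shell
# Clone the repository (URL from the project page) and enter it
git clone https://github.com/aiamblichus/mcp-openai-complete.git
cd mcp-openai-complete

# Install dependencies and start the server (npm assumed)
npm install
npm start

# Or build and run with Docker, passing the required environment
# variables (image name and build step are assumptions):
docker build -t mcp-openai-complete .
docker run \
  -e OPENAI_API_KEY="sk-your-key-here" \
  -e OPENAI_API_BASE="https://api.openai.com/v1" \
  -e OPENAI_MODEL="gpt-3.5-turbo-instruct" \
  mcp-openai-complete
```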
- Provides a single tool named "complete" for generating text completions.
- Handles asynchronous processing to avoid blocking.
- Implements timeout handling with graceful fallbacks.
- Supports cancellation of ongoing requests.
- Generating text completions for various applications.
- Integrating with LLM clients for enhanced text processing.
- Supporting developers in building applications that require text generation.
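As a sketch of how a client would invoke the server's single "complete" tool, the following builds the JSON-RPC `tools/call` request an MCP client sends over stdio. The argument names (`prompt`, `max_tokens`) are assumptions for illustration, not taken from the server's actual schema:

```javascript
// Hypothetical JSON-RPC request for calling the "complete" tool.
// Argument names are assumed; consult the server's tool schema.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "complete",
    arguments: {
      prompt: "Once upon a time", // assumed parameter name
      max_tokens: 64,             // assumed parameter name
    },
  },
};

// An MCP client serializes this and writes it to the server's stdin.
console.log(JSON.stringify(request));
```

The server processes the completion asynchronously and replies with a `tools/call` result message, which is what allows it to support timeouts and cancellation of in-flight requests.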
Add to your AI client
Use these steps to connect the OpenAI Complete MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
"mcpServers": {
"mcp-openai-complete-aiamblichus": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-openai-complete-aiamblichus"
]
}
}
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
"mcpServers": {
"mcp-openai-complete-aiamblichus": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-openai-complete-aiamblichus"
]
}
}
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
"mcpServers": {
"mcp-openai-complete-aiamblichus": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-openai-complete-aiamblichus"
]
}
}
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
"servers": {
"mcp-openai-complete-aiamblichus": {
"type": "stdio",
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-openai-complete-aiamblichus"
]
}
}
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"mcp-openai-complete-aiamblichus": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-openai-complete-aiamblichus"
]
}
}
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
"mcpServers": {
"mcp-openai-complete-aiamblichus": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-openai-complete-aiamblichus"
]
}
}
}
FAQ
What is the primary use case of this server?
The server is intended for base (text-completion) models; it does not support chat completions.
How do I configure the server?
You need to set environment variables such as OPENAI_API_KEY, OPENAI_API_BASE, and OPENAI_MODEL.
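For example, before starting the server you could export the three variables named above. The values here are placeholders (the `OPENAI_API_BASE` default and model name are assumptions):

```shell
# Placeholder values; substitute your own key, endpoint, and model.
export OPENAI_API_KEY="sk-your-key-here"
export OPENAI_API_BASE="https://api.openai.com/v1"
export OPENAI_MODEL="gpt-3.5-turbo-instruct"
```

Because the server targets any OpenAI-compatible API, `OPENAI_API_BASE` can also point at a self-hosted endpoint.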
Is there a Docker option available?
Yes, you can build and run the server using Docker with the appropriate environment variables.