MCP Server: Scalable OpenAPI Endpoint Discovery and API Request Tool
An MCP server that enables Claude to discover and call any API endpoint through semantic search. It intelligently chunks OpenAPI specifications to handle large API documentation, with built-in request execution capabilities. Perfect for integrating private APIs with Claude Desktop.
Overview
MCP Server is a scalable tool designed for OpenAPI endpoint discovery and API request handling, enabling efficient interaction with APIs through natural language queries.
To use MCP Server, deploy it via Docker with the appropriate OpenAPI JSON URL and API prefix, then interact with it through the MCP Client (Claude Desktop) by sending natural language requests.
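A Docker deployment might look like the following sketch. The image name and environment variable names (`OPENAPI_JSON_DOCS_URL`, `API_REQUEST_BASE_URL`) are assumptions for illustration only; check the project's documentation for the exact values.

```shell
# Hypothetical deployment sketch: the image tag and variable names below
# are assumptions, not confirmed against the project's documentation.
docker run -i --rm \
  -e OPENAPI_JSON_DOCS_URL="https://api.example.com/openapi.json" \
  -e API_REQUEST_BASE_URL="https://api.example.com" \
  buryhuang/mcp-server-any-openapi:latest
```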
- Utilizes remote OpenAPI JSON files without local file system access.
- Implements semantic search using an optimized MiniLM-L3 model for fast endpoint discovery.
- Supports async operations with FastAPI for improved performance.
- Handles large OpenAPI specifications by chunking endpoints for context preservation.
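The chunking idea can be sketched in plain Python: split a large OpenAPI document into one searchable text chunk per endpoint, so each chunk can be embedded and retrieved independently. This is an illustrative sketch under that assumption, not the server's actual implementation.

```python
def chunk_openapi(spec: dict) -> list[dict]:
    """Split an OpenAPI spec into one text chunk per (path, method) pair.

    Each chunk carries enough context (method, path, summary, description)
    to be embedded and searched on its own. Illustrative sketch only.
    """
    chunks = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            if method not in {"get", "post", "put", "patch", "delete"}:
                continue  # skip non-operation keys such as "parameters"
            text = " ".join(
                part for part in (
                    method.upper(),
                    path,
                    op.get("summary", ""),
                    op.get("description", ""),
                ) if part
            )
            chunks.append({"path": path, "method": method, "text": text})
    return chunks

# Example: a tiny spec yields one chunk per operation.
spec = {
    "paths": {
        "/users": {
            "get": {"summary": "List users"},
            "post": {"summary": "Create a user"},
        }
    }
}
print(len(chunk_openapi(spec)))  # → 2
```

Each chunk's text would then be passed to the embedding model, so even a multi-megabyte specification reduces to many small, independently searchable documents.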
- Discovering API endpoints based on user queries.
- Making API requests with complex parameters and headers.
- Integrating with various APIs in finance and healthcare sectors.
Add to your AI client
Use these steps to connect MCP Server: Scalable OpenAPI Endpoint Discovery and API Request Tool in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
"mcpServers": {
"mcp-server-any-openapi-baryhuang": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-server-any-openapi-baryhuang"
]
}
}
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
"mcpServers": {
"mcp-server-any-openapi-baryhuang": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-server-any-openapi-baryhuang"
]
}
}
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
"mcpServers": {
"mcp-server-any-openapi-baryhuang": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-server-any-openapi-baryhuang"
]
}
}
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
"servers": {
"mcp-server-any-openapi-baryhuang": {
"type": "stdio",
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-server-any-openapi-baryhuang"
]
}
}
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"mcp-server-any-openapi-baryhuang": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-server-any-openapi-baryhuang"
]
}
}
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
"mcpServers": {
"mcp-server-any-openapi-baryhuang": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-mcp-server-any-openapi-baryhuang"
]
}
}
}
FAQ
Can MCP Server handle large OpenAPI specifications?
Yes, it can process specifications up to 10MB efficiently by indexing individual endpoints.
Is there a cold start penalty?
Yes. If you are not using the prebuilt Docker image, expect a cold start of approximately 15 seconds while the embedding model loads.
How can I install MCP Server?
You can install it via Docker or using pip with the command: `pip install mcp-server-any-openapi`.