MCP Proxy POC
Overview
MCP Proxy POC is a proof-of-concept project that demonstrates the capabilities of the RelevanceAI MCP (Model Context Protocol) server.
To use MCP Proxy POC, clone the repository, install the necessary packages, configure your environment variables, and run the server using the provided script.
Features
- Easy setup with npm installation
- Environment configuration for RelevanceAI authentication
- Logging capabilities for troubleshooting
Use cases
- Testing MCP proxy functionality
- Demonstrating integration with RelevanceAI services
- Developing and testing cloud-based applications
Add to your AI client
Use these steps to connect MCP Proxy POC in Cursor, Claude, VS Code, and other MCP-compatible apps.
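Every client below registers the same stdio server entry; only the surrounding config file differs. Before pasting, you can sanity-check the JSON with a few lines of Python (a sketch; the key names are taken verbatim from the snippets in this README):

```python
import json

# The shared server entry used by all client configs in this README.
CONFIG = """
{
  "mcpServers": {
    "relevanceai-mcp-server-relevanceai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-relevanceai-mcp-server-relevanceai"
      ]
    }
  }
}
"""

config = json.loads(CONFIG)  # raises json.JSONDecodeError if malformed
server = config["mcpServers"]["relevanceai-mcp-server-relevanceai"]

# The entry names a launch command plus its arguments.
print(server["command"], *server["args"])
# → npx -y @modelcontextprotocol/server-relevanceai-mcp-server-relevanceai
```

A stray comma or missing brace is the most common copy-paste failure, and `json.loads` catches it immediately.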
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "relevanceai-mcp-server-relevanceai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-relevanceai-mcp-server-relevanceai"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "relevanceai-mcp-server-relevanceai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-relevanceai-mcp-server-relevanceai"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "relevanceai-mcp-server-relevanceai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-relevanceai-mcp-server-relevanceai"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "relevanceai-mcp-server-relevanceai": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-relevanceai-mcp-server-relevanceai"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "relevanceai-mcp-server-relevanceai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-relevanceai-mcp-server-relevanceai"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "relevanceai-mcp-server-relevanceai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-relevanceai-mcp-server-relevanceai"
      ]
    }
  }
}
FAQ
What is the purpose of this project?
This project serves as a proof of concept for the RelevanceAI MCP server, showcasing its features and capabilities.
How do I troubleshoot issues?
You can check the logs in the `logs.txt` file for any errors or issues that arise during setup or execution.
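As a sketch, scanning `logs.txt` for error lines could look like the following (the sample log format is an assumption, not taken from the project):

```python
from pathlib import Path

def find_errors(log_text: str) -> list[str]:
    """Return log lines containing 'error', matched case-insensitively."""
    return [line for line in log_text.splitlines() if "error" in line.lower()]

# Sample log content; the real format of logs.txt may differ.
sample = (
    "INFO  server started\n"
    "ERROR RelevanceAI authentication failed: check your environment variables\n"
    "INFO  request handled"
)

for line in find_errors(sample):
    print(line)
```

To scan the real file, point the same helper at it: `find_errors(Path("logs.txt").read_text())`.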
Are there any limitations?
Yes. Known limitations are noted in the project documentation; refer to the repository for details.