LOTUS-MCP
Integrating two AIs into a modernized MCP for better performance
Overview
LOTUS-MCP is an open-source solution that integrates two AI models, Mistral and Gemini, into a modernized Model Context Protocol (MCP) for enhanced performance and interoperability.
To use LOTUS-MCP, developers can follow a step-by-step guide to build a unified MCP system that connects external tools and data sources to AI models, enabling seamless processing of requests.
Key Features
- Unified interface for multiple AI models
- Context sharing across different AI systems
- Tool reusability with common connectors
- Cost optimization through smart routing
- Automatic fallback support between models
Use Cases
- Integrating AI models for enhanced data processing
- Developing custom protocols for specific business needs
- Creating a unified interface for AI-driven applications
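The smart-routing and automatic-fallback features listed above can be sketched as follows. This is a minimal illustration, not LOTUS-MCP's actual implementation: `callMistral` and `callGemini` are hypothetical stand-ins for real model clients, and the length-based cost heuristic is an assumption chosen for the example.

```typescript
// Hypothetical sketch of smart routing with automatic fallback
// between two models behind one unified interface.

type ModelCall = (prompt: string) => Promise<string>;

// Stub clients: a real integration would wrap the Mistral and Gemini APIs.
const callMistral: ModelCall = async (prompt) => {
  // Pretend Mistral rejects some inputs, to exercise the fallback path.
  if (prompt.includes("image")) throw new Error("mistral stub: unsupported input");
  return `mistral:${prompt}`;
};

const callGemini: ModelCall = async (prompt) => `gemini:${prompt}`;

// Smart routing: pick a primary model with a simple cost heuristic,
// then automatically fall back to the other model if the call fails.
async function route(prompt: string): Promise<string> {
  const order: ModelCall[] =
    prompt.length <= 40
      ? [callMistral, callGemini] // short prompts: cheaper model first
      : [callGemini, callMistral]; // long prompts: more capable model first
  let lastError: unknown;
  for (const call of order) {
    try {
      return await call(prompt);
    } catch (err) {
      lastError = err; // try the next model
    }
  }
  throw lastError;
}

route("hello").then((r) => console.log(r)); // prints "mistral:hello"
route("caption this image").then((r) => console.log(r)); // prints "gemini:caption this image" (fallback)
```

Because every model sits behind the same `route` entry point, callers never need to know which backend served the request, which is what makes the fallback transparent.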
Add to your AI client
Use these steps to connect LOTUS-MCP in Cursor, Claude, VS Code, and other MCP-compatible apps. The server entry is the same for every client; only the config file location differs.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "mcp-blue-lotus-org": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-blue-lotus-org"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "mcp-blue-lotus-org": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-blue-lotus-org"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "mcp-blue-lotus-org": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-blue-lotus-org"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "mcp-blue-lotus-org": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-blue-lotus-org"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "mcp-blue-lotus-org": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-blue-lotus-org"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "mcp-blue-lotus-org": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-blue-lotus-org"
      ]
    }
  }
}
FAQ
Is LOTUS-MCP free to use?
Yes! LOTUS-MCP is free and open-source software.
What AI models does LOTUS-MCP support?
LOTUS-MCP currently supports Mistral and Gemini models.
How can I contribute to LOTUS-MCP?
Contributions are welcome! You can contribute by submitting issues or pull requests on the GitHub repository.