Overview
mcpx-py is a Python client library for interacting with large language models (LLMs) through mcp.run tools, with support for multiple AI providers.
To get started, install mcpx-py with pip or uv, set up an mcp.run session, and then use the library's chat interface or command line to interact with different AI models.
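A typical first-time setup might look like the following. The mcp.run login command and the environment-variable names are assumptions based on common provider conventions, not confirmed details; check the mcpx-py README and mcp.run docs for the exact steps.

```shell
# Install mcpx-py (either installer works)
pip install mcpx-py            # or: uv pip install mcpx-py

# Generate an mcp.run session ID (command is an assumption -- see mcp.run docs)
npx --yes -g @dylibso/mcpx@latest login

# Provider credentials (variable names are illustrative)
export MCP_RUN_SESSION_ID=...   # session ID from the step above
export ANTHROPIC_API_KEY=...    # e.g. when using Claude
```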
Features
- Supports multiple AI providers, including OpenAI, Claude, and Gemini.
- Real-time chat interface for interactive conversations with AI models.
- Tool suggestion and execution within conversations.
- Local and cloud-based AI provider support.
Use cases
- Building chatbots that leverage advanced AI models.
- Automating tasks through AI-driven commands.
- Integrating AI capabilities into Python applications.
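As a rough sketch of the chat-interface use case: the snippet below assumes a `Chat` class and a `send_message` method, which are guesses at the shape of the API rather than confirmed mcpx-py names. Consult the project README for the real interface.

```python
# Hypothetical sketch -- class and method names are assumptions,
# not the confirmed mcpx-py API; see the mcpx-py README.
from mcpx_py import Chat

# Provider and model are chosen at construction time
chat = Chat("claude-3-5-sonnet-latest")

# mcp.run tools are available to the model during the conversation
reply = chat.send_message("Summarize the contents of README.md")
print(reply)
```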
Add to your AI client
Use these steps to connect mcpx-py in Cursor, Claude Desktop, Claude Code, VS Code, Windsurf, Cline, and other MCP-compatible apps.
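All of the client configs below share one shape: a named entry with a `command` and an `args` list. As a quick sanity check, this standalone snippet (standard library only) shows how such an entry expands into the command line the client will spawn:

```python
import json
import shlex

# The same server entry used by every client config below
config = json.loads("""
{
  "mcpServers": {
    "mcpx-py-dylibso": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-mcpx-py-dylibso"]
    }
  }
}
""")

# Expand each entry into the command line the client will run
for name, server in config["mcpServers"].items():
    cmdline = shlex.join([server["command"], *server["args"]])
    print(f"{name}: {cmdline}")
```

Running it prints `mcpx-py-dylibso: npx -y @modelcontextprotocol/server-mcpx-py-dylibso`, i.e. the client launches the server over stdio via npx.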
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "mcpx-py-dylibso": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcpx-py-dylibso"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "mcpx-py-dylibso": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcpx-py-dylibso"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "mcpx-py-dylibso": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcpx-py-dylibso"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "mcpx-py-dylibso": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcpx-py-dylibso"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "mcpx-py-dylibso": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcpx-py-dylibso"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "mcpx-py-dylibso": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcpx-py-dylibso"
      ]
    }
  }
}
FAQ
**What AI providers are supported?**
mcpx-py supports several AI providers, including OpenAI, Claude, Ollama, and Gemini.
**Is mcpx-py free to use?**
Yes! mcpx-py is open-source and free to use.
**How do I set up my environment for mcpx-py?**
You need to install the library, generate a session ID using mcp.run, and set the necessary environment variables for your chosen AI provider.