Python MCP Server & Client: Built from Scratch (从0到1)
An MCP server for querying the technical documentation of mainstream agent frameworks, supporting both the stdio and SSE transport protocols. Supported frameworks: langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.
Overview
Python MCP Server & Client implements the Model Context Protocol (MCP) to provide a standardized interface for AI models, connecting them to external data sources and tools such as file systems, databases, and APIs.
To use the MCP server, install the required packages, initialize the project, and run the server using the provided commands. The client can then be configured to connect to the server and process queries.
- Supports multiple transport protocols (stdio and SSE)
- Provides a unified interface for different AI model vendors
- Allows querying of technical documentation for various frameworks
- Connecting AI models to various data sources seamlessly.
- Querying and retrieving documentation for frameworks like Langchain and OpenAI Agents.
- Facilitating communication between AI models and external tools.
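Under the hood, MCP messages are JSON-RPC 2.0 objects exchanged over the chosen transport (stdio or SSE). The following is an illustrative sketch of how a stdio server might dispatch a documentation-lookup tool call; the tool name `get_docs`, the framework-to-URL table, and the message shapes are assumptions for illustration, not this project's actual implementation:

```python
import json

# Illustrative mapping only -- the real server supports more frameworks
# and fetches page content rather than returning bare URLs.
DOC_SOURCES = {
    "langchain": "https://python.langchain.com/docs/",
    "openai-agents-sdk": "https://openai.github.io/openai-agents-python/",
}

def handle_message(msg: dict) -> dict:
    """Dispatch one JSON-RPC request and build the JSON-RPC response."""
    if msg.get("method") == "tools/list":
        result = {"tools": [{"name": "get_docs",
                             "description": "Fetch docs for a framework"}]}
    elif msg.get("method") == "tools/call":
        framework = msg["params"]["arguments"]["framework"]
        url = DOC_SOURCES.get(framework, "unknown framework")
        result = {"content": [{"type": "text", "text": url}]}
    else:
        return {"jsonrpc": "2.0", "id": msg.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": msg.get("id"), "result": result}

# One request/response round trip, as it would appear on stdin/stdout:
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/call",
           "params": {"name": "get_docs",
                      "arguments": {"framework": "langchain"}}}
print(json.dumps(handle_message(request)))
```

In practice the official `mcp` Python SDK handles this framing for you; the sketch only shows the wire-level shape that all the client configurations below ultimately speak.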
Add to your AI client
Use these steps to connect Python MCP Server & Client in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json

```json
{
  "mcpServers": {
    "python-mcp-server-client-gobinfan": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-python-mcp-server-client-gobinfan"
      ]
    }
  }
}
```

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

```json
{
  "mcpServers": {
    "python-mcp-server-client-gobinfan": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-python-mcp-server-client-gobinfan"
      ]
    }
  }
}
```

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)

```json
{
  "mcpServers": {
    "python-mcp-server-client-gobinfan": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-python-mcp-server-client-gobinfan"
      ]
    }
  }
}
```

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json

```json
{
  "servers": {
    "python-mcp-server-client-gobinfan": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-python-mcp-server-client-gobinfan"
      ]
    }
  }
}
```

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json

```json
{
  "mcpServers": {
    "python-mcp-server-client-gobinfan": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-python-mcp-server-client-gobinfan"
      ]
    }
  }
}
```

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)

```json
{
  "mcpServers": {
    "python-mcp-server-client-gobinfan": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-python-mcp-server-client-gobinfan"
      ]
    }
  }
}
```

FAQ
What is MCP?
MCP stands for Model Context Protocol, which standardizes the way AI models interact with external tools and data sources.
How do I install the MCP Server?
Clone the project repository and follow the setup instructions in its documentation: install the required packages, initialize the project, and run the server with the provided commands.
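A typical setup for a Python MCP project looks roughly like the following; the repository URL placeholder, directory name, entry-point file, and `--transport` flag are assumptions based on the project description, so check the project's own README for the real commands:

```shell
# Assumed setup steps -- verify against the project's README.
git clone <repository-url>
cd python-mcp-server-client
python -m venv .venv && source .venv/bin/activate
pip install "mcp[cli]" httpx           # official MCP Python SDK plus an HTTP client
python server.py                       # stdio transport
python server.py --transport sse       # SSE transport, if the server exposes this flag
```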
Can I use this with any AI model?
Yes. MCP is designed to be model-agnostic, so it works with various AI models and frameworks through any MCP-compatible client.
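The client configurations above all launch the server over stdio. Because this server also supports SSE, clients that accept URL-based entries can instead point at a running SSE endpoint. The host, port, and `/sse` path below are assumptions about a typical local deployment, not values taken from this project:

```json
{
  "mcpServers": {
    "python-mcp-server-client-gobinfan": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```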