🦜🔗 LangChain Model Context Protocol (MCP) Client
Overview
LangChain MCP Client is a command-line tool that connects to Model Context Protocol (MCP) servers and uses any LangChain-compatible language model to interact with the tools those servers expose.
To use the LangChain MCP Client, install it via pip, configure your API keys and MCP servers in a .env file, and run the client from the command line.
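As a sketch, the `.env` file mentioned above might hold provider API keys like the following. The variable names are assumptions based on common LangChain conventions, not confirmed by this project; use whichever keys your chosen LLM provider requires.

```
# .env — hypothetical example; actual variable names depend on your LLM provider
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```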
- Seamless connection to any MCP server.
- Flexibility to use any LangChain-compatible LLM.
- Command-line interface for dynamic interactions.
- Connecting to multiple MCP servers for enhanced model capabilities.
- Utilizing various language models for different tasks.
- Facilitating dynamic conversations through CLI interactions.
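For example, connecting to multiple MCP servers at once could look like the following `llm_mcp_config.json5` fragment. The exact schema is not documented here, so the field names below are assumptions modeled on the standard MCP `mcpServers` layout; the server entries are placeholders.

```json5
{
  // LLM parameters — illustrative values only
  llm: {
    model_provider: "anthropic",
    model: "claude-3-5-haiku-latest",
  },
  // Two MCP servers the client connects to simultaneously,
  // giving the model both filesystem and web-fetch tools
  mcp_servers: {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
    fetch: {
      command: "uvx",
      args: ["mcp-server-fetch"],
    },
  },
}
```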
Add to your AI client
Use these steps to connect the LangChain MCP Client in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "langchain-mcp-client-datalayer": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-langchain-mcp-client-datalayer"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "langchain-mcp-client-datalayer": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-langchain-mcp-client-datalayer"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "langchain-mcp-client-datalayer": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-langchain-mcp-client-datalayer"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "langchain-mcp-client-datalayer": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-langchain-mcp-client-datalayer"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "langchain-mcp-client-datalayer": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-langchain-mcp-client-datalayer"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "langchain-mcp-client-datalayer": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-langchain-mcp-client-datalayer"
      ]
    }
  }
}

FAQ
What is the required Python version?
Python 3.11 or higher is required to run the LangChain MCP Client.
How do I configure the client?
Create a `.env` file containing your API keys, and define your LLM parameters and MCP servers in the `llm_mcp_config.json5` file.
Can I use it with any language model?
Yes, as long as the model is compatible with LangChain.