Overview
mcp-server-dify is a Model Context Protocol Server designed for Dify AI, enabling large language models (LLMs) to interact with Dify AI's chat completion capabilities through a standardized protocol.
To use mcp-server-dify, install it via npm and configure your Dify API credentials in your client's MCP config file (for example, claude_desktop_config.json for Claude Desktop).
- Integration with Dify AI chat completion API
- Restaurant recommendation tool (meshi-doko)
- Support for conversation context
- Streaming response support
- TypeScript implementation
- Enabling chatbots to provide restaurant recommendations based on user queries.
- Facilitating LLMs to maintain conversation context during interactions.
- Streamlining API interactions for developers using Dify AI.
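To make the API interaction concrete, here is a minimal TypeScript sketch of how a server like this might forward a query to Dify's `chat-messages` endpoint. The payload shape follows Dify's public chat completion API; `buildDifyRequest` is a hypothetical helper for illustration, not part of this package.

```typescript
interface DifyRequest {
  url: string;
  headers: Record<string, string>;
  body: {
    query: string;
    inputs: Record<string, unknown>;
    response_mode: "streaming" | "blocking";
    conversation_id: string;
    user: string;
  };
}

// Build the HTTP request the server would send to Dify on each tool call.
function buildDifyRequest(
  apiBase: string,
  apiKey: string,
  query: string,
  conversationId = ""
): DifyRequest {
  return {
    url: `${apiBase}/chat-messages`,
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: {
      query,
      inputs: {},
      // "streaming" returns server-sent events; "blocking" waits for the full reply
      response_mode: "streaming",
      // pass the previous conversation_id back in to preserve context across turns
      conversation_id: conversationId,
      user: "mcp-client",
    },
  };
}

const req = buildDifyRequest(
  "https://api.dify.ai/v1",
  "app-your-key",
  "Recommend a ramen shop near Shibuya"
);
// send with: fetch(req.url, { method: "POST", headers: req.headers,
//                             body: JSON.stringify(req.body) })
```

Reusing the `conversation_id` that Dify returns on the first reply is what lets the LLM maintain conversation context across multiple tool calls.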
Add to your AI client
Use the steps below to connect mcp-server-dify in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "mcp-server-dify-yuru-sha": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-dify-yuru-sha"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "mcp-server-dify-yuru-sha": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-dify-yuru-sha"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "mcp-server-dify-yuru-sha": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-dify-yuru-sha"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "mcp-server-dify-yuru-sha": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-dify-yuru-sha"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "mcp-server-dify-yuru-sha": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-dify-yuru-sha"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "mcp-server-dify-yuru-sha": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-dify-yuru-sha"
      ]
    }
  }
}

FAQ
What is the purpose of mcp-server-dify?
It acts as a bridge that lets LLMs use Dify AI's chat completion capabilities through the standardized Model Context Protocol.
How do I install mcp-server-dify?
You can install it using NPM with the command `npm install @modelcontextprotocol/server-dify`.
Is there a security concern with using this server?
Yes. Keep your API credentials secure, never commit them to version control, and always use HTTPS for the Dify API endpoint.
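One way to keep credentials out of the command line is to pass them through the `env` field that most MCP clients support in their server configuration. This is a sketch only: the environment variable names shown (`DIFY_API_KEY`, `DIFY_API_ENDPOINT`) are assumptions; check which variables, if any, this server actually reads.

```json
{
  "mcpServers": {
    "mcp-server-dify-yuru-sha": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-mcp-server-dify-yuru-sha"],
      "env": {
        "DIFY_API_KEY": "your-api-key-here",
        "DIFY_API_ENDPOINT": "https://api.dify.ai/v1"
      }
    }
  }
}
```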