
n8n MCP Server

Complete MCP server for n8n workflow management in Cursor

Catalog only · STDIO

Overview

The n8n MCP Server is a Model Context Protocol (MCP) server that exposes n8n workflow management to LLMs and AI agents, letting them list, run, and monitor workflows directly from a conversation.

To use the n8n MCP Server, install the package via npm, configure your n8n connection in a .env file, and start the server. You can then interact with the server using HTTP requests to manage workflows.
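The setup steps above can be sketched as follows. The package name is the one used in the client configs on this page; the `.env` variable names (`N8N_API_URL`, `N8N_API_KEY`) are illustrative assumptions, not confirmed by this server's documentation:

```shell
# Connection settings for your n8n instance.
# NOTE: the variable names below are assumptions — check the server's README.
cat > .env <<'EOF'
N8N_API_URL=http://localhost:5678/api/v1
N8N_API_KEY=your-n8n-api-key
EOF

# Then start the server (same package the client configs below reference):
#   npx -y @modelcontextprotocol/server-n8n-mcp-server-complete-dopehunter
cat .env
```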

Features

  • List available workflows from n8n
  • View workflow details
  • Execute workflows
  • Monitor workflow executions
  • Pass parameters to workflows
  • MCP-compatible interface for AI agents

Use cases

  1. Automating data processing workflows in n8n.
  2. Integrating n8n workflows with AI agents for enhanced automation.
  3. Monitoring and managing workflow executions in real-time.
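Under the hood, an MCP client talks JSON-RPC 2.0 to the server over stdio. A minimal sketch of the message an agent would send to invoke a workflow tool follows; the tool name `execute_workflow` and its argument keys are hypothetical, not names confirmed by this server:

```typescript
// Hypothetical MCP "tools/call" request, serialized as newline-delimited
// JSON-RPC 2.0 the way a stdio client writes it to the server's stdin.
// The tool name and argument shape are illustrative assumptions.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "execute_workflow",        // assumed tool name
    arguments: { workflowId: "42" }, // assumed parameter shape
  },
};

const line = JSON.stringify(request); // one request per line over stdio
console.log(line);
```

In practice the MCP SDKs and the clients listed below handle this framing for you; the raw message is shown only to make the protocol concrete.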

Add to your AI client

Use these steps to connect the n8n MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps. The JSON configuration is the same in each client; only the file location differs.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "n8n-mcp-server-complete-dopehunter": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-complete-dopehunter"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "n8n-mcp-server-complete-dopehunter": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-complete-dopehunter"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "n8n-mcp-server-complete-dopehunter": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-complete-dopehunter"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "n8n-mcp-server-complete-dopehunter": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-complete-dopehunter"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "n8n-mcp-server-complete-dopehunter": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-complete-dopehunter"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "n8n-mcp-server-complete-dopehunter": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-complete-dopehunter"
      ]
    }
  }
}

FAQ

What is the purpose of the n8n MCP Server?

It allows for the management and execution of n8n workflows within AI agents using the Model Context Protocol.

How do I troubleshoot connection issues?

Ensure your n8n instance is running and accessible, and verify your API key is correct.

Can I use n8n MCP Server with Docker?

Yes, you can run the n8n MCP Server in a Docker container.