
n8n MCP Server


MCP server that provides tools and resources for interacting with n8n API

Catalog only · STDIO

Overview

The n8n MCP Server is a Model Context Protocol (MCP) server that lets AI assistants manage n8n workflows using natural language.

To use the n8n MCP Server, install it via npm or build it from source, set the environment variables it needs to reach your n8n instance's API, and run the server. Then register it with your MCP-compatible AI client to manage workflows.
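As a concrete sketch of that install-configure-run sequence (the npm package name and environment variable names below are assumptions based on common conventions, not confirmed by this page — check the project's README for the exact names):

```shell
# Install the server globally (package name is an assumption).
npm install -g n8n-mcp-server

# Point the server at your n8n instance's REST API.
# Variable names are illustrative; use the names the project documents.
export N8N_API_URL="https://your-n8n-host/api/v1"
export N8N_API_KEY="<your n8n API key>"

# Start the server; it talks to the AI client over STDIO.
n8n-mcp-server
```

This is a setup fragment, not a definitive invocation; it requires a running n8n instance with API access enabled.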

Key capabilities:

  • List, create, update, and delete workflows
  • Activate and deactivate workflows
  • Execute workflows and monitor their status
  • Access workflow information and execution statistics

Common use cases:

  1. Automating business processes through AI-driven workflows
  2. Managing complex workflows in a user-friendly manner
  3. Integrating AI assistants for enhanced workflow execution
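The workflow operations above correspond to calls against n8n's public REST API, which the MCP server wraps as tools. A minimal sketch of two of those underlying calls, assuming standard n8n API conventions (base path `/api/v1`, `X-N8N-API-KEY` header) and a hypothetical host:

```shell
# List workflows (the call behind the "list workflows" tool)
curl -s -H "X-N8N-API-KEY: $N8N_API_KEY" \
  "https://your-n8n-host/api/v1/workflows"

# Activate a workflow by id (the call behind "activate workflow")
curl -s -X POST -H "X-N8N-API-KEY: $N8N_API_KEY" \
  "https://your-n8n-host/api/v1/workflows/<id>/activate"
```

These requests require a live n8n instance and an API key; the MCP server issues equivalent requests on the assistant's behalf.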

Add to your AI client

Use these steps to connect n8n MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "n8n-mcp-server-leonardsellem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-leonardsellem"
      ]
    }
  }
}
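The server also needs credentials for your n8n instance, which MCP client configs can supply via an `env` block. A sketch of the same Cursor entry with credentials added (the variable names are assumptions; use the names the project documents):

```json
{
  "mcpServers": {
    "n8n-mcp-server-leonardsellem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-leonardsellem"
      ],
      "env": {
        "N8N_API_URL": "https://your-n8n-host/api/v1",
        "N8N_API_KEY": "<your n8n API key>"
      }
    }
  }
}
```

The same `env` block works in the other clients' configs below.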

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "n8n-mcp-server-leonardsellem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-leonardsellem"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "n8n-mcp-server-leonardsellem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-leonardsellem"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "n8n-mcp-server-leonardsellem": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-leonardsellem"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "n8n-mcp-server-leonardsellem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-leonardsellem"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "n8n-mcp-server-leonardsellem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-n8n-mcp-server-leonardsellem"
      ]
    }
  }
}

FAQ

What are the prerequisites for using n8n MCP Server?

You need Node.js 18 or later and an n8n instance with API access enabled.

How do I install n8n MCP Server?

You can install it globally using npm or clone the repository and build it from source.
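Either route looks roughly like this (the package name, repository URL, and build commands are assumptions based on typical Node.js projects; check the project's README for the exact ones):

```shell
# Option 1: global npm install (package name assumed)
npm install -g n8n-mcp-server

# Option 2: clone and build from source (repository path assumed)
git clone https://github.com/leonardsellem/n8n-mcp-server.git
cd n8n-mcp-server
npm install
npm run build
```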

Can I use n8n MCP Server with any AI assistant?

Yes. It works with any client that supports the Model Context Protocol over STDIO, including Claude Desktop, Claude Code, Cursor, VS Code (Copilot), Windsurf, and Cline, as configured above.