
MCP Server for ZenML


An MCP server that connects an MCP client (Cursor, Claude Desktop, etc.) to your ZenML MLOps and LLMOps pipelines.

Catalog only · STDIO

Overview

MCP ZenML is a server integration that lets you chat with your MLOps and LLMOps pipelines through the ZenML API, providing access to your machine learning models, pipelines, and data sources.

To use MCP ZenML, set up a ZenML Cloud server, install the required dependencies, and configure your MCP client to connect to the ZenML server. Detailed instructions are provided in the documentation.

  • Standardized Model Context Protocol (MCP) for AI applications
  • Access to live information about users, stacks, pipelines, and more
  • Ability to trigger new pipeline runs
  • Integration with popular clients such as Claude Desktop and Cursor

Common use cases:

  1. Managing and monitoring machine learning pipelines
  2. Interacting with large language models through a standardized protocol
  3. Facilitating data access from local and remote sources

Add to your AI client

Use these steps to connect MCP Server for ZenML in Cursor, Claude, VS Code, and other MCP-compatible apps. The same JSON appears in the Use with menu above for one-click copy.
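All of the client snippets below register the same server entry; only the config file path (and, for VS Code, the root key) differs. If you script your editor setup, a small sketch like the following can merge the entry into an existing config file rather than pasting it by hand. The `add_server` helper is illustrative, not part of any client's tooling:

```python
import json
from pathlib import Path

# The server entry shared by all the client snippets below.
SERVER_NAME = "mcp-zenml-zenml-io"
SERVER_ENTRY = {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-mcp-zenml-zenml-io"],
}

def add_server(config_path: str, root_key: str = "mcpServers") -> dict:
    """Merge the server entry into an MCP config file, creating the
    file (and the root key) if needed; existing entries are kept."""
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault(root_key, {})[SERVER_NAME] = SERVER_ENTRY
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2))
    return config

# Example: add the entry to a project-local Cursor config.
# add_server(".cursor/mcp.json")
```

Because the merge uses `setdefault`, any servers you already have configured are preserved.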

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-zenml-zenml-io": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-zenml-zenml-io"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-zenml-zenml-io": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-zenml-zenml-io"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-zenml-zenml-io": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-zenml-zenml-io"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-zenml-zenml-io": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-zenml-zenml-io"
      ]
    }
  }
}
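Note that VS Code's config shape differs from the other clients in two small ways: the root key is `servers` rather than `mcpServers`, and each entry carries an explicit `"type": "stdio"` field. A minimal sketch of that translation, assuming you already have an entry in the common shape:

```python
import json

def to_vscode_shape(config: dict) -> dict:
    """Convert the common {"mcpServers": {...}} layout used by Cursor,
    Claude, Windsurf, and Cline into VS Code's {"servers": {...}}
    layout, marking each entry as a stdio server."""
    servers = {
        name: {"type": "stdio", **entry}
        for name, entry in config.get("mcpServers", {}).items()
    }
    return {"servers": servers}

common = {
    "mcpServers": {
        "mcp-zenml-zenml-io": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-mcp-zenml-zenml-io"],
        }
    }
}
print(json.dumps(to_vscode_shape(common), indent=2))
```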

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-zenml-zenml-io": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-zenml-zenml-io"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-zenml-zenml-io": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-zenml-zenml-io"
      ]
    }
  }
}

FAQ

What is the Model Context Protocol (MCP)?

MCP is an open protocol that standardizes how applications provide context to AI models, enabling seamless integration with various data sources.
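Concretely, an MCP client and server exchange JSON-RPC 2.0 messages; over the STDIO transport used here, each message is one JSON object per line on the server process's stdin/stdout. As a sketch, the handshake starts with an `initialize` request shaped like the following (field names follow the MCP specification; the `protocolVersion` value and client name shown are illustrative):

```python
import json

# Sketch of the first message an MCP client sends over stdio: a JSON-RPC 2.0
# "initialize" request. The protocolVersion date and clientInfo values are
# illustrative placeholders, not tied to any particular client.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # illustrative version date
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# One JSON object per line on the wire.
wire_message = json.dumps(initialize_request)
print(wire_message)
```

After the handshake, the client can call methods such as `tools/list` to discover what the server exposes.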

Is MCP ZenML free to use?

Yes! You can sign up for a free trial of ZenML Cloud to get started.

What are the prerequisites for using MCP ZenML?

You need access to a ZenML Cloud server and the `uv` tool installed locally.