Insight MCP Server

MCP Server for Development Utilities

Catalog only · STDIO

Overview

Insight MCP Server is a Model Context Protocol (MCP) server that automates and assists software development tasks by integrating with large language models (LLMs).

To use Insight MCP Server, set up your environment variables in a .env file, install the necessary dependencies, and run the server using Python.
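
If the server reads provider credentials from a .env file, the file will look something like the sketch below. The variable names are assumptions inferred from the supported providers, not documented settings; check the project README for the exact keys.

.env

# Hypothetical variable names; verify against the project README.
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...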

Key features:

  • Flexible support for multiple LLM providers, including OpenAI GPT-4 and Anthropic Claude.
  • Workflow automation capabilities through MCP tools.
  • Customizable environment-based configuration.

Common use cases:

  1. Automating software development workflows.
  2. Assisting developers with code generation and debugging.
  3. Integrating LLMs into existing development environments for enhanced productivity.

Add to your AI client

Use these steps to connect Insight MCP Server in Cursor, Claude Desktop, Claude Code, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "insight-keplerops": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ]
    }
  }
}
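
If the server needs provider credentials at runtime, Cursor's mcp.json also accepts an env map alongside command and args. A sketch, reusing the hypothetical OPENAI_API_KEY variable from above:

.cursor/mcp.json (with environment variables)

{
  "mcpServers": {
    "insight-keplerops": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}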

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "insight-keplerops": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "insight-keplerops": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ]
    }
  }
}
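
Alternatively, Claude Code can register the server from the command line; `claude mcp add` with project scope writes the same .mcp.json entry for you (flag syntax may vary across CLI versions):

Terminal

claude mcp add insight-keplerops --scope project -- npx -y @modelcontextprotocol/server-insight-keplerops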

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "insight-keplerops": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ]
    }
  }
}
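
Because hard-coding secrets in .vscode/mcp.json is easy to leak, VS Code also supports an inputs array that prompts for a value once and stores it securely. A sketch, again assuming the hypothetical OPENAI_API_KEY variable:

.vscode/mcp.json (prompting for a secret)

{
  "inputs": [
    {
      "type": "promptString",
      "id": "openai-api-key",
      "description": "OpenAI API key",
      "password": true
    }
  ],
  "servers": {
    "insight-keplerops": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ],
      "env": {
        "OPENAI_API_KEY": "${input:openai-api-key}"
      }
    }
  }
}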

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "insight-keplerops": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "insight-keplerops": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ]
    }
  }
}
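
Cline's settings file also accepts per-server flags such as disabled and autoApprove in the same entry; a sketch (flag availability depends on your Cline version):

{
  "mcpServers": {
    "insight-keplerops": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-insight-keplerops"
      ],
      "disabled": false,
      "autoApprove": []
    }
  }
}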

FAQ

What LLM providers are supported?

The server supports OpenAI and Anthropic as LLM providers.

How do I install the Insight MCP Server?

You can install it by setting up your environment variables and running `pip install .` in your terminal.
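
A minimal sketch of that flow, assuming a cloned checkout of the repository (the run command is a guess; take the real one from the project README):

Terminal

# From the repository root:
cp .env.example .env   # hypothetical template; add your provider keys
pip install .          # install the server and its dependencies
python server.py       # hypothetical entry point; see the README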

Is there a license for this project?

Yes, the Insight MCP Server is licensed under the MIT License.