
cognee-mcp-server

Source: Mirror of topoteretes/cognee-mcp-server (https://github.com/MCP-Mirror/topoteretes_cognee-mcp-server)

Catalog only · STDIO

Overview

The cognee-mcp-server is an MCP server designed for Cognee, an AI memory engine that builds knowledge graphs from input text and performs searches within them.

To use the cognee-mcp-server, install the necessary dependencies, configure your environment, and run the server with the appropriate commands to build and search knowledge graphs.

  • Builds knowledge graphs from input text.
  • Performs efficient searches within the constructed knowledge graphs.
  • Configurable for different environments and use cases.

Typical use cases:

  1. Creating knowledge graphs for research data.
  2. Enhancing AI applications with memory capabilities.
  3. Performing complex queries on structured data.
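To make the build-then-search loop concrete, here is a minimal, dependency-free sketch of the idea. This is an illustration only, not cognee's implementation; every name below is invented:

```python
# Toy triple store: "build" a graph by recording facts as
# (subject, predicate, object) triples, then "search" by pattern matching.
# NOT cognee's implementation -- just intuition for the workflow.

class TinyGraph:
    def __init__(self):
        self.triples = []  # list of (subject, predicate, object) tuples

    def add(self, subject, predicate, obj):
        """Build step: store one fact as a triple."""
        self.triples.append((subject, predicate, obj))

    def search(self, subject=None, predicate=None, obj=None):
        """Search step: return every triple matching the non-None fields."""
        return [
            t for t in self.triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)
        ]

graph = TinyGraph()
graph.add("cognee", "is_a", "AI memory engine")
graph.add("cognee", "builds", "knowledge graphs")
print(graph.search(subject="cognee", predicate="builds"))
# → [('cognee', 'builds', 'knowledge graphs')]
```

The real server exposes this workflow over MCP, so the graph construction and queries are driven by your AI client rather than by direct function calls.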

Add to your AI client

Use these steps to connect cognee-mcp-server in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "topoteretes-cognee-mcp-server-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-topoteretes-cognee-mcp-server-mcp-mirror"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "topoteretes-cognee-mcp-server-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-topoteretes-cognee-mcp-server-mcp-mirror"
      ]
    }
  }
}
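If your Claude Desktop config already lists other servers, merge the new entry rather than overwriting the file. A small stdlib-only sketch (the path and server name come from the example above; adjust for your setup):

```python
import json
from pathlib import Path

def merge_mcp_server(config_path: Path, name: str, entry: dict) -> dict:
    """Add one server entry to an MCP config file without clobbering existing servers."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})[name] = entry
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Example call using the macOS path and package name shown above:
# merge_mcp_server(
#     Path.home() / "Library/Application Support/Claude/claude_desktop_config.json",
#     "topoteretes-cognee-mcp-server-mcp-mirror",
#     {"command": "npx",
#      "args": ["-y", "@modelcontextprotocol/server-topoteretes-cognee-mcp-server-mcp-mirror"]},
# )
```

Restart Claude Desktop after writing the file so it picks up the new server.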

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "topoteretes-cognee-mcp-server-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-topoteretes-cognee-mcp-server-mcp-mirror"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "topoteretes-cognee-mcp-server-mcp-mirror": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-topoteretes-cognee-mcp-server-mcp-mirror"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "topoteretes-cognee-mcp-server-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-topoteretes-cognee-mcp-server-mcp-mirror"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "topoteretes-cognee-mcp-server-mcp-mirror": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-topoteretes-cognee-mcp-server-mcp-mirror"
      ]
    }
  }
}

FAQ

What is the purpose of the cognee-mcp-server?

The cognee-mcp-server is designed to facilitate the construction and querying of knowledge graphs for AI applications.

How do I configure the server?

Configuration involves setting up environment variables and specifying command-line arguments in the configuration file.
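For example, an LLM provider key is typically supplied through an environment variable before starting the server. The variable name below is illustrative; check cognee's documentation for the exact names your version expects:

```shell
# Illustrative only -- consult cognee's docs for the exact variable names.
export LLM_API_KEY="sk-..."   # key for the LLM provider the server should use
```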

Can I use custom graph models?

Yes, the server allows for the use of custom pydantic graph models for advanced use cases.
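As a dependency-free illustration of the shape such a model can take, here is a toy schema using dataclasses. In cognee itself these would be pydantic models, and every class and field name here is invented:

```python
from dataclasses import dataclass, field

# Invented names: a toy schema describing nodes of a domain graph.
# In cognee, custom graph models are defined as pydantic models instead.

@dataclass
class Person:
    name: str

@dataclass
class Organization:
    name: str
    members: list[Person] = field(default_factory=list)

org = Organization(name="Acme", members=[Person(name="Ada")])
print(org.members[0].name)  # → Ada
```

Typed models like this let the engine extract structured entities and relationships from text instead of untyped nodes.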