cognee-mcp-server
Overview
The cognee-mcp-server is an MCP server for cognee, an AI memory engine that builds a knowledge graph from user-supplied text and supports searching within it.
To use the cognee-mcp-server, configure it for your environment, including the necessary parameters in claude_desktop_config.json, and run it with the commands that match your installation path.
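As a sketch of what such a configuration might look like, here is an illustrative claude_desktop_config.json entry. The command, paths, and environment variable names below are assumptions for illustration, not values taken from the project's documentation; substitute the actual command and installation path for your setup.

```json
{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": ["--directory", "/path/to/cognee-mcp", "run", "cognee"],
      "env": {
        "LLM_API_KEY": "your-api-key-here"
      }
    }
  }
}
```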
- Constructs knowledge graphs based on textual input.
- Allows for searching within the generated knowledge graph.
- Supports custom graph model implementations through flexible configuration options.
- Enhancing AI systems with memory by building contextual knowledge graphs.
- Performing efficient searches across complex datasets based on user queries.
- Supporting applications needing robust data retrieval and organization capabilities.
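To make the ingest-then-search flow above concrete, here is a toy, stdlib-only sketch of what a memory engine does conceptually: extract simple facts from text into a graph, then query them. This is not cognee's actual API; every function and name here is invented for illustration.

```python
# Toy illustration of an "AI memory" pipeline: ingest text, build a tiny
# knowledge graph of (relation, object) facts keyed by subject, then
# search it. This only sketches the concept, not cognee's real API.
import re
from collections import defaultdict

def build_graph(text: str) -> dict:
    """Extract naive 'X is a/an Y' facts into an adjacency map."""
    graph = defaultdict(list)
    for subj, obj in re.findall(r"(\w+) is an? (\w+)", text):
        graph[subj.lower()].append(("is_a", obj.lower()))
    return graph

def search(graph: dict, entity: str) -> list:
    """Return all facts stored about an entity."""
    return graph.get(entity.lower(), [])

memory = build_graph("Cognee is an engine. A graph is a structure.")
print(search(memory, "cognee"))  # [('is_a', 'engine')]
```

A real engine replaces the regex with LLM-based entity and relation extraction, and the dictionary with a graph database, but the add/search contract is the same.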
Add to your AI client
Use these steps to connect cognee-mcp-server in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "cognee-mcp-server-topoteretes": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-cognee-mcp-server-topoteretes"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "cognee-mcp-server-topoteretes": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-cognee-mcp-server-topoteretes"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "cognee-mcp-server-topoteretes": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-cognee-mcp-server-topoteretes"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "cognee-mcp-server-topoteretes": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-cognee-mcp-server-topoteretes"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "cognee-mcp-server-topoteretes": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-cognee-mcp-server-topoteretes"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "cognee-mcp-server-topoteretes": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-cognee-mcp-server-topoteretes"
      ]
    }
  }
}
FAQ
What is required to run cognee-mcp-server?
You will need to configure the server with your API key and local paths for your environment; it works with a range of database providers.
Can I use custom models with cognee-mcp-server?
Yes, you can use your own Pydantic graph model implementations by specifying the filenames and class names in the configuration.
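To illustrate the general shape of a custom graph model, here is a stdlib-only sketch. cognee expects Pydantic models referenced by filename and class name in its configuration; this sketch uses dataclasses and invented class and field names purely to show the idea, and does not reflect cognee's real base classes.

```python
# Illustrative shape of a custom graph model: nodes, edges, and a helper
# for adding a fact. All names here are hypothetical, not cognee's API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    id: str
    label: str

@dataclass
class Edge:
    source: str
    target: str
    relation: str

@dataclass
class CustomGraphModel:
    nodes: List[Node] = field(default_factory=list)
    edges: List[Edge] = field(default_factory=list)

    def add_fact(self, subj: str, relation: str, obj: str) -> None:
        """Add both endpoint nodes and the edge connecting them."""
        self.nodes.append(Node(id=subj, label=subj))
        self.nodes.append(Node(id=obj, label=obj))
        self.edges.append(Edge(source=subj, target=obj, relation=relation))

model = CustomGraphModel()
model.add_fact("cognee", "is_a", "memory_engine")
print(len(model.nodes), len(model.edges))  # 2 1
```

In a real configuration you would define an equivalent Pydantic model in its own file and point cognee at that file and class name, as the FAQ answer above describes.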
Is cognee-mcp-server compatible with existing AI frameworks?
Yes, it is designed to integrate smoothly with systems like Claude Desktop and can be adapted for other AI frameworks.