
MemoryMesh


A knowledge graph server that uses the Model Context Protocol (MCP) to provide structured memory persistence for AI models. v0.2.8

Catalog only · STDIO

Overview

MemoryMesh is a knowledge graph server designed to provide structured memory persistence for AI models, especially effective for text-based RPGs and interactive storytelling.

To use MemoryMesh, clone the repository from GitHub, install the dependencies, and run the server. Users can define schemas to manage structured information, which the AI uses to read and update the knowledge graph dynamically.
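Schemas are how you tell MemoryMesh what kinds of nodes the AI may create. The sketch below is illustrative only: the file name, field names, and keys shown here are assumptions about a typical MemoryMesh schema, not the exact format — consult the sample schemas bundled in the repository for the authoritative structure.

```json
{
  "name": "add_npc",
  "description": "Adds a non-player character to the knowledge graph",
  "properties": {
    "name": {
      "type": "string",
      "description": "The NPC's name",
      "required": true
    },
    "currentLocation": {
      "type": "string",
      "description": "Where the NPC currently is",
      "required": false
    }
  }
}
```

Once a schema like this is loaded, the server can expose matching tools (e.g. add/update/delete for that node type) that the AI calls to keep the graph consistent.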

Key features:

  • Dynamic schema-based tools for data management.
  • Intuitive schema design to guide AI interactions.
  • Rich metadata support for improved AI context.
  • Event support to track operations on the knowledge graph.
  • A Memory Viewer tool for visualizing the graph.

Example use cases:

  1. Managing character and world data in text-based RPGs.
  2. Simulating social networks with interactive elements.
  3. Organizing structured data for any AI application needing consistent context.

Add to your AI client

Use these steps to connect MemoryMesh in Cursor, Claude Desktop, VS Code, and other MCP-compatible apps. The JSON server entry is the same in each case; only the config file location differs.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "memorymesh-chemiguel23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memorymesh-chemiguel23"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "memorymesh-chemiguel23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memorymesh-chemiguel23"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "memorymesh-chemiguel23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memorymesh-chemiguel23"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "memorymesh-chemiguel23": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memorymesh-chemiguel23"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "memorymesh-chemiguel23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memorymesh-chemiguel23"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "memorymesh-chemiguel23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memorymesh-chemiguel23"
      ]
    }
  }
}

FAQ

What is the Model Context Protocol (MCP)?

MCP is an open protocol that standardizes how AI applications connect to external tools and data sources. MemoryMesh implements it to expose a persistent knowledge graph that models can read from and write to during a session.

How do I install MemoryMesh?

Follow the installation steps in the documentation: clone the repository, install the dependencies, build the server, and point your MCP client at it.
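As a rough sketch, a manual install of a Node.js MCP server like this typically looks as follows. The repository URL and build commands below are assumptions based on common Node.js project conventions — check the project's README for the exact steps.

```
# Clone the repository (URL assumed from the project name)
git clone https://github.com/CheMiguel23/MemoryMesh.git
cd MemoryMesh

# Install dependencies and build
npm install
npm run build

# Point your MCP client's "command" at the built entry point,
# e.g. node /path/to/MemoryMesh/dist/index.js
```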

Can I customize schema definitions?

Yes, users can create and modify schemas to suit their specific data needs.