
LLM Chat Replay


A chat replay visualizer for sessions saved from your LLM. Create transcripts using your LLM plus an MCP filesystem server, then replay them here to share with others.

Catalog only · STDIO

Overview

LLM Chat Replay is a React application designed to visualize chat transcripts from LLM sessions, providing a replay feature with typing animations and playback controls.

To use LLM Chat Replay, set up an MCP filesystem server, save your chat transcript as a markdown file, and upload it to the application for playback.
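For the first step, a typical MCP filesystem server entry looks like the sketch below. The server name and the allowed-directory path are examples; point the path at wherever you want your LLM to save transcripts.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/transcripts"
      ]
    }
  }
}
```

With this in place, you can ask your LLM to write the current conversation to a markdown file in that directory, then drag the file into LLM Chat Replay.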

Features

  • Drag-and-drop markdown file upload
  • Playback controls (play/pause)
  • Speed control (0.5x to 4x)
  • Progress bar scrubbing
  • Auto-scrolling chat window
  • Distinct bubbles for Human and Assistant messages
  • Typing animation for Assistant responses

Use cases

  1. Reviewing past conversations with LLMs for better understanding.
  2. Analyzing chat interactions for research or development purposes.
  3. Creating educational content based on chat transcripts.

Add to your AI client

Use these steps to connect LLM Chat Replay in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "llm-chat-replay-jonmadison": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-chat-replay-jonmadison"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "llm-chat-replay-jonmadison": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-chat-replay-jonmadison"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "llm-chat-replay-jonmadison": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-chat-replay-jonmadison"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "llm-chat-replay-jonmadison": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-chat-replay-jonmadison"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "llm-chat-replay-jonmadison": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-chat-replay-jonmadison"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "llm-chat-replay-jonmadison": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-chat-replay-jonmadison"
      ]
    }
  }
}

FAQ

What formats does LLM Chat Replay support for transcripts?

LLM Chat Replay works with markdown files formatted with "**Human**:" and "**Assistant**:" markers.
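The snippet below is an illustrative sketch (not the application's actual code) of how a transcript in this format can be split into role/message pairs; the sample transcript text and the `parse_transcript` helper are made up for the example.

```python
import re

# Hypothetical sample transcript using the "**Human**:" / "**Assistant**:" markers.
SAMPLE = """**Human**: What does this app do?

**Assistant**: It replays saved LLM chat transcripts with a typing animation.
"""

def parse_transcript(text):
    """Split a markdown transcript into (role, message) pairs."""
    # re.split with a capture group keeps the matched role names in the result:
    # ["", "Human", " message...", "Assistant", " message..."]
    parts = re.split(r"\*\*(Human|Assistant)\*\*:", text)
    it = iter(parts[1:])
    return [(role, msg.strip()) for role, msg in zip(it, it)]

for role, msg in parse_transcript(SAMPLE):
    print(f"{role}: {msg}")
```

Any markdown file whose messages are introduced by these two bold markers should replay correctly.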

Do I need to install anything to use LLM Chat Replay?

Yes, you need to set up an MCP filesystem server and install the application dependencies.

Is LLM Chat Replay free to use?

Yes! LLM Chat Replay is open-source and free to use.