TxtAI Assistant MCP
Model Context Protocol (MCP) server implementation for semantic vector search and memory management using TxtAI. This server provides a robust API for storing, retrieving, and managing text-based memories with semantic vector search capabilities, and it integrates with AI assistants such as Claude and Cline.
Overview
TxtAI Assistant MCP is a Model Context Protocol (MCP) server implementation designed for semantic search and memory management using the txtai framework. It provides a robust API for storing, retrieving, and managing text-based memories with advanced semantic search capabilities.
To use TxtAI Assistant MCP, clone the repository, run the start script to set up the server, and configure it with environment variables. You can then integrate it with AI assistants like Claude and Cline to enhance their memory and search functionalities.
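The setup steps above can be sketched as follows. The repository URL comes from the project page, but the script path and the environment variable names (`TXTAI_HOST`, `TXTAI_PORT`, `TXTAI_CORS_ORIGINS`) are assumptions for illustration, not the documented configuration:

```shell
# Clone and start the server (script path is an assumption):
# git clone https://github.com/rmtech1/txtai-assistant-mcp
# cd txtai-assistant-mcp
# ./scripts/start.sh

# Example environment configuration (variable names are hypothetical):
export TXTAI_HOST=0.0.0.0       # interface the server binds to
export TXTAI_PORT=8000          # HTTP port the API listens on
export TXTAI_CORS_ORIGINS="*"   # origins allowed by the CORS settings
```

Check the repository's README or `.env.example` (if present) for the actual variable names before relying on these.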
Features
- Semantic search across stored memories
- Persistent storage with a file-based backend
- Tag-based memory organization and retrieval
- Memory statistics and health monitoring
- Automatic data persistence and comprehensive logging
- Configurable CORS settings
- Integration with Claude and Cline AI
Use Cases
- Enhancing AI assistants with semantic memory capabilities.
- Storing and retrieving important information in a conversational context.
- Managing and organizing text-based memories for various applications.
Add to your AI client
Use these steps to connect TxtAI Assistant MCP in Cursor, Claude, VS Code, and other MCP-compatible apps.
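For any client whose MCP config is a JSON file, the server entry can also be merged programmatically instead of edited by hand. A minimal sketch in Python, assuming only that the target file holds an `mcpServers` object like the snippets in this section (the helper name `add_server` is ours, not part of the project):

```python
import json
from pathlib import Path

# The server entry as it appears in the client configs below.
SERVER_ENTRY = {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-txtai-assistant-mcp-rmtech1"],
}

def add_server(config_path: str, name: str = "txtai-assistant-mcp-rmtech1") -> dict:
    """Merge the server entry into an MCP config file, creating it if absent."""
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    # setdefault preserves any servers already configured.
    config.setdefault("mcpServers", {})[name] = SERVER_ENTRY
    path.write_text(json.dumps(config, indent=2))
    return config

# Example: add_server(".cursor/mcp.json")
```

Note that VS Code's `.vscode/mcp.json` uses a top-level `servers` key rather than `mcpServers`, so this sketch would need a small adjustment there.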
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "txtai-assistant-mcp-rmtech1": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-txtai-assistant-mcp-rmtech1"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "txtai-assistant-mcp-rmtech1": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-txtai-assistant-mcp-rmtech1"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "txtai-assistant-mcp-rmtech1": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-txtai-assistant-mcp-rmtech1"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "txtai-assistant-mcp-rmtech1": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-txtai-assistant-mcp-rmtech1"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "txtai-assistant-mcp-rmtech1": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-txtai-assistant-mcp-rmtech1"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "txtai-assistant-mcp-rmtech1": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-txtai-assistant-mcp-rmtech1"
      ]
    }
  }
}

FAQ
Can TxtAI Assistant MCP be used with any AI assistant?
Yes. It works with MCP-compatible assistants; Claude and Cline are the integrations documented here.
Is there a specific Python version required?
Yes, Python 3.8 or higher is required to run the server.
How is data stored in TxtAI Assistant MCP?
Data is stored in JSON files within the data directory, allowing for easy management and retrieval.
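As an illustration of that file-based storage, memories stored as JSON could be loaded and filtered with a few lines of Python. The field names (`text`, `tags`) and one-record-per-file layout are assumptions for the sketch; the project's actual schema may differ:

```python
import json
from pathlib import Path

def load_memories(data_dir: str) -> list[dict]:
    """Load every JSON memory file from the data directory (layout assumed)."""
    records = []
    for f in sorted(Path(data_dir).glob("*.json")):
        records.append(json.loads(f.read_text()))
    return records

def filter_by_tag(records: list[dict], tag: str) -> list[dict]:
    """Tag-based retrieval over loaded records (field names are assumptions)."""
    return [r for r in records if tag in r.get("tags", [])]
```

Keeping each memory as plain JSON makes the store inspectable and easy to back up with ordinary file tools, which matches the "easy management" claim above.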