LocalMind
LocalMind is a local LLM chat app fully compatible with the Model Context Protocol (MCP). It uses Azure OpenAI as its LLM backend, and it can connect to any MCP server.
Overview
LocalMind is a local LLM Chat App that is fully compatible with the Model Context Protocol (MCP). It utilizes Azure OpenAI as its LLM backend, allowing users to connect to various MCP Servers.
To use LocalMind, set up your environment by creating a .env file and a config.yaml file in the backend folder. You can then run the frontend in a browser, or run the Tauri app in development mode alongside the Python backend.
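As a sketch, the two files might look like this (the variable and key names below are assumptions for illustration — check the LocalMind repository for the actual schema):

```shell
# backend/.env — variable names are illustrative, not confirmed by the project
AZURE_OPENAI_API_KEY=<your-azure-openai-key>
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
```

```yaml
# backend/config.yaml — a hypothetical example; the real keys may differ
server:
  host: 127.0.0.1
  port: 8000
mcp:
  servers: []
```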
- Compatibility with the Model Context Protocol (MCP)
- Integration with Azure OpenAI for LLM capabilities
- Local development setup for both frontend and backend
- Building and testing local LLM applications.
- Connecting to various MCP Servers for enhanced functionality.
- Developing chat applications using Azure OpenAI.
Add to your AI client
Use these steps to connect LocalMind in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "localmind-timosur": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-localmind-timosur"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "localmind-timosur": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-localmind-timosur"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "localmind-timosur": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-localmind-timosur"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "localmind-timosur": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-localmind-timosur"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "localmind-timosur": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-localmind-timosur"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "localmind-timosur": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-localmind-timosur"
      ]
    }
  }
}

FAQ
What is the Model Context Protocol (MCP)?
MCP is an open protocol that standardizes how LLM applications communicate with external servers that expose tools, data, and other context.
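Concretely, MCP clients and servers exchange JSON-RPC 2.0 messages. A session begins with an initialize request along these lines (the field values here are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "LocalMind", "version": "0.1.0" }
  }
}
```

The server responds with its own capabilities, after which the client can list and invoke the tools the server exposes.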
Is LocalMind free to use?
The project is open-source, and you can use it freely as per the licensing terms.
What backend does LocalMind use?
LocalMind uses Azure OpenAI as its LLM backend.