Overview
MCP LLM is a server that provides access to various Large Language Models (LLMs) using the LlamaIndexTS library, enabling users to interact with LLMs for code generation, documentation, and question answering.
To use MCP LLM, add the mcp-llm server to your MCP configuration, install the necessary dependencies, and start the server with npm. You can then send requests to the server to use its tools.
- Generate code based on descriptions.
- Write generated code directly to a specified file.
- Generate documentation for existing code.
- Ask questions to the LLM for information and clarification.
- Automating code generation for repetitive tasks.
- Creating documentation for codebases to improve maintainability.
- Answering programming-related questions for developers.
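As a rough sketch of what a request looks like, an MCP client calls a server tool with a JSON-RPC `tools/call` message. The tool name `generate_code` and its argument names below are assumptions for illustration; check the server's tool listing for the actual schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_code",
    "arguments": {
      "task": "Write a function that reverses a string",
      "language": "typescript"
    }
  }
}
```

In practice your MCP client (Cursor, Claude Desktop, etc.) constructs these messages for you; you only supply the prompt.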
Add to your AI client
Use these steps to connect MCP LLM in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "mcp-llm-sammcj": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-sammcj"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "mcp-llm-sammcj": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-sammcj"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "mcp-llm-sammcj": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-sammcj"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "mcp-llm-sammcj": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-sammcj"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "mcp-llm-sammcj": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-sammcj"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "mcp-llm-sammcj": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-llm-sammcj"
      ]
    }
  }
}

FAQ
What programming languages does MCP LLM support?
MCP LLM can generate code in a wide range of programming languages, including JavaScript and Python.
Is MCP LLM free to use?
Yes! MCP LLM is open-source and free to use.
How do I install MCP LLM?
You can install MCP LLM by cloning the repository and running npm install followed by npm run build.
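The install steps above can be sketched as shell commands. The repository URL is an assumption based on the server's author handle, and the start script is not confirmed by this page; check the repository's README and package.json for the actual values.

```shell
# Clone the server (repository URL assumed, not confirmed by this page)
git clone https://github.com/sammcj/mcp-llm.git
cd mcp-llm

# Install dependencies and build
npm install
npm run build

# Start the server (script name assumed; check package.json for the real entry point)
npm start
```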