LLMling
Easy MCP (Model Context Protocol) servers and AI agents, defined in YAML.
Overview
LLMling is a framework for declarative LLM (Large Language Model) application development. Applications are defined in YAML and cover resource management, prompt templates, and tool execution.
To use LLMling, you create a YAML configuration file that defines your LLM's environment, run it as a custom MCP (Model Context Protocol) server, and manage it through the command-line interface (CLI).
- YAML-based configuration for easy setup of LLM applications.
- Static declaration of resources, prompts, and tools.
- Integration with MCP for standardized LLM interaction.
- CLI commands for managing resources, executing tools, and rendering prompts.
- Developing LLM applications with custom resource management.
- Automating tasks using LLMs through defined prompts and tools.
- Creating interactive agents that utilize LLM capabilities for various applications.
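The declarative style described above can be sketched as a single configuration file. The schema below is an assumption for illustration only: the top-level `resources`, `prompts`, and `tools` keys mirror the three concepts named in the overview, but the specific field names (`type`, `template`, `import_path`, etc.) are hypothetical and not taken from LLMling's documentation.

```yaml
# Hypothetical LLMling-style configuration; field names are illustrative.
resources:
  changelog:                 # a static file exposed to the LLM as a resource
    type: path
    path: ./CHANGELOG.md

prompts:
  summarize:                 # a reusable prompt template with a placeholder
    description: Summarize a document
    template: |
      Summarize the following text in three bullet points:
      {text}

tools:
  word_count:                # a tool resolved to a Python callable
    import_path: mytools.word_count
    description: Count the words in a string
```

A file like this would be handed to the server at startup, so an MCP client sees the declared resources, prompts, and tools without any Python glue code.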
Add to your AI client
Use these steps to connect LLMling in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "llmling-phil65": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llmling-phil65"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "llmling-phil65": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llmling-phil65"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "llmling-phil65": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llmling-phil65"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "llmling-phil65": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llmling-phil65"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "llmling-phil65": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llmling-phil65"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "llmling-phil65": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llmling-phil65"
      ]
    }
  }
}
FAQ
Can LLMling be used for any LLM?
Yes! LLMling is designed to work with any LLM client that can interact through MCP.
Is LLMling open-source?
Yes! LLMling is available on GitHub and is open for contributions.
What programming language is required?
LLMling is written in Python and requires Python 3.12 or higher.