Unsloth MCP Server
An MCP server for Unsloth - a library that makes LLM fine-tuning 2x faster with 80% less memory
Overview
Unsloth MCP Server exposes the Unsloth library through the Model Context Protocol (MCP). Unsloth speeds up fine-tuning of large language models (LLMs) by roughly 2x while cutting memory usage by up to 80%.
To get started, install the Unsloth library, build the server, and add it to your MCP settings. The server then provides tools for loading models, fine-tuning, and generating text.
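The setup steps above can be sketched roughly as follows. This is a hypothetical sequence, not the project's documented install script: the repository URL comes from the project metadata, and the exact build commands depend on the repo's own scripts.

```shell
# Hypothetical setup sketch; check the repository README for exact steps.
pip install unsloth                                           # the Unsloth Python library
git clone https://github.com/OtotaO/unsloth-mcp-server.git    # the MCP server
cd unsloth-mcp-server
npm install && npm run build                                  # compile the server
```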
Features
- Optimized fine-tuning for various models, including Llama and Mistral.
- 4-bit quantization for efficient training.
- Extended context length support.
- Simple API for model operations.
- Export capabilities to multiple formats.
Use cases
- Fine-tuning large language models on consumer GPUs.
- Generating text with fine-tuned models.
- Supporting custom datasets for training.
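The "80% less memory" figure is driven largely by 4-bit weight quantization. A back-of-envelope estimate makes the scale concrete; the 7B parameter count below is an illustrative assumption, and real savings also depend on optimizer state and activations:

```python
# Back-of-envelope weight-memory estimate (illustrative numbers only).
PARAMS = 7_000_000_000  # e.g. a 7B-parameter model

fp16_gb = PARAMS * 2 / 1e9    # 16-bit weights: 2 bytes per parameter
int4_gb = PARAMS * 0.5 / 1e9  # 4-bit weights: 0.5 bytes per parameter

print(f"fp16 weights: {fp16_gb:.1f} GB")          # 14.0 GB
print(f"4-bit weights: {int4_gb:.1f} GB")         # 3.5 GB
print(f"reduction: {1 - int4_gb / fp16_gb:.0%}")  # 75%
```

Weight quantization alone accounts for about a 75% reduction; LoRA-style adapters, which avoid storing full optimizer state for every parameter, close most of the remaining gap.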
Add to your AI client
Use these steps to connect the Unsloth MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "unsloth-mcp-server-ototao": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-unsloth-mcp-server-ototao"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "unsloth-mcp-server-ototao": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-unsloth-mcp-server-ototao"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "unsloth-mcp-server-ototao": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-unsloth-mcp-server-ototao"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "unsloth-mcp-server-ototao": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-unsloth-mcp-server-ototao"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "unsloth-mcp-server-ototao": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-unsloth-mcp-server-ototao"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "unsloth-mcp-server-ototao": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-unsloth-mcp-server-ototao"
      ]
    }
  }
}

FAQ
Can I use Unsloth with any model?
Yes, Unsloth supports a range of model families, including Llama and Mistral.
Is there a specific hardware requirement?
Yes, an NVIDIA GPU with CUDA support is recommended for optimal performance.
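A quick way to check whether a CUDA-capable GPU is visible is `nvidia-smi`, which ships with the NVIDIA driver. The fallback branch below is only there so the snippet degrades gracefully on machines without one:

```shell
# Print GPU name and total memory if an NVIDIA driver is present.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,memory.total --format=csv
else
  echo "No NVIDIA driver found - CPU-only or non-NVIDIA system."
fi
```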
How do I troubleshoot common issues?
Common issues include CUDA out-of-memory errors, which can often be resolved by reducing the batch size or enabling 4-bit quantization.
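One standard way to reduce the batch size without changing training dynamics is to trade per-device batch size for gradient accumulation, keeping the effective batch size constant. The sketch below is illustrative; the parameter names follow the Hugging Face `TrainingArguments` convention but are used here only as labels:

```python
# Keep the effective batch size constant while lowering peak GPU memory.
before = {"per_device_train_batch_size": 16, "gradient_accumulation_steps": 1}
after  = {"per_device_train_batch_size": 2,  "gradient_accumulation_steps": 8}

def effective_batch(cfg):
    # Effective batch = micro-batch size x accumulation steps (single GPU).
    return cfg["per_device_train_batch_size"] * cfg["gradient_accumulation_steps"]

print(effective_batch(before))  # 16
print(effective_batch(after))   # 16 -- same optimizer step, ~1/8 the activation memory
```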