
MCP Assistant Server


An MCP server that provides task analysis and tool recommendation capabilities

Catalog only · STDIO

Overview

MCP Assistant Server is an intelligent assistant server based on the Model Context Protocol, providing task analysis, tool recommendation, and context management capabilities.

To use MCP Assistant Server, clone the repository, install dependencies, configure environment variables, and start the server. Clients can connect via StdioTransport to communicate with the server.
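Connecting over stdio can be sketched with the official TypeScript MCP SDK (`@modelcontextprotocol/sdk`). This is a minimal sketch, not the project's documented client: the entry point `build/index.js` is an assumption about where the cloned repository compiles to, so adjust it to your local build output.

```typescript
// Sketch: connect to MCP Assistant Server via stdio using the TypeScript MCP SDK.
// Assumes the repo has been cloned, built, and its entry point is build/index.js.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // The transport spawns the server as a child process; MCP messages
  // travel over the child's stdin/stdout.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["build/index.js"], // assumed build output path -- adjust as needed
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );

  await client.connect(transport);

  // Enumerate whatever tools the server exposes (task analysis,
  // tool recommendation, context management, ...).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```

The same pattern works for any stdio-based MCP server; only the `command`/`args` pair changes.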

  • Task analysis: extracts keywords, task types, and complexity from user tasks.
  • Tool recommendation: suggests the most suitable tools based on task characteristics.
  • Context management: maintains task context and records tool usage history.
  • Large model support: optionally integrates large language models for more accurate analysis and recommendations.
  • MCP service discovery: automatically discovers and integrates other MCP services and tools in the environment.

Typical use cases:
  1. Analyzing user tasks to provide insights and recommendations.
  2. Recommending tools for software development projects.
  3. Managing context for ongoing tasks and tool usage.
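
To make the task-analysis step concrete, here is a hypothetical heuristic in TypeScript. It is not the server's actual implementation (the real server may use an LLM when large model support is enabled); the stop-word list, task-type keyword sets, and complexity thresholds are all illustrative assumptions.

```typescript
// Hypothetical sketch of a task-analysis step: extract keywords, classify
// the task type, and estimate complexity. NOT the server's real algorithm.

type TaskAnalysis = {
  keywords: string[];
  taskType: "development" | "research" | "general";
  complexity: "low" | "medium" | "high";
};

// Illustrative stop-word list; a real implementation would be larger.
const STOP_WORDS = new Set(["a", "an", "the", "to", "for", "and", "of", "in"]);

function analyzeTask(task: string): TaskAnalysis {
  // Tokenize into lowercase words and drop stop words to get keywords.
  const words = task.toLowerCase().match(/[a-z]+/g) ?? [];
  const keywords = words.filter((w) => !STOP_WORDS.has(w));

  // Classify by looking for characteristic verbs (assumed keyword sets).
  const taskType: TaskAnalysis["taskType"] = keywords.some((w) =>
    ["code", "build", "debug", "refactor"].includes(w)
  )
    ? "development"
    : keywords.some((w) => ["analyze", "research", "compare"].includes(w))
      ? "research"
      : "general";

  // Crude complexity estimate from keyword count (assumed thresholds).
  const complexity: TaskAnalysis["complexity"] =
    keywords.length > 12 ? "high" : keywords.length > 5 ? "medium" : "low";

  return { keywords, taskType, complexity };
}

console.log(analyzeTask("Refactor the build pipeline to cache dependencies"));
```

A tool-recommendation layer would then map the resulting `taskType` and `keywords` onto the tools discovered from other MCP services.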

Add to your AI client

Use these steps to connect MCP Assistant Server in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-assistant-server-lutra23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-assistant-server-lutra23"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-assistant-server-lutra23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-assistant-server-lutra23"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-assistant-server-lutra23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-assistant-server-lutra23"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-assistant-server-lutra23": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-assistant-server-lutra23"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-assistant-server-lutra23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-assistant-server-lutra23"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-assistant-server-lutra23": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-assistant-server-lutra23"
      ]
    }
  }
}

FAQ

Can MCP Assistant Server analyze any type of task?

Yes! It is designed to analyze a wide range of tasks and provide relevant recommendations.

Is there support for large language models?

Yes! Users can enable large model support for enhanced analysis.

How can I contribute to the project?

You can contribute by forking the repository, creating a feature branch, and submitting a pull request.