
Fluent MCP

Python package for creating MCP servers with embedded LLM reasoning

Catalog only · STDIO

Overview

Fluent MCP is a Python package designed for creating Model Context Protocol (MCP) servers that integrate embedded LLM reasoning capabilities, allowing for intelligent interactions and tool management.

To use Fluent MCP, install it via pip and scaffold a new server either through the command-line interface or programmatically in Python. You can then define embedded and external tools to extend the server's functionality.

  • Reasoning Offloading: Efficiently offloads complex reasoning tasks to embedded LLMs.
  • Tool Separation: Distinguishes between internal and external tools for better management.
  • Server Scaffolding: Easily create new MCP server projects with a structured setup.
  • LLM Integration: Connects seamlessly with various language models.
  • Prompt Management: Supports loading and managing prompts with tool definitions.
  1. Building AI-driven applications that require complex reasoning.
  2. Creating custom tools for specific tasks within an MCP server.
  3. Developing self-improving AI systems that can register and utilize their own tools.
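The tool-separation and self-registration ideas above can be sketched generically. This is an illustrative sketch only; the class and method names here are invented for the example and are not Fluent MCP's actual API:

```python
from typing import Callable, Dict

class ToolRegistry:
    """Keeps embedded (internal) and external tools in separate namespaces."""

    def __init__(self) -> None:
        self._tools: Dict[str, Dict[str, Callable[..., object]]] = {
            "embedded": {},
            "external": {},
        }

    def register(self, name: str, fn: Callable[..., object],
                 *, embedded: bool = False) -> None:
        # A running system could call this itself to register new tools.
        scope = "embedded" if embedded else "external"
        self._tools[scope][name] = fn

    def call(self, name: str, *args: object) -> object:
        # Embedded tools take priority over external ones.
        for scope in ("embedded", "external"):
            if name in self._tools[scope]:
                return self._tools[scope][name](*args)
        raise KeyError(name)

registry = ToolRegistry()
registry.register("add", lambda a, b: a + b, embedded=True)
print(registry.call("add", 2, 3))  # → 5
```

Separating the two namespaces keeps tools meant only for the embedded LLM's internal reasoning from being exposed to external clients.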

Add to your AI client

Use these steps to connect Fluent MCP in Cursor, Claude, VS Code, and other MCP-compatible apps.
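Every config below uses the STDIO transport: the client spawns the server process and the two sides exchange newline-delimited JSON-RPC 2.0 messages over stdin/stdout. A minimal sketch of that message framing, independent of any particular server:

```python
import json

def make_response(request_line: str, result: dict) -> str:
    """Frame a JSON-RPC 2.0 response for one newline-delimited request."""
    request = json.loads(request_line)
    response = {
        "jsonrpc": "2.0",
        "id": request["id"],  # responses echo the request id back
        "result": result,
    }
    return json.dumps(response) + "\n"  # newline-delimited framing

req = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}})
print(make_response(req, {"capabilities": {}}), end="")
```

This is why each config specifies a `command` and `args`: the client needs to know how to launch the server process whose stdin/stdout it will attach to.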

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "fluent-mcp-fluentdata": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-fluent-mcp-fluentdata"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "fluent-mcp-fluentdata": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-fluent-mcp-fluentdata"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "fluent-mcp-fluentdata": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-fluent-mcp-fluentdata"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "fluent-mcp-fluentdata": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-fluent-mcp-fluentdata"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "fluent-mcp-fluentdata": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-fluent-mcp-fluentdata"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "fluent-mcp-fluentdata": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-fluent-mcp-fluentdata"
      ]
    }
  }
}

FAQ

Can Fluent MCP handle multiple LLMs?

Yes! Fluent MCP can integrate with various LLMs, allowing for flexible AI solutions.
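Provider-agnostic integration typically boils down to a common client interface that any backend can satisfy. A generic sketch of that pattern, not Fluent MCP's actual interface:

```python
from typing import Protocol

class LLMClient(Protocol):
    """Anything with a .complete() method can serve as a backend."""
    def complete(self, prompt: str) -> str: ...

class EchoClient:
    """Stand-in for a real provider client (OpenAI, Ollama, etc.)."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

def run_reasoning(client: LLMClient, task: str) -> str:
    # The reasoning code never imports a specific provider SDK,
    # so backends can be swapped without touching this function.
    return client.complete(task)

print(run_reasoning(EchoClient(), "summarize"))
```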

Is Fluent MCP open-source?

Yes! Fluent MCP is licensed under the MIT License, making it free to use and modify.

How do I contribute to Fluent MCP?

You can contribute by submitting issues or pull requests on the GitHub repository.