
Think MCP Server

Source

Catalog only · STDIO

Overview

Think MCP Server is a server application that uses Groq's API to interact with large language models (LLMs), exposing the raw chain-of-thought tokens of reasoning models such as R1 or Qwen.

To use Think MCP Server, set up the server environment, configure the necessary parameters, and make API calls to interact with the LLMs for processing and retrieving token data.

  • Integration with Groq's API for LLM access
  • Ability to expose raw chain-of-thought tokens
  • Support for various server command and parameter configurations
  1. Developing applications that require advanced language processing capabilities.
  2. Researching and experimenting with LLMs and their token outputs.
  3. Building tools that leverage chain-of-thought reasoning in AI applications.
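R1-style reasoning models typically wrap their raw chain-of-thought tokens in `<think>...</think>` tags before the final answer. As a minimal sketch of how a client might consume the tokens this server exposes, the helper below (a hypothetical function, not part of Think MCP Server's API) separates the reasoning from the answer:

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Split a model response into (chain_of_thought, answer).

    Assumes the reasoning is wrapped in <think>...</think> tags,
    as R1-style models emit; anything after the closing tag is
    treated as the final answer.
    """
    match = re.search(r"<think>(.*?)</think>", text, re.DOTALL)
    if not match:
        # No reasoning block found: the whole text is the answer.
        return "", text.strip()
    thought = match.group(1).strip()
    answer = text[match.end():].strip()
    return thought, answer

raw = "<think>2 + 2 equals 4.</think>The answer is 4."
thought, answer = split_reasoning(raw)
print(thought)  # 2 + 2 equals 4.
print(answer)   # The answer is 4.
```

This is useful in use case 2 above, where the reasoning trace itself is the object of study rather than the final answer.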

Add to your AI client

Use these steps to connect Think MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps. The same JSON appears in the Use with menu above for one-click copy.
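Every client config below reuses the same server entry; only the surrounding file location and, in VS Code's case, the top-level key (`servers` instead of `mcpServers`) differ. As a sketch, the shared entry can be generated programmatically (the package name is taken verbatim from the snippets below):

```python
import json

SERVER_NAME = "think-mcp-server-beverm2391"
SERVER_ENTRY = {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-think-mcp-server-beverm2391"],
}

def make_config(top_key: str = "mcpServers") -> str:
    """Render the JSON used by Cursor, Claude, Windsurf, and Cline.

    Pass top_key="servers" for VS Code's .vscode/mcp.json variant.
    """
    return json.dumps({top_key: {SERVER_NAME: SERVER_ENTRY}}, indent=2)

print(make_config())                 # mcpServers-style clients
print(make_config(top_key="servers"))  # VS Code (Copilot)
```

Writing the rendered string to the client's config path (e.g. `.cursor/mcp.json`) and restarting the client is equivalent to pasting the snippets manually.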

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "think-mcp-server-beverm2391": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-think-mcp-server-beverm2391"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "think-mcp-server-beverm2391": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-think-mcp-server-beverm2391"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "think-mcp-server-beverm2391": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-think-mcp-server-beverm2391"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "think-mcp-server-beverm2391": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-think-mcp-server-beverm2391"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "think-mcp-server-beverm2391": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-think-mcp-server-beverm2391"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "think-mcp-server-beverm2391": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-think-mcp-server-beverm2391"
      ]
    }
  }
}

FAQ

What programming language is Think MCP Server built with?

Think MCP Server is built using Python.

Is there any documentation available for using the server?

Yes, documentation can be found on the GitHub repository.

Can I contribute to the Think MCP Server project?

Yes, contributions are welcome! Please check the repository for guidelines.