Deepseek Thinker MCP Server
An MCP provider that delivers Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. Supports access to Deepseek's chain-of-thought (CoT) from the Deepseek API service or a local Ollama server.
Overview
Deepseek Thinker MCP is a Model Context Protocol (MCP) provider that delivers Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. It allows access to Deepseek's thought processes via the Deepseek API service or a local Ollama server.
To use Deepseek Thinker MCP, integrate it with an AI client by adding the server's command and environment variables to the client's configuration file (for Claude Desktop, claude_desktop_config.json). It can run in OpenAI API mode, which calls the Deepseek API service, or in Ollama mode, which uses a local Ollama server.
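As a minimal sketch, an OpenAI API mode entry for claude_desktop_config.json might look like the following. The environment variable names API_KEY and BASE_URL are assumptions about what the server reads, not confirmed by this page; check the project's documentation for the exact names.
{
  "mcpServers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ],
      "env": {
        "API_KEY": "<your Deepseek API key>",
        "BASE_URL": "<Deepseek API base URL>"
      }
    }
  }
}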
Key Features
- Dual Mode Support: OpenAI API mode and Ollama local mode.
- Focused Reasoning: Captures and provides reasoning output from Deepseek's thinking process.
Use Cases
- Enhancing AI client capabilities with Deepseek's reasoning.
- Supporting complex reasoning tasks in AI applications.
- Facilitating local AI model interactions through Ollama (see the Ollama-mode sketch below).
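For the Ollama use case above, the sketch below shows what an Ollama-mode entry might look like. The USE_OLLAMA flag is an assumption about how the server selects its backend, and the example presumes a local Ollama server is running with a Deepseek model pulled; verify both against the project's documentation.
{
  "mcpServers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}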
Add to your AI client
Use these steps to connect the Deepseek Thinker MCP Server in Cursor, Claude Desktop, VS Code, and other MCP-compatible apps. Each client uses the same server entry, with minor differences in file location and key names.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "deepseek-thinker-mcp-ruixingshi": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deepseek-thinker-mcp-ruixingshi"
      ]
    }
  }
}
FAQ
What should I do if I encounter "MCP error -32001: Request timed out"?
This error indicates that the Deepseek API is responding slowly or the reasoning output is long enough to exceed the client's request timeout. Retrying the request, shortening the prompt, or switching to Ollama local mode can help avoid the timeout.
Is there a specific tech stack used for this project?
Yes, the project uses TypeScript, OpenAI API, Ollama, and Zod for parameter validation.
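As an illustration of the Zod-based parameter validation mentioned above, the TypeScript sketch below shows how a tool's arguments might be checked before use. The schema shape and the originPrompt field name are assumptions for illustration, not the project's actual definitions.
import { z } from "zod";

// Hypothetical schema for a thinker tool's arguments;
// the field name "originPrompt" is illustrative only.
const ThinkerArgsSchema = z.object({
  originPrompt: z.string().min(1, "prompt must not be empty"),
});

type ThinkerArgs = z.infer<typeof ThinkerArgsSchema>;

// safeParse returns a result object instead of throwing,
// so an invalid tool call can be rejected with a clear error.
const result = ThinkerArgsSchema.safeParse({ originPrompt: "Why is the sky blue?" });
if (result.success) {
  const args: ThinkerArgs = result.data;
  console.log("validated prompt:", args.originPrompt);
} else {
  console.error(result.error.issues.map((i) => i.message).join("; "));
}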