RAT MCP Server (Retrieval Augmented Thinking)
🧠 MCP server implementing RAT (Retrieval Augmented Thinking) - combines DeepSeek's reasoning with GPT-4/Claude/Mistral responses, maintaining conversation context between interactions.
Overview
RAT MCP Server (Retrieval Augmented Thinking) implements a two-stage reasoning process: DeepSeek first produces structured reasoning, then a response model such as GPT-4 or Claude generates the final answer, with conversation context maintained across turns.
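The two-stage flow described above can be sketched as follows. This is an illustrative sketch only: `deepseekReason` and `generateResponse` are hypothetical stubs standing in for the real DeepSeek and OpenRouter/Claude API calls made by the server, and the function names do not come from the repository.

```typescript
// Sketch of the two-stage RAT flow (model calls are stubbed; the real
// server calls the providers' APIs instead).

type Message = { role: "user" | "assistant"; content: string };

// Stage 1 stub: in the real server this would call DeepSeek's reasoner.
function deepseekReason(history: Message[], question: string): string {
  return `Reasoning over ${history.length} prior messages about: ${question}`;
}

// Stage 2 stub: in the real server this would call Claude or GPT-4
// (e.g. via OpenRouter), conditioned on the stage-1 reasoning.
function generateResponse(reasoning: string, question: string): string {
  return `Answer to "${question}" informed by: ${reasoning}`;
}

const history: Message[] = [];

function rat(question: string): string {
  const reasoning = deepseekReason(history, question); // stage 1: reasoning
  const answer = generateResponse(reasoning, question); // stage 2: response
  // Conversation context is kept so later turns see earlier Q&A pairs.
  history.push({ role: "user", content: question });
  history.push({ role: "assistant", content: answer });
  return answer;
}

console.log(rat("What is retrieval augmented thinking?"));
```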
To use the RAT MCP Server, clone the repository, install its dependencies, configure your API keys in a .env file, and build the server. You can then integrate it with Cline (or any other MCP-compatible client) to generate responses.
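As a sketch, the .env file might look like the following. The exact variable names (DEEPSEEK_API_KEY, OPENROUTER_API_KEY) are assumptions and should be checked against the repository's documentation before use.

```shell
# .env – API keys read by the server at startup (variable names are assumptions)
DEEPSEEK_API_KEY=your-deepseek-key
OPENROUTER_API_KEY=your-openrouter-key
```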
- Two-stage processing: DeepSeek handles reasoning, and a separate model generates the response.
- Maintains conversation context and history.
- Supports various models, including Claude and OpenRouter models.
- Enhancing AI responses through structured reasoning.
- Providing context-aware answers in conversational AI applications.
- Integrating with development tools for AI-assisted coding.
Add to your AI client
Use these steps to connect RAT MCP Server (Retrieval Augmented Thinking) in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "rat-retrieval-augmented-thinking-mcp-newideas99": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-rat-retrieval-augmented-thinking-mcp-newideas99"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "rat-retrieval-augmented-thinking-mcp-newideas99": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-rat-retrieval-augmented-thinking-mcp-newideas99"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "rat-retrieval-augmented-thinking-mcp-newideas99": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-rat-retrieval-augmented-thinking-mcp-newideas99"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "rat-retrieval-augmented-thinking-mcp-newideas99": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-rat-retrieval-augmented-thinking-mcp-newideas99"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "rat-retrieval-augmented-thinking-mcp-newideas99": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-rat-retrieval-augmented-thinking-mcp-newideas99"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "rat-retrieval-augmented-thinking-mcp-newideas99": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-rat-retrieval-augmented-thinking-mcp-newideas99"
      ]
    }
  }
}

FAQ
What models does RAT MCP Server support?
It supports DeepSeek for reasoning, plus Claude and any model available through OpenRouter (such as GPT-4) for response generation.
Is there a license for RAT MCP Server?
Yes, it is released under the MIT License.
How do I maintain conversation context?
The server automatically maintains conversation history and includes it in the reasoning process.