Raccoon AI MCP Server
Overview
Raccoon AI MCP Server implements the Model Context Protocol (MCP) to enable web browsing, data extraction, and automation of complex web tasks through the Raccoon LAM API.
To use it, install the server via Smithery or from source, configure it with your Raccoon Secret Key and Passcode, and then send prompts to perform web tasks.
Features
- Enables web browsing and searching.
- Automates form filling and UI navigation.
- Extracts structured data based on defined schemas.
- Handles multistep processes across different websites.
Use cases
- Extracting product information from e-commerce sites.
- Summarizing news articles from various sources.
- Collecting and structuring data on specific topics like technology.
- Performing deep searches for detailed reports on subjects.
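The server's Secret Key and Passcode are typically supplied to a stdio MCP server through an `env` block in the client's server entry. A minimal sketch, assuming the variable names `RACCOON_SECRET_KEY` and `RACCOON_PASSCODE` (these names are not confirmed by the source; check the project's README for the actual ones):

```json
{
  "mcpServers": {
    "raccoonai-mcp-server-raccoonaihq": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-raccoonai-mcp-server-raccoonaihq"
      ],
      "env": {
        "RACCOON_SECRET_KEY": "<your-secret-key>",
        "RACCOON_PASSCODE": "<your-passcode>"
      }
    }
  }
}
```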
Add to your AI client
Use these steps to connect the Raccoon AI MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "raccoonai-mcp-server-raccoonaihq": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-raccoonai-mcp-server-raccoonaihq"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "raccoonai-mcp-server-raccoonaihq": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-raccoonai-mcp-server-raccoonaihq"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "raccoonai-mcp-server-raccoonaihq": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-raccoonai-mcp-server-raccoonaihq"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "raccoonai-mcp-server-raccoonaihq": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-raccoonai-mcp-server-raccoonaihq"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "raccoonai-mcp-server-raccoonaihq": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-raccoonai-mcp-server-raccoonaihq"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "raccoonai-mcp-server-raccoonaihq": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-raccoonai-mcp-server-raccoonaihq"
      ]
    }
  }
}
FAQ
What are the prerequisites to use the server?
You need Python 3.8 or higher and an MCP-compatible client such as Claude Desktop.
Is there a specific installation method?
Yes, you can install it using Smithery or clone it from the GitHub repository.
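As a sketch of both paths (the Smithery package identifier and repository URL below are assumptions, not confirmed by the source; substitute the ones from the project's listing):

```shell
# Via Smithery's CLI (package identifier is an assumption)
npx -y @smithery/cli install raccoonai-mcp-server --client claude

# From source (repository URL is an assumption; requires Python 3.8+)
git clone https://github.com/raccoonaihq/raccoonai-mcp-server.git
cd raccoonai-mcp-server
pip install -e .
```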
Where can I find the documentation?
Documentation is available on the Raccoon LAM API and Model Context Protocol websites.