Overview
Wanaku is a Model Context Protocol (MCP) Router designed to connect AI-enabled applications by standardizing how these applications provide context to large language models (LLMs).
To use Wanaku, follow the usage guide provided in the documentation. This guide will help you set up and integrate the router with your applications.
- Standardizes context provision for LLMs through the MCP.
- Supports AI-enabled applications for better interaction and performance.
- Open protocol that encourages collaboration and development.
- Enhancing AI application performance by providing structured context.
- Facilitating communication between different AI systems.
- Enabling developers to build more intelligent applications using LLMs.
Add to your AI client
Use these steps to connect Wanaku, an MCP router, to Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
```json
{
  "mcpServers": {
    "wanaku-wanaku-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-wanaku-wanaku-ai"
      ]
    }
  }
}
```

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
```json
{
  "mcpServers": {
    "wanaku-wanaku-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-wanaku-wanaku-ai"
      ]
    }
  }
}
```

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
```json
{
  "mcpServers": {
    "wanaku-wanaku-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-wanaku-wanaku-ai"
      ]
    }
  }
}
```

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
```json
{
  "servers": {
    "wanaku-wanaku-ai": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-wanaku-wanaku-ai"
      ]
    }
  }
}
```

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
```json
{
  "mcpServers": {
    "wanaku-wanaku-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-wanaku-wanaku-ai"
      ]
    }
  }
}
```

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
```json
{
  "mcpServers": {
    "wanaku-wanaku-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-wanaku-wanaku-ai"
      ]
    }
  }
}
```

FAQ
What is the Model Context Protocol (MCP)?
The MCP is an open protocol that standardizes how applications provide context to LLMs, improving their effectiveness.
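Concretely, MCP messages are exchanged as JSON-RPC 2.0. For example, after initialization a client can ask a connected server for its available tools with a request like this (a minimal sketch of the wire format; the `id` value is arbitrary):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```

The server responds with a `result` containing a `tools` array that describes each tool's name, description, and input schema, which the client can then expose to the LLM.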
Is Wanaku open source?
Yes! Wanaku is open source and contributions are welcome.
How can I contribute to Wanaku?
You can contribute by following the [contributing guide](docs/contributing.md) available in the documentation.