
Storm MCP Server with Sionic AI serverless RAG

Catalog only · STDIO

Overview

Storm MCP (Model Context Protocol) Server implements Anthropic's Model Context Protocol, an open protocol that enables seamless integration between LLM applications and RAG data sources and tools. Through it, the Storm Platform can be used directly from Claude Desktop.

To use the Storm MCP Server, you need to register on the Storm Platform to obtain an API token. Then, configure the MCP server settings in your Claude Desktop environment by editing the configuration file and adding the necessary JSON settings.
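
If the Storm server reads the token from an environment variable, it can be passed through an env block in the server entry. The sketch below reuses the command shown in the client sections further down; the variable name STORM_API_KEY is an assumption for illustration, so check the Storm Platform documentation for the exact name the server expects.

Example server entry with an API token (illustrative)

{
  "mcpServers": {
    "serverless-rag-mcp-server-sionic-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-serverless-rag-mcp-server-sionic-ai"
      ],
      "env": {
        "STORM_API_KEY": "<your Storm Platform API token>"
      }
    }
  }
}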

  • Context Sharing: Provides a standard protocol for interaction between LLMs and data sources.
  • Tool System: Offers a standardized way to define and call tools (e.g., send_nonstream_chat, list_agents); a sample request follows the list below.
  • File Management: Implements file system operations for uploading, reading, and managing files.
  • API Integration: Connects with Storm's API endpoints to provide various functionalities.

Typical use cases:

  1. Integrating LLM applications with RAG data sources.
  2. Managing file operations in LLM applications.
  3. Utilizing various tools for enhanced data processing.
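
As noted in the Tool System item above, tools are invoked over MCP's standard JSON-RPC interface. The request below is a minimal sketch of a tools/call message for the list_agents tool; the exact arguments each Storm tool accepts are defined by the server itself, so treat the payload as illustrative rather than authoritative.

tools/call request (JSON-RPC, illustrative)

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_agents",
    "arguments": {}
  }
}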

Add to your AI client

Use these steps to connect Storm MCP Server with Sionic AI serverless RAG in Cursor, Claude Desktop, Claude Code, VS Code, and other MCP-compatible apps. The same JSON appears in the Use with menu above for one-click copy.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "serverless-rag-mcp-server-sionic-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-serverless-rag-mcp-server-sionic-ai"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "serverless-rag-mcp-server-sionic-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-serverless-rag-mcp-server-sionic-ai"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "serverless-rag-mcp-server-sionic-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-serverless-rag-mcp-server-sionic-ai"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "serverless-rag-mcp-server-sionic-ai": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-serverless-rag-mcp-server-sionic-ai"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "serverless-rag-mcp-server-sionic-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-serverless-rag-mcp-server-sionic-ai"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "serverless-rag-mcp-server-sionic-ai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-serverless-rag-mcp-server-sionic-ai"
      ]
    }
  }
}

FAQ

Can I use Storm MCP Server with any LLM application?

Yes, it is designed to integrate with various LLM applications that support the Model Context Protocol.

Is there a cost associated with using the Storm Platform?

The platform offers free access for users to create RAG solutions.

How do I obtain the API key?

You can obtain the API key by registering on the Storm Platform and following the instructions provided.