
mcp-perplexity-search


🔎 A Model Context Protocol (MCP) server for integrating Perplexity's AI API with LLMs.

Catalog only · STDIO

Overview

mcp-perplexity-search is a Model Context Protocol (MCP) server that integrates Perplexity's AI API with large language models (LLMs), providing chat completions with specialized prompt templates for a range of use cases.

To use mcp-perplexity-search, configure it through your MCP client by adding the necessary settings, including your Perplexity API key. You can then generate chat completions using predefined or custom prompt templates.

Key features:

  • Advanced chat completion using Perplexity's AI models
  • Predefined prompt templates for common scenarios such as technical documentation and security analysis
  • Custom template support for specialized use cases
  • Multiple output formats (text, markdown, JSON)
  • Configurable model parameters (temperature, max tokens)
  • Support for various Perplexity models, including Sonar and LLaMA

Common use cases:

  1. Generating technical documentation automatically
  2. Analyzing security best practices
  3. Conducting code reviews and improvements
  4. Creating structured API documentation
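As an illustrative sketch of what a tool request to this server might look like (the `chat_completion` tool name and the parameter names below are assumptions inferred from the feature list, not the server's documented schema):

{
  "tool": "chat_completion",
  "arguments": {
    "messages": [
      { "role": "user", "content": "Review this function for security issues: ..." }
    ],
    "prompt_template": "security_analysis",
    "format": "markdown",
    "temperature": 0.2,
    "max_tokens": 1024
  }
}

Here `prompt_template` selects one of the predefined templates, while `temperature` and `max_tokens` are the configurable model parameters mentioned above.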

Add to your AI client

Use these steps to connect mcp-perplexity-search in Cursor, Claude, VS Code, and other MCP-compatible apps. You will also need your Perplexity API key, which the server reads from the PERPLEXITY_API_KEY environment variable.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-perplexity-search-spences10": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-perplexity-search-spences10"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your-api-key-here"
      }
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-perplexity-search-spences10": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-perplexity-search-spences10"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your-api-key-here"
      }
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-perplexity-search-spences10": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-perplexity-search-spences10"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your-api-key-here"
      }
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-perplexity-search-spences10": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-perplexity-search-spences10"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your-api-key-here"
      }
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-perplexity-search-spences10": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-perplexity-search-spences10"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your-api-key-here"
      }
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-perplexity-search-spences10": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-perplexity-search-spences10"
      ],
      "env": {
        "PERPLEXITY_API_KEY": "your-api-key-here"
      }
    }
  }
}

FAQ

**What is required to run the server?**

You need a Perplexity API key, which the server reads from its environment, and an MCP client configured with one of the server entries above.

**Can I use custom templates?**

Yes! The server supports custom templates for specialized use cases.
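As a sketch only (the `custom_template` field and its sub-fields below are assumptions for illustration, not the server's documented schema), a request with a custom template might look like:

{
  "tool": "chat_completion",
  "arguments": {
    "messages": [
      { "role": "user", "content": "Document this endpoint: GET /users/:id" }
    ],
    "custom_template": {
      "system_prompt": "You are an API documentation writer. Produce concise, structured docs.",
      "format": "markdown"
    }
  }
}

Check the project's README for the exact template format the server accepts.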

**What output formats are available?**

You can choose from text, markdown, or JSON formats for the output.