
Prompt Decorators


A standardized framework for enhancing how LLMs process and respond to prompts through composable decorators, featuring an official open standard specification and a Python reference implementation with MCP server integration.

Catalog only · STDIO

Overview

Prompt Decorators is a standardized framework designed to enhance how Large Language Models (LLMs) process and respond to prompts through composable decorators. It includes an official open standard specification and a Python reference implementation with Model Context Protocol (MCP) server integration.

To use Prompt Decorators, install the package from PyPI, then load the available decorators, create decorator instances, and apply them to your prompts to modify LLM behavior. For example, a prompt can be prefixed with an annotation such as +++Reasoning to steer how the model responds.
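As a rough illustration of the idea, the sketch below expands +++ prefixes into model instructions. This is a minimal, self-contained example, not the library's actual API: the registry entries, decorator names, and function name here are invented for illustration.

```python
# Self-contained sketch of the +++ decorator-prefix idea (illustrative only;
# the real package is installed with `pip install prompt-decorators`, name
# assumed). Each leading +++Name line is stripped from the prompt and mapped
# to an instruction that is prepended to the text sent to the model.

# Hypothetical registry: decorator name -> instruction injected into the prompt.
REGISTRY = {
    "Reasoning": "Think through the problem step by step before answering.",
    "OutputFormat(json)": "Return the answer as valid JSON.",  # parameters simplified
}

def apply_decorators(prompt: str) -> str:
    """Split leading +++Decorator lines from the prompt and expand them."""
    lines = prompt.splitlines()
    instructions = []
    body_start = 0
    for i, line in enumerate(lines):
        name = line.strip()
        if name.startswith("+++") and name[3:] in REGISTRY:
            instructions.append(REGISTRY[name[3:]])
            body_start = i + 1
        else:
            break  # first non-decorator line starts the prompt body
    body = "\n".join(lines[body_start:]).lstrip()
    return "\n".join(instructions + [body]) if instructions else body

print(apply_decorators("+++Reasoning\nWhy is the sky blue?"))
```

Because decorators expand independently and are simply concatenated, they compose: stacking several +++ lines layers their instructions in order.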

Key features:

  • Standardized syntax for modifying LLM behavior
  • Registry-based management of over 140 pre-built decorators
  • Parameter validation and type checking for decorators
  • Integration with MCP for enhanced functionality
  • Extensive documentation and examples for users and developers

Typical use cases:

  1. Crafting prompts for specific reasoning patterns.
  2. Structuring outputs in particular formats.
  3. Ensuring consistent responses across different AI models.

Add to your AI client

Use these steps to connect Prompt Decorators in Cursor, Claude, VS Code, and other MCP-compatible apps. The server entry is the same JSON in each client; only the config file location differs.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "prompt-decorators-synaptiai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-decorators-synaptiai"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "prompt-decorators-synaptiai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-decorators-synaptiai"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "prompt-decorators-synaptiai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-decorators-synaptiai"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "prompt-decorators-synaptiai": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-decorators-synaptiai"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "prompt-decorators-synaptiai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-decorators-synaptiai"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "prompt-decorators-synaptiai": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-decorators-synaptiai"
      ]
    }
  }
}

FAQ

Can I use Prompt Decorators with any LLM?

Yes. Prompt Decorators is designed to work across a variety of LLM platforms.

Is there a cost to use Prompt Decorators?

No, Prompt Decorators is open-source and free to use.

How do I contribute to the project?

Contributions are welcome! Please refer to the contributing guidelines in the repository.