MCP Prompt Tester
An MCP server designed to give agents the ability to test prompts
Overview
MCP Prompt Tester is a simple server designed to allow agents to test prompts with various LLM providers, including OpenAI and Anthropic.
To use MCP Prompt Tester, install the server using pip or uv, set up your API keys in a .env file or as environment variables, and start the server. You can then use the provided tools to test prompts.
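As a sketch, a minimal .env file might look like the following. The variable names OPENAI_API_KEY and ANTHROPIC_API_KEY are the conventional names for these providers, but confirm the exact names in the project's README; you only need a key for the providers you intend to test.

```
# .env — placeholder keys; replace with your own credentials
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
```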
- Test prompts with OpenAI and Anthropic models
- Configure system prompts, user prompts, and other parameters
- Get formatted responses or error messages
- Easy environment setup with .env file support
- Testing different LLM prompts for accuracy and performance.
- Experimenting with various configurations to optimize responses.
- Integrating prompt testing into larger applications or workflows.
Add to your AI client
Use these steps to connect MCP Prompt Tester in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "prompt-tester-rt96-hub": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-tester-rt96-hub"
      ]
    }
  }
}

Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "prompt-tester-rt96-hub": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-tester-rt96-hub"
      ]
    }
  }
}

Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "prompt-tester-rt96-hub": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-tester-rt96-hub"
      ]
    }
  }
}

VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "prompt-tester-rt96-hub": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-tester-rt96-hub"
      ]
    }
  }
}

Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "prompt-tester-rt96-hub": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-tester-rt96-hub"
      ]
    }
  }
}

Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "prompt-tester-rt96-hub": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-prompt-tester-rt96-hub"
      ]
    }
  }
}

FAQ
What LLM providers can I use with MCP Prompt Tester?
You can use OpenAI and Anthropic models.
How do I set up my API keys?
You can set up your API keys using environment variables or by creating a .env file in your project directory.
Is there a sample code for using the prompt testing tool?
Yes! The documentation includes an example of how to use the MCP client to call the prompt testing tool.
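Under the hood, an MCP tool call is a JSON-RPC 2.0 tools/call request sent to the server over its transport. The sketch below builds such a request with only the standard library; the tool name test_prompt and its argument names are assumptions for illustration, not this server's actual schema — list the server's tools to get the real names.

```python
import json

# JSON-RPC 2.0 message an MCP client would send to invoke a tool.
# Tool name and arguments below are hypothetical; consult the
# server's tool listing for the real schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "test_prompt",
        "arguments": {
            "provider": "openai",
            "model": "gpt-4o",
            "system_prompt": "You are a concise assistant.",
            "user_prompt": "Summarize MCP in one sentence.",
        },
    },
}

print(json.dumps(request, indent=2))
```

In practice an MCP client library handles framing and transport for you; this only shows the shape of the message that crosses the wire.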