
Prompt Tester

Created by rt96-hub

Enables agents to test language model prompts with different providers and configurations.

About

Prompt Tester provides an MCP server that lets agents rigorously test and compare language model prompts across providers such as OpenAI and Anthropic. It allows easy configuration of system prompts, user prompts, and other parameters, helping developers optimize LLM interactions. The server supports both stdio and SSE transports and includes tools for comparing prompts side-by-side, managing multi-turn conversations, and discovering available LLM providers and models. This simplifies finding the most effective phrasing, comparing different models, and maintaining context in stateful interactions.

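Because the server speaks MCP over stdio, an agent-side client can drive it with the official MCP Python SDK. The sketch below is an assumption-laden illustration, not the server's documented interface: the launch command ("prompt-tester") and the tool name and arguments ("test_prompt" with provider/model/system_prompt/user_prompt/temperature/max_tokens) are guesses chosen to match the features listed on this page; list the tools first to see what the server actually exposes.

```python
# Minimal sketch: connect to the Prompt Tester MCP server over stdio and
# invoke a prompt-testing tool. The launch command and tool name/arguments
# are assumptions for illustration only.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(command="prompt-tester")  # hypothetical launch command

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server actually exposes before calling anything.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical tool call: test one prompt against one provider/model.
            result = await session.call_tool(
                "test_prompt",
                arguments={
                    "provider": "openai",
                    "model": "gpt-4o-mini",
                    "system_prompt": "You are a concise assistant.",
                    "user_prompt": "Summarize MCP in one sentence.",
                    "temperature": 0.2,
                    "max_tokens": 200,
                },
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```
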
Key Features

  • Manage multi-turn conversations with stateful context (see the sketch after this list)
  • Configure system prompts, user prompts, temperature, and max_tokens
  • Test prompts with OpenAI and Anthropic models
  • Discover available LLM providers and models
  • Compare prompts side-by-side with different providers and models

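For the stateful multi-turn feature, a session opened as in the earlier example might be reused like this. The tool names ("start_conversation", "continue_conversation") and the "conversation_id" field are assumptions about how such a stateful API could look, not the server's documented contract.

```python
# Minimal sketch of a multi-turn workflow, reusing a ClientSession opened as
# in the previous example. Tool names and the conversation_id field are
# hypothetical placeholders for illustration.
from mcp import ClientSession


async def multi_turn_demo(session: ClientSession) -> None:
    # First turn: open a stateful conversation with a system prompt.
    first = await session.call_tool(
        "start_conversation",
        arguments={
            "provider": "anthropic",
            "model": "claude-3-5-sonnet-latest",
            "system_prompt": "You are a helpful code reviewer.",
            "user_prompt": "Review this function name: get_data_v2_final.",
        },
    )
    print(first.content)

    # Follow-up turn: the server keeps the prior context, so only the new
    # user message and the conversation identifier need to be sent.
    follow_up = await session.call_tool(
        "continue_conversation",
        arguments={
            "conversation_id": "<id returned by the first call>",
            "user_prompt": "Suggest a clearer name.",
        },
    )
    print(follow_up.content)
```
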
Use Cases

  • Comparing different models for specific tasks
  • Testing prompt variants to find the most effective phrasing
  • Maintaining context in multi-turn conversations