Prompt Playground

Iterate on prompts with curated data from development and production

Prompt Playground helps developers experiment with prompt templates, input variables, LLM models, and parameters. This no-code interface lets both developers and non-technical users refine their prompts for production applications.

Key features

  1. Iterate on your prompts with any model using our AI Provider Integrations

  2. Replay spans from your production data

  3. Build prompts with AI using Alyx

  4. Manage your prompts in one place with Prompt Hub

Find and fix problematic production examples

The most common way to enter the Prompt Playground is through a span on the LLM tracing page. For instance, users can filter spans where an Online Evaluator flagged the LLM output as a hallucination and then bring one of these examples into the Prompt Playground to refine the prompt, ensuring the LLM produces factual responses in the future.

User selects the Prompt Playground button to import the template, input variables, and LLM output for iteration in the playground.
User modifies the template and input variables in the Playground to refine and iterate on the prompt.
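Under the hood, this workflow amounts to selecting the spans whose evaluation label indicates a problem. The sketch below is illustrative only; the span fields and the "hallucinated" label name are assumptions, not the platform's actual schema.

```python
# Hypothetical sketch: filtering traced LLM spans whose online evaluator
# flagged the output as a hallucination. Field names are assumptions.
spans = [
    {"id": "span-1", "eval_label": "factual", "output": "..."},
    {"id": "span-2", "eval_label": "hallucinated", "output": "..."},
    {"id": "span-3", "eval_label": "hallucinated", "output": "..."},
]

# Keep only the spans worth replaying in the playground.
flagged = [s for s in spans if s["eval_label"] == "hallucinated"]
print([s["id"] for s in flagged])  # candidates to import and iterate on
```

Each flagged span carries the template, variables, and output needed to reproduce the failure in the playground.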

Prompt iteration

You can iterate on prompts by comparing them side by side with different models, tools, LLM parameters, prompt templates, and variables. The first step is to select the "clone prompt" or "+ prompt" button to create a new prompt.

User duplicates Prompt A and then switches the model from gpt-3.5-turbo to gpt-4o for a direct comparison.
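Conceptually, a side-by-side comparison renders the same template and variables once, then runs the result against each model configuration. The sketch below uses a stub `call_llm` function as a stand-in for a real provider client; the names are illustrative assumptions.

```python
# Illustrative sketch of a side-by-side comparison: one rendered prompt,
# several model configurations. `call_llm` is a hypothetical stand-in.
TEMPLATE = (
    "Answer using only the context below.\n"
    "Context: {context}\n"
    "Question: {question}"
)
variables = {
    "context": "Phoenix was founded in 1881.",
    "question": "When was Phoenix founded?",
}

configs = [
    {"model": "gpt-3.5-turbo", "temperature": 0.0},
    {"model": "gpt-4o", "temperature": 0.0},
]

def call_llm(model: str, prompt: str, temperature: float) -> str:
    # Stub that echoes which model ran; a real run would call the provider API.
    return f"[{model}] response"

prompt = TEMPLATE.format(**variables)
outputs = {c["model"]: call_llm(c["model"], prompt, c["temperature"]) for c in configs}
for model, output in outputs.items():
    print(model, "->", output)
```

Holding the template and variables fixed while varying only the model keeps the comparison direct, which is exactly what cloning a prompt in the playground gives you.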

Optimize your prompts with Copilot

Another approach to reducing hallucinations is modifying the template. Using Copilot, the user optimizes the prompt, instructing the LLM to respond with 'I don’t know' when the answer is not found in the provided context. After pressing 'Run' with the updated prompt template, the New Output confirms that the LLM now responds with 'I don’t know' instead of generating a fabricated answer.

Optimize the prompt with Copilot
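The kind of template change described above can be sketched as follows. The wording of the optimized template is illustrative, not the exact text the optimizer produces.

```python
# Minimal sketch of the optimized template: it adds an explicit
# instruction to answer "I don't know" when the context lacks the answer.
OPTIMIZED_TEMPLATE = (
    "Answer the question using only the provided context.\n"
    "If the answer is not found in the context, respond with 'I don't know'.\n"
    "Context: {context}\n"
    "Question: {question}"
)

prompt = OPTIMIZED_TEMPLATE.format(
    context="The report covers Q3 revenue only.",
    question="What was Q4 revenue?",
)
print(prompt)
# A well-behaved model should now answer "I don't know" instead of fabricating.
```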

Save prompt template to Prompt Hub

The template can also be saved to the Prompt Hub, making it especially valuable for production use cases and collaboration.

By clicking on a specific prompt, the user can view its metadata, version history, and the associated prompt template and LLM parameters for the selected version.
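The bookkeeping behind versioned prompt storage can be sketched like this. Field names and the `save_prompt` helper are hypothetical, shown only to illustrate how each save appends a new version with its template, parameters, and metadata.

```python
# Hypothetical sketch of prompt-hub versioning: each save appends an
# immutable version record under the prompt's name.
from datetime import datetime, timezone

hub: dict[str, list[dict]] = {}

def save_prompt(name: str, template: str, params: dict) -> int:
    """Append a new version of `name` and return its version number."""
    versions = hub.setdefault(name, [])
    versions.append({
        "version": len(versions) + 1,
        "template": template,
        "params": params,
        "saved_at": datetime.now(timezone.utc).isoformat(),
    })
    return versions[-1]["version"]

v1 = save_prompt("qa-prompt", "Answer: {question}", {"model": "gpt-3.5-turbo"})
v2 = save_prompt("qa-prompt", "Answer concisely: {question}", {"model": "gpt-4o"})
print(v1, v2)  # earlier versions remain available for review and rollback
```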
