Create a Simulation

Understanding Scenarios

A scenario in Agent Simulate is a structured test case for your AI agent. The scenario name and description play a crucial role:

  • Scenario Name: Acts as a clear identifier that helps the Autoblocks agent understand the type of test being performed (e.g., “Appointment Confirmation”)
  • Scenario Description: Provides specific context and goals for the test (e.g., “You should test calling a receptionist to confirm your appointment”)

These fields help guide the agent’s behavior and set expectations for the simulation.
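To make this concrete, here is a minimal sketch of a scenario definition as plain data. The field names (`name`, `description`) are illustrative only, not the actual Agent Simulate schema:

```python
# Hypothetical scenario definition; keys are illustrative, not the
# product's real schema.
scenario = {
    "name": "Appointment Confirmation",
    "description": (
        "You should test calling a receptionist to confirm your appointment"
    ),
}

# A specific name plus a goal-oriented description gives the simulated
# caller clear instructions for the test.
print(scenario["name"])  # Appointment Confirmation
```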

Scenario Configuration Steps

Creating a scenario involves four main configuration steps:

  1. Personas: Define who will interact with your agent
  2. Data Fields: Specify the information to be collected or verified
  3. Edge Cases: Set up challenging scenarios
  4. Summary: Review and launch the simulation

Setting up Personas

Add and configure personas to test different caller types:

Pre-built Personas include:

  • Elderly Caller

    • Speaks slowly
    • May have difficulty hearing
    • Occasionally repeats themselves
  • Impatient Caller

    • In a hurry
    • Frequently interrupts
    • Expresses urgency throughout the call
  • Distracted Caller

    • Has significant background noise
    • Seems to be multitasking during the call

You can add multiple personas to test how your agent handles different types of callers. Each persona will generate unique simulation scenarios with specific behavioral cues.
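As a sketch, the pre-built personas above could be represented as structured data like the following. The `name`/`traits` structure is hypothetical, shown only to illustrate how behavioral cues attach to each persona:

```python
# Illustrative persona definitions mirroring the pre-built personas above;
# the keys are hypothetical, not the product's schema.
personas = [
    {
        "name": "Elderly Caller",
        "traits": [
            "speaks slowly",
            "may have difficulty hearing",
            "occasionally repeats themselves",
        ],
    },
    {
        "name": "Impatient Caller",
        "traits": [
            "in a hurry",
            "frequently interrupts",
            "expresses urgency throughout the call",
        ],
    },
]

# Each persona drives its own simulated calls with these behavioral cues.
for persona in personas:
    print(persona["name"], "->", len(persona["traits"]), "cues")
```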

Configuring Data Fields

Data fields define the information your agent needs to collect or verify:

  1. Field Definition

    • Give each field a clear name
    • Add a description of what information to collect
    • Specify the expected format
  2. Variant Types

    • Add different ways people might express the same information
    • Example for dates: “March 15, 2023”, “3/15/23”, “next Wednesday”
    • Each variant tests your agent’s ability to understand different formats
  3. Field Properties

    • Required vs optional fields
    • Validation rules
    • Dependencies between fields
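A data field combining these three pieces (definition, variants, properties) might look like the sketch below. The dict structure and the date-format check are illustrative assumptions, not the real Agent Simulate field schema:

```python
import re

# Hypothetical data-field definition: a date field with the variants
# from the example above and a "required" property.
appointment_date = {
    "name": "appointment_date",
    "description": "Date of the appointment the caller wants to confirm",
    "required": True,
    "variants": ["March 15, 2023", "3/15/23", "next Wednesday"],
}

def matches_numeric_date(value: str) -> bool:
    """Toy validation rule for one format (M/D/YY); a real rule set
    would cover spelled-out and relative dates as well."""
    return re.fullmatch(r"\d{1,2}/\d{1,2}/\d{2,4}", value) is not None
```

Note that only one of the three variants passes this single rule, which is exactly why testing multiple variants matters: the agent must normalize formats the validator alone would reject.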

Setting up Edge Cases

Edge cases help test how your agent handles challenging scenarios:

Common Edge Cases:

  • Communication Challenges

    • Frequently interrupts
    • Provides information out of order
    • Has difficulty hearing the agent
    • Speaks very quietly/loudly
    • Has a strong accent
  • Behavioral Scenarios

    • Hangs up abruptly
    • Is confused about details
    • Asks to repeat information
    • Has background noise
    • Gives inconsistent information

Custom Edge Cases:

  • Create specific test scenarios
  • Define unique behavioral patterns
  • Add custom validation rules
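A custom edge case pairs a behavioral pattern with what the agent is expected to do when it occurs. The sketch below uses hypothetical field names to illustrate the idea:

```python
# Hypothetical custom edge case; the structure is illustrative, not the
# product's schema.
custom_edge_case = {
    "name": "Mid-call hang-up",
    "behavior": "Hangs up abruptly while giving the appointment date",
    "expected_agent_behavior": (
        "Ends the call gracefully and records the partially collected data"
    ),
}
```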

Running the Simulation

Once configured, you can:

  1. Review all settings in the Summary view
  2. Save the scenario for future use
  3. Run the simulation immediately
  4. Track progress in real time
  5. Access detailed results and analytics
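Since each persona generates its own simulated calls, a run expands into a matrix of persona and edge-case combinations. A minimal sketch of that expansion, with illustrative names:

```python
from itertools import product

# Illustrative: each persona / edge-case pair yields one simulated call.
personas = ["Elderly Caller", "Impatient Caller"]
edge_cases = ["Hangs up abruptly", "Has background noise"]

runs = [
    {"persona": p, "edge_case": e}
    for p, e in product(personas, edge_cases)
]

# 2 personas x 2 edge cases -> 4 simulated calls
print(len(runs))  # 4
```

This multiplicative growth is why the Best Practices below recommend starting with basic scenarios and adding complexity gradually.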

Version Control

  • Track scenario changes over time
  • Easily revert to previous versions

Best Practices

Scenario Design

  1. Clear Objectives

    • Write specific scenario descriptions
    • Define measurable success criteria
    • Focus on one main test goal
  2. Comprehensive Testing

    • Include both common and edge cases
    • Test various persona combinations
    • Cover different data input formats
  3. Maintainable Structure

    • Use consistent naming conventions
    • Document special test conditions
    • Keep scenarios focused and modular

Performance Optimization

  • Start with basic scenarios
  • Gradually add complexity
  • Monitor agent response times
  • Analyze success patterns
  • Iterate based on results