Create a Simulation
Understanding Scenarios
A scenario in Agent Simulate is a structured test case for your AI agent. The scenario name and description play a crucial role:
- Scenario Name: Acts as a clear identifier that helps the Autoblocks agent understand the type of test being performed (e.g., “Appointment Confirmation”)
- Scenario Description: Provides specific context and goals for the test (e.g., “You should test calling a receptionist to confirm your appointment”)
Scenario Configuration Steps
Creating a scenario involves four main configuration steps (a sketch of how the pieces fit together follows the list):
- Personas: Define who will interact with your agent
- Data Fields: Specify the information to be collected or verified
- Edge Cases: Set up challenging scenarios
- Summary: Review and launch the simulation
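Although scenarios are configured through the Agent Simulate UI, it can help to picture one as a single structured object. The sketch below is a minimal, hypothetical Python representation (the class and field names are illustrative assumptions, not the Autoblocks API) showing how the name, description, personas, data fields, and edge cases fit together.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Scenario:
    """Hypothetical scenario model; field names are illustrative only."""
    name: str                                        # clear identifier, e.g. "Appointment Confirmation"
    description: str                                 # context and goal for the simulated caller
    personas: List[str] = field(default_factory=list)
    data_fields: List[str] = field(default_factory=list)
    edge_cases: List[str] = field(default_factory=list)

scenario = Scenario(
    name="Appointment Confirmation",
    description="You should test calling a receptionist to confirm your appointment",
    personas=["Elderly Caller", "Impatient Caller"],
    data_fields=["appointment_date", "patient_name"],
    edge_cases=["Hangs up abruptly"],
)
```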
Setting up Personas
Add and configure personas to test different caller types (a minimal persona sketch follows the list). Pre-built personas include:
- Elderly Caller
  - Speaks slowly
  - May have difficulty hearing
  - Occasionally repeats themselves
- Impatient Caller
  - In a hurry
  - Frequently interrupts
  - Expresses urgency throughout the call
- Distracted Caller
  - Has significant background noise
  - Seems to be multitasking during the call
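To make the trait lists concrete, here is a minimal sketch of how a persona might be modeled. The class shape and trait strings are assumptions for illustration, not a prescribed format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Persona:
    """Illustrative persona: a label plus the behaviors the simulated caller exhibits."""
    name: str
    traits: List[str]

PREBUILT_PERSONAS = [
    Persona("Elderly Caller", ["speaks slowly", "may have difficulty hearing", "occasionally repeats themselves"]),
    Persona("Impatient Caller", ["in a hurry", "frequently interrupts", "expresses urgency throughout the call"]),
    Persona("Distracted Caller", ["significant background noise", "seems to be multitasking during the call"]),
]
```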
Configuring Data Fields
Data fields define the information your agent needs to collect or verify (a minimal field sketch follows the list):
- Field Definition
  - Give each field a clear name
  - Add a description of what information to collect
  - Specify the expected format
- Variant Types
  - Add different ways people might express the same information
  - Example for dates: “March 15, 2023”, “3/15/23”, “next Wednesday”
  - Each variant tests your agent’s ability to understand different formats
- Field Properties
  - Required vs. optional fields
  - Validation rules
  - Dependencies between fields
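As a concrete example, the sketch below models a date field with variants, a required flag, and a simple validation rule. The structure is a hypothetical illustration of how these properties relate, not a real Agent Simulate schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable, List, Optional

@dataclass
class DataField:
    """Illustrative field definition: what to collect, how callers may phrase it, and how to validate it."""
    name: str
    description: str
    expected_format: str
    variants: List[str] = field(default_factory=list)    # alternate phrasings to test
    required: bool = True
    validate: Optional[Callable[[str], bool]] = None      # optional validation rule
    depends_on: Optional[str] = None                       # name of another field this one depends on

def is_iso_date(value: str) -> bool:
    """Validation rule: the collected value normalizes to YYYY-MM-DD."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

appointment_date = DataField(
    name="appointment_date",
    description="The date of the appointment the caller wants to confirm",
    expected_format="YYYY-MM-DD",
    variants=["March 15, 2023", "3/15/23", "next Wednesday"],
    required=True,
    validate=is_iso_date,
)
```

Note that the variants deliberately include phrasings that do not match the expected format, since the point is to test whether the agent can understand and normalize them.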
Setting up Edge Cases
Edge cases help test how your agent handles challenging situations (an illustrative sketch follows this section). Common edge cases:
- Communication Challenges
  - Frequently interrupts
  - Provides information out of order
  - Has difficulty hearing the agent
  - Speaks very quietly or loudly
  - Has a strong accent
- Behavioral Scenarios
  - Hangs up abruptly
  - Is confused about details
  - Asks to repeat information
  - Has background noise
  - Gives inconsistent information
You can also define custom edge cases:
- Create specific test scenarios
- Define unique behavioral patterns
- Add custom validation rules
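The sketch below shows one hypothetical way to represent both a pre-built and a custom edge case, pairing a behavioral pattern with optional validation rules; the class and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EdgeCase:
    """Illustrative edge case: a behavior pattern plus optional checks to run against the call."""
    name: str
    behaviors: List[str] = field(default_factory=list)
    validation_rules: List[str] = field(default_factory=list)

# Pre-built style edge case, mirroring the communication challenges above.
interrupting_caller = EdgeCase(
    name="Frequently interrupts",
    behaviors=["cuts the agent off mid-sentence", "talks over confirmations"],
)

# Custom edge case with its own behavioral pattern and validation rule.
inconsistent_dates = EdgeCase(
    name="Gives inconsistent information",
    behaviors=["states one appointment date, then corrects it later in the call"],
    validation_rules=["the agent confirms the corrected date, not the original one"],
)
```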
Running the Simulation
Once configured, you can:
- Review all settings in the Summary view
- Save the scenario for future use
- Run the simulation immediately
- Track progress in real time
- Access detailed results and analytics
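Simulations are reviewed and launched from the Summary view. If it helps to picture the lifecycle programmatically, the loop below is a purely hypothetical sketch: the `client` object, its method names, and the status strings are assumptions used only to illustrate the save, run, poll, and fetch-results flow, not a real SDK.

```python
import time

def run_simulation(client, scenario) -> dict:
    """Hypothetical lifecycle: save the scenario, launch a run, poll until it finishes."""
    scenario_id = client.save_scenario(scenario)   # save the scenario for future use
    run_id = client.start_run(scenario_id)         # run the simulation immediately
    while True:
        status = client.get_run_status(run_id)     # track progress in real time
        if status in ("completed", "failed"):
            break
        time.sleep(5)
    return client.get_run_results(run_id)          # detailed results and analytics
```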
Version Control
- Track scenario changes over time
- Easily go back to previous versions
Best Practices
Scenario Design
- Clear Objectives
  - Write specific scenario descriptions
  - Define measurable success criteria
  - Focus on one main test goal
- Comprehensive Testing
  - Include both common and edge cases
  - Test various persona combinations
  - Cover different data input formats
- Maintainable Structure
  - Use consistent naming conventions
  - Document special test conditions
  - Keep scenarios focused and modular
Performance Optimization
- Start with basic scenarios
- Gradually add complexity
- Monitor agent response times
- Analyze success patterns
- Iterate based on results