Summary state for the requirements analysis phase.
Contains success status, timing, and event aggregation for the analyze agent's activities. Null if this phase wasn't executed in the session.
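The exact shape of a phase summary isn't specified here; a minimal TypeScript sketch, assuming hypothetical field names (`success`, `elapsed`, `eventCount`), might look like the following. The same nullable shape would apply to the interface, prisma, realize, and test phase summaries described below:

```typescript
// Hypothetical shape of a per-phase summary; actual field names may differ.
interface PhaseSummary {
  success: boolean;   // whether the phase completed successfully
  elapsed: number;    // elapsed time for this phase, in milliseconds
  eventCount: number; // number of aggregated agent events
}

// Each phase slot is nullable: null means the phase never ran.
type MaybePhaseSummary = PhaseSummary | null;
```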
Total elapsed time, in milliseconds, for the entire vibe coding session.
Measures the complete duration from session start to final completion, providing insight into overall development velocity and helping identify optimization opportunities in the automated pipeline.
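As a worked example of the millisecond convention (the timestamps here are hypothetical; the summary itself presumably stores only the final count):

```typescript
// Assumed ISO-8601 start/completion timestamps for illustration only.
const startedAt = new Date("2024-01-01T10:00:00Z");
const completedAt = new Date("2024-01-01T10:42:30Z");
const elapsed: number = completedAt.getTime() - startedAt.getTime(); // 2_550_000 ms
console.log(`${(elapsed / 60_000).toFixed(1)} minutes`); // "42.5 minutes"
```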
Summary state for the API interface design phase.
Captures the interface agent's performance metrics, including API design success, elapsed time, and operation counts. Null if this phase wasn't executed.
Development phase to replay or analyze.
Specifies which phase of the waterfall development process to focus on when retrieving or displaying replay data. Each phase represents a distinct stage in the automated backend development pipeline.
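Judging from the phases documented in this section, the selector is presumably a string union along these lines (the literal values are assumptions inferred from the agent names, not confirmed by the schema):

```typescript
// Assumed union of phase identifiers, inferred from the phases
// documented in this section; the actual literals may differ.
type ReplayPhase = "analyze" | "interface" | "prisma" | "realize" | "test";
```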
Summary state for the database schema design phase.
Provides metrics on the prisma agent's performance, including schema generation success, timing, and iteration counts. Null if this phase wasn't reached.
Project identifier for the replay session.
Unique identifier or name that distinguishes this project's replay data from other sessions in the replay repository.
Summary state for the implementation phase.
Summarizes the realize agent's code generation performance, including implementation success, total time, and component counts. Null if this phase wasn't executed.
Summary state for the test generation phase.
Documents the test agent's efficiency in generating e2e tests, including success rate, generation time, and scenario counts. Null if this phase wasn't reached.
Aggregated token usage across all AI interactions.
Comprehensive token consumption metrics, including prompt tokens, completion tokens, and total usage across all agents and phases. This data helps quantify resource utilization and optimize AI model usage for cost-effective development.
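A hedged sketch of what this aggregation might look like, using the conventional prompt/completion/total naming (assumed, not confirmed by the schema):

```typescript
// Hypothetical token usage record; names mirror common LLM usage
// reporting but are not guaranteed to match the actual schema.
interface TokenUsage {
  promptTokens: number;     // tokens sent to the model
  completionTokens: number; // tokens generated by the model
  totalTokens: number;      // promptTokens + completionTokens
}

// Aggregation across agents and phases: sum each counter.
function aggregate(usages: TokenUsage[]): TokenUsage {
  return usages.reduce(
    (acc, u) => ({
      promptTokens: acc.promptTokens + u.promptTokens,
      completionTokens: acc.completionTokens + u.completionTokens,
      totalTokens: acc.totalTokens + u.totalTokens,
    }),
    { promptTokens: 0, completionTokens: 0, totalTokens: 0 },
  );
}
```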
AI vendor identifier for the replay session.
Specifies which AI service provider was used in the original vibe coding session, enabling vendor-specific replay analysis and comparison.
Summary statistics and metrics for a vibe coding replay session.
Provides aggregated performance metrics and completion status for each development phase, enabling quick assessment of session efficiency, resource consumption, and overall success. This summary data is valuable for benchmarking and identifying improvements in the vibe coding pipeline.
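Putting the documented fields together, one hypothetical composition of the whole summary (reusing the `PhaseSummary`, `ReplayPhase`, and `TokenUsage` sketches above; all names are illustrative, not the authoritative schema):

```typescript
// Illustrative composition of the replay summary described above.
interface ReplaySummary {
  project: string;                // project identifier for the replay session
  vendor: string;                 // AI vendor identifier
  phase: ReplayPhase;             // development phase to replay or analyze
  elapsed: number;                // total session time, in milliseconds
  tokenUsage: TokenUsage;         // aggregated token consumption
  analyze: PhaseSummary | null;   // requirements analysis
  interface: PhaseSummary | null; // API interface design
  prisma: PhaseSummary | null;    // database schema design
  realize: PhaseSummary | null;   // implementation
  test: PhaseSummary | null;      // test generation
}
```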