Team Benchmarks
Cross-team comparison for learning, not ranking
- Velocity (story points)
- Cycle Time (days, lower is better)
- PR Review Time (hours, lower is better)
- Deployments / Week
- DevEx Score
All Teams: Benchmark Data
| Team | Velocity | Cycle Time | PR Review | Deploys/wk | DevEx |
|---|---|---|---|---|---|
| Platform | ★ 62 pts | 3.2 d | 6 h | 4 | 78 |
| Mobile | 45 pts | 4.8 d | 12 h | 2 | 65 |
| Data | 38 pts | ★ 2.1 d | ★ 4 h | ★ 6 | 72 |
| Frontend | 54 pts | 2.8 d | 5 h | 5 | ★ 81 |
★ = best in class. Lower is better for Cycle Time and PR Review.
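The best-in-class markers above can be derived mechanically from the table, as long as each metric's direction is known. A minimal sketch; the dictionary keys and the `LOWER_IS_BETTER` set are assumptions for illustration, not a real schema:

```python
# Benchmark rows transcribed from the table above.
teams = {
    "Platform": {"velocity": 62, "cycle_time": 3.2, "pr_review": 6, "deploys": 4, "devex": 78},
    "Mobile":   {"velocity": 45, "cycle_time": 4.8, "pr_review": 12, "deploys": 2, "devex": 65},
    "Data":     {"velocity": 38, "cycle_time": 2.1, "pr_review": 4, "deploys": 6, "devex": 72},
    "Frontend": {"velocity": 54, "cycle_time": 2.8, "pr_review": 5, "deploys": 5, "devex": 81},
}

# Direction per metric: lower is better for Cycle Time and PR Review.
LOWER_IS_BETTER = {"cycle_time", "pr_review"}

def best_in_class(teams):
    """Return {metric: team} naming the best team for each metric."""
    metrics = next(iter(teams.values())).keys()
    best = {}
    for m in metrics:
        pick = min if m in LOWER_IS_BETTER else max
        best[m] = pick(teams, key=lambda t: teams[t][m])
    return best

print(best_in_class(teams))
# Matches the table: Platform leads velocity, Data leads cycle time,
# PR review, and deploys, Frontend leads DevEx.
```

Making the direction explicit in code avoids the classic dashboard mistake of crowning the team with the *highest* cycle time.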
Team Benchmarks provide performance comparison data across teams. The purpose is learning, not competition: a team with lower velocity but higher code quality and fewer incidents may be healthier than a team shipping fast with high failure rates. Use benchmarks to start conversations about what's working and to identify where teams can learn from each other.
Share one benchmark comparison per retro. Ask: 'The Data team deploys 6x/week while Mobile deploys 2x; what can we learn from their process?' Avoid framing it as 'Mobile needs to catch up.'
Present benchmarks alongside context. High velocity combined with high cycle time may mean large batches, which is not necessarily a problem but is worth understanding.
Use benchmarks to identify mentoring opportunities. A team excelling in PR review time can share their approach with teams where reviews are a bottleneck.
Relevant endpoints: `team_metrics` and `company_metrics`.
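One way the two endpoints might be combined is to compare each team's record from `team_metrics` against a company-wide baseline of the kind `company_metrics` could return. A hedged sketch that computes that baseline locally; the record shape and field names are assumptions, since the actual API contract isn't specified here:

```python
def company_baseline(team_records):
    """Average each numeric metric across teams (assumed record schema:
    one dict per team, a 'team' name field plus numeric metrics)."""
    metrics = [k for k in team_records[0] if k != "team"]
    n = len(team_records)
    return {m: round(sum(r[m] for r in team_records) / n, 2) for m in metrics}

# Sample payload using the deploys/DevEx columns from the table above.
records = [
    {"team": "Platform", "deploys": 4, "devex": 78},
    {"team": "Mobile",   "deploys": 2, "devex": 65},
    {"team": "Data",     "deploys": 6, "devex": 72},
    {"team": "Frontend", "deploys": 5, "devex": 81},
]

baseline = company_baseline(records)
print(baseline)  # deploys average 4.25, devex average 74.0
```

Presenting each team relative to this baseline, rather than ranked against each other, supports the learning-not-ranking framing above.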