Definition
A sprint review is a time-boxed meeting held at the end of each sprint where the development team demonstrates completed work to stakeholders. It is not a formal presentation -- it is a working session where the team shows what was actually built, stakeholders interact with the product, and the group discusses what to prioritize next.
The Scrum Guide caps the review at four hours for a one-month sprint, with shorter reviews for shorter sprints (commonly about an hour for a one-week sprint). In practice, most teams run reviews in 30-60 minutes. The format is straightforward: the PM sets context on what was planned, the team demos each completed item, stakeholders ask questions and provide feedback, and the PM captures any backlog adjustments.
Why It Matters for Product Managers
Sprint reviews are the PM's primary feedback loop with stakeholders. Without them, stakeholders see the product only at major milestones -- which means surprises, misalignment, and the dreaded "that is not what I expected" at launch. Teams at Spotify and Shopify run reviews consistently because they surface course corrections in days rather than months.
For PMs specifically, reviews serve three functions. First, they create accountability -- if you committed to delivering three stories and only finished one, that is visible. Second, they generate real-time feedback that shapes the next sprint's priorities. Third, they build stakeholder trust by showing consistent progress. A PM who runs good sprint reviews rarely gets asked "what is the team working on?"
The review is also where PMs manage scope expectations. When a stakeholder sees a demo and says "can we also add X?", the PM can immediately frame the trade-off: "We can add X, but it means pushing Y to the next sprint. Which matters more?"
How It Works in Practice
Set context (5 min) -- The PM opens by recapping the sprint goal, what was planned, and any scope changes that happened mid-sprint.
Demo completed work (20-30 min) -- Engineers or designers demo each completed story in the actual product (not slides). Stakeholders interact with the feature directly when possible. Only demo items that meet the Definition of Done -- no half-finished work.
Collect feedback (10-15 min) -- Stakeholders share reactions, concerns, and ideas. The PM captures these but does not commit to anything on the spot. "Good input, I will evaluate that against our current priorities" is the right response.
Review metrics (5 min) -- If relevant, share any data from features shipped in previous sprints. Did the checkout redesign actually improve conversion? This closes the feedback loop and grounds future decisions in evidence.
Preview next sprint (5 min) -- The PM shares a rough outline of what is planned for the next sprint, inviting input that can shape sprint planning.
Common Pitfalls
Turning it into a slide deck presentation. The review should be a live demo, not a PowerPoint. Stakeholders need to see the real product. Slides hide bugs, rough edges, and incomplete thinking.
Demoing incomplete work. Showing half-finished features blurs the line between done and in progress and invites feedback on behavior that may still change, which wastes everyone's time. Only demo what meets the Definition of Done.
Conflating the review with the retrospective. Reviews are about the product (what we built, what to build next). Retros are about the process (how we work together). Mixing them dilutes both conversations.
Skipping the review when "nothing exciting shipped." Bug fixes, performance improvements, and technical debt reduction are worth reviewing. Stakeholders should see the full picture of where engineering effort goes, not just new features.
Related Concepts
Sprint is the iteration that a review concludes -- every sprint ends with a review.
Retrospective typically follows the sprint review and focuses on team process rather than the product itself.
Sprint Planning kicks off the next sprint, often informed by feedback collected during the review.