I was preparing for a lunch-and-learn session on sprint reviews when something clicked. We spend so much time designing software (thinking about user experience, optimizing flows, reducing friction), but we rarely apply that same intentionality to the sprint review itself.
Yet sprint reviews involve some of the most valuable people in the organization. Stakeholders, product owners, business leaders. If we’re going to ask for their time, shouldn’t we design that experience as carefully as we design our software?
The Sprint Review Is an Experience
Here’s what I see happen too often: teams treat the sprint review as a checklist. Walk through every user story. Click through the test environment. Hope nothing breaks. Call it done.
But that’s not designing an experience. That’s just showing up.
When I talk about designing the sprint review, I mean thinking through:
- What matters most to the people in the room?
- What will help them make better decisions?
- What will waste their time or create confusion?
Just like we design software with the user in mind, we need to design sprint reviews with stakeholders in mind.
Focus on the Big Rocks
Not every user story deserves equal airtime. Some features move the needle. Others are necessary but minor.
I learned this the hard way at a client where we’d spend 45 minutes walking through every single story, including things like “Fixed a typo in the confirmation email” or “Updated the color of the submit button.”
Stakeholders would zone out. They’d check their phones. By the time we got to the important stuff, we’d lost them.
I think in terms of “big rocks.” What are the two or three features that actually matter to this audience? Those get the conversation. Everything else gets a quick summary or a screenshot on a slide.
The goal isn’t to prove we worked hard. It’s to demonstrate value and gather feedback on what matters.
Use Safe Demos
Live demos are risky. Test environments are slow. Databases don’t always have the right data. Networks hiccup. Someone forgot to deploy the latest build.
Here’s what I do instead: record videos mid-sprint. As soon as a feature is working, grab a screen recording. Or take screenshots. Then use those in the sprint review when applicable, saving live demos and test drives for the features that need stakeholders’ feedback the most.
Some people push back on this. “But it’s not live! What if stakeholders think we’re faking it?”
I remind them that we’re not optimizing to prove the code works. We’re optimizing for conversation. The goal is to show stakeholders what we built so they can give us feedback, ask questions, and help us make better decisions for the next sprint.
A pre-recorded video keeps the focus where it belongs.
A Technique I Love: Record as You Build
One of my favorite techniques is working with developers mid-sprint. When they finish implementing a feature, I sit down with them, and we record a quick walkthrough together.
The developer shows me how it works. I ask questions. We capture that conversation on video.
Then, during the sprint review, we play that video. It’s authentic. It shows the feature in action. And it frees up the live conversation for stakeholder questions and feedback, not for clicking buttons and waiting for pages to load.
This also helps developers practice talking about their work in terms of value, not just functionality. Instead of “I added a new API endpoint,” they learn to say, “Now buyers can see real-time inventory from their mobile devices, so they don’t have to call the warehouse.”
Design for the Conversation
At the end of the day, the sprint review is not a demo. It’s a conversation.
Everything we do (choosing which features to highlight, using relevant data, pre-recording videos) is in service of that conversation.
Because that’s where the real value happens. Stakeholders share what surprised them. They ask about risks we hadn’t considered. They connect what we built to market conditions or customer feedback we didn’t have.
That feedback feeds back into the product backlog. It shapes what we build next. It makes the retrospective more meaningful because we can reflect on how well we facilitated that conversation.
If we treat the sprint review like a checklist, we miss all of that.
But if we design it intentionally, we create space for collaboration, learning, and better decisions.
What would change if you designed your next sprint review the way you design your software?