
How to Run a Design Sprint: The 5-Day Google Ventures Method

A day-by-day breakdown of the Google Ventures design sprint — when to sprint, who should be in the room, remote adaptations, and what to do with the results.

By Tim Adair • Published 2025-04-09 • Updated 2026-02-12

Jake Knapp developed the design sprint at Google in 2010 and refined it at Google Ventures through more than 150 sprints with companies like Slack, Blue Bottle Coffee, and Flatiron Health. The core idea is simple: compress months of debate, design, and prototyping into 5 days, ending with a real prototype tested by real users. If you are deciding between a sprint and a broader design thinking approach, our design thinking vs design sprint comparison breaks down when each one fits.

It sounds aggressive. It works because the time constraint forces decisions that committees would otherwise deliberate for weeks.

Quick Answer

A design sprint is a 5-day process for answering critical business questions through design, prototyping, and testing with customers. Monday: map the problem. Tuesday: sketch solutions. Wednesday: decide on the best approach. Thursday: build a realistic prototype. Friday: test it with 5 users.

Key Steps:

  • Define a focused challenge and assemble a cross-functional team of 5-7 people
  • Follow the 5-day structure — map, sketch, decide, prototype, test
  • Use the test results to decide: build, iterate, or pivot

    Time Required: 5 consecutive days (no interruptions)

    Best For: Validating new product ideas, resolving stuck decisions, and testing risky concepts before committing engineering resources


    When to Sprint (and When Not To)

    Good Reasons to Run a Sprint

  • You are about to invest a quarter of engineering time on an unvalidated concept. A sprint can validate (or invalidate) the core idea in a week instead of 3 months.
  • The team is stuck in analysis paralysis. You have been debating the same feature for 6 weeks and cannot converge on an approach. The sprint's structured decision-making process breaks the deadlock.
  • You are entering a new market or user segment. You have assumptions about what these users need but no data. A sprint generates real user feedback on a concrete prototype.
  • The stakes are high. A new product line, a major redesign, or a bet-the-company feature. The cost of being wrong justifies a week of focused work.

    Bad Reasons to Run a Sprint

  • You already know what to build. If the team has alignment on the solution and user research supports it, just build it. A sprint would be theater.
  • The problem is too small. A sprint on "should the button be blue or green?" is overkill. Use A/B testing for small decisions.
  • You cannot get the right people in the room. A sprint without a decision-maker, a design resource, and domain expertise is a waste of time.
  • You are trying to avoid making a decision. "Let's run a sprint" is sometimes code for "I do not want to decide." Sprints are for validating decisions, not avoiding them.

    The Sprint Team

    Required Roles (5-7 People)

  • Decider (product lead, founder, or GM): Makes final calls when the team disagrees. Without a decider, the sprint stalls on Wednesday.
  • Facilitator (PM, design lead, or dedicated sprint master): Keeps the team on track, manages time, enforces the process. Must be comfortable interrupting people.
  • Designer (product designer): Leads sketching exercises, builds Thursday's prototype. Core contributor throughout.
  • Engineer (tech lead or senior engineer): Provides feasibility input. Ensures Thursday's prototype is technically realistic.
  • Domain expert (customer support lead, sales rep, or data analyst): Brings customer and market knowledge the rest of the team may lack.

    Optional But Valuable

  • Marketing: Useful when the sprint involves positioning or messaging
  • Customer success: Brings frontline customer insight
  • Data scientist: Useful when the sprint question involves metrics or experimentation

    The Facilitator's Job

    The facilitator is the most important role. They are not a participant — they are the process guardian. Their responsibilities:

  • Manage time ruthlessly (use visible timers)
  • Prevent tangents and extended debates
  • Ensure quieter team members contribute
  • Make the Decider decide when the team is stuck
  • Prepare the room, materials, and schedule before Day 1

    Day-by-Day Breakdown

    Monday: Map the Problem

    Goal: Align the team on the problem, the target customer, and the specific challenge for the sprint.

    Morning:

  • Set the long-term goal (15 min): Where do you want to be in 6 months? Write it on the whiteboard.
  • List sprint questions (15 min): What are the biggest unknowns? "Can we get users to complete onboarding in under 3 minutes?" or "Will enterprise buyers trust an AI-generated report?"
  • Make a map (30 min): Draw a simple customer journey from discovery to key action. This is not a polished diagram — it is a sketch showing the major steps.

    Afternoon:

  • Ask the experts (60-90 min): Bring in subject matter experts (support staff, sales reps, engineers) for 15-minute lightning talks. Each expert shares what they know about the problem from their perspective. The team takes "How Might We" notes on sticky notes.
  • Organize HMW notes (20 min): Cluster the sticky notes by theme. Vote on the most important themes.
  • Pick a target (15 min): The Decider selects one target customer and one moment in the journey to focus on for the rest of the sprint.

    Tuesday: Sketch Solutions

    Goal: Generate a wide range of solutions individually (not in a group brainstorm).

    Morning:

  • Lightning demos (45 min): Each team member presents 1-2 examples of existing products that solve related problems well. These can be competitors, products from other industries, or internal tools. Capture key ideas on a whiteboard.
  • Four-step sketch process (varies):

    - Notes (20 min): Review the map, sprint questions, and Monday's work. Take individual notes on what resonated.

    - Ideas (20 min): Sketch rough ideas — doodles, flows, wireframes. Quantity over quality.

    - Crazy 8s (8 min): Fold a sheet of paper into 8 panels. Sketch 8 variations of your best idea, one per panel, one minute each. This forces rapid iteration.

    - Solution sketch (60-90 min): Each person creates a detailed, self-explanatory 3-panel sketch of their best solution. This is the artifact that will be evaluated on Wednesday. Sketches are anonymous.

    Important: No group brainstorming. Individual sketching produces better and more diverse ideas because participants are not anchored by the loudest voice. Research from the University of Texas found that individual ideation consistently outperforms group brainstorming on both quantity and quality of ideas.

    Wednesday: Decide

    Goal: Select the solution (or combination of solutions) to prototype and test.

    Morning:

  • Art museum (20 min): Post all solution sketches on the wall. Everyone reviews silently.
  • Heat map voting (15 min): Each person places dot stickers on parts of sketches they find compelling. No discussion yet — this is silent voting.
  • Speed critique (45 min): The facilitator walks through each sketch, highlighting the clusters of dots. The team discusses each one for 3 minutes max. The sketch creator stays silent until the end.
  • Straw poll (10 min): Each team member votes for the solution they want to prototype.
  • Decider decides (5 min): The Decider makes the final call. This might follow the team's vote or override it. That is why you have a Decider.

    Afternoon:

  • Storyboard (60-90 min): Create a step-by-step storyboard for the prototype. This is the blueprint Thursday's team will follow. It should be specific enough that someone could build the prototype without further discussion.
  • Divide and assign (15 min): Decide who builds what on Thursday. Assign user test logistics (recruiting 5 users for Friday).

    Thursday: Prototype

    Goal: Build a realistic-looking prototype that is good enough to get honest reactions from real users.

    The entire day is dedicated to building. The key principle: "Goldilocks quality." Not too polished (takes too long) and not too rough (users cannot react to an ugly wireframe). The prototype should look like a real product at first glance.

    Tools that work well:

  • Figma: For interactive UI prototypes
  • Keynote/PowerPoint: Surprisingly effective for clickable prototypes
  • HTML/CSS: For web-based prototypes (if you have a front-end developer)
  • Video: For complex interactions that are hard to simulate

    Division of labor:

  • Makers (2-3 people): Build the prototype screens
  • Writer (1 person): Creates realistic copy, labels, and content
  • Asset collector (1 person): Finds placeholder images, icons, and data
  • Interviewer (1 person): Prepares the Friday interview script and logistics

    End-of-day review: The whole team walks through the prototype to check for gaps, inconsistencies, and anything that might confuse test users.

    Friday: Test with Users

    Goal: Test the prototype with 5 real users and identify patterns in their reactions.

    Why 5 users: Jakob Nielsen's research at the Nielsen Norman Group showed that 5 usability tests uncover approximately 85% of usability problems. More users hit diminishing returns. With 5 interviews scheduled at 60 minutes each (with 30 minutes between), you can complete all testing in a single day.
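    As a quick check on that math: 5 × 60 minutes of interviews plus four 30-minute gaps comes to 7 hours, so a 9:00 a.m. start (illustrative, not prescribed) puts the final debrief around 4:00 p.m.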

    Interview structure (60 minutes each):

  • Warm-up (5 min): Build rapport. Ask about their role, their current tools, and their workflow. Do not mention your product yet.
  • Context questions (10 min): Explore their current process for the problem you are solving. What do they use today? What frustrates them?
  • Prototype walkthrough (30 min): Show the prototype. Ask them to complete specific tasks while thinking aloud. Do not explain how it works — let them figure it out. Ask: "What do you expect to happen when you click this?" before they click.
  • Debrief (10 min): What was their overall impression? What would they change? Would they use this?

    The observation room: The rest of the sprint team watches the interviews live (via video stream or two-way mirror). They take notes on a shared document, recording patterns: what confused users, what delighted them, and what they ignored.

    End of day: Review the notes as a team. Look for patterns across all 5 interviews:

  • Strong patterns (4-5 users had the same reaction): High confidence finding.
  • Moderate patterns (2-3 users): Worth investigating further.
  • Weak patterns (1 user): Could be an outlier. Do not overreact.

    Remote Design Sprints

    Remote sprints work but require more preparation. The process is identical; the tools change.

    Tool Substitutions

  • Whiteboard → Miro or FigJam
  • Sticky notes → Miro sticky notes
  • Dot voting → Miro voting feature
  • Sketching on paper → Sketch on paper, then photograph and upload
  • Physical timer → Visible shared timer (Time Timer app)
  • War room → Persistent Zoom/Meet room (cameras on)

    Remote-Specific Tips

  • Cameras on, all day: This is non-negotiable. A design sprint requires active engagement, and cameras-off is passive.
  • More breaks: Remote attention spans are shorter. Take a 10-minute break every 60-90 minutes instead of every 120 minutes.
  • Pre-sprint prep: Mail physical supplies (Post-It notes, Sharpies, paper for sketching) to participants a week before. The tactile experience of sketching on paper is important.
  • Facilitator energy matters more: In person, the room's energy carries people along. Remotely, the facilitator must bring the energy.

    Common Failure Modes

    No Decider in the Room

    The sprint stalls on Wednesday when the team cannot agree on a direction. The Decider is not optional — they must be present and willing to make tough calls.

    Overpolished Prototype

    The team spends 8 hours building pixel-perfect screens and runs out of time. The prototype needs to be realistic enough to test, not perfect enough to ship. If it takes more than 7 hours to build, it is too complex.

    Leading the User Tests

    "So we built this really cool feature that lets you... do you like it?" is not a usability test. The Friday interviewer must be neutral, let users struggle, and resist the urge to explain things.

    Ignoring Negative Results

    The sprint invalidated the idea. The team decides to build it anyway because "the users just did not understand it." This is the most expensive failure mode. Negative results are results. Respect them.

    Skipping the Follow-Through

    The sprint ends Friday. The team goes back to their regular work. The prototype sits in Figma untouched. Assign next steps before leaving on Friday: who owns the follow-up, what is the timeline, and how does this feed into the roadmap.


    After the Sprint

    Scenario 1: The Prototype Worked

    Users understood it, were excited about it, and completed the key tasks. Move to:

  • Detailed design (refine the prototype into production-ready designs)
  • Engineering scoping (how long will this take to build?)
  • Roadmap integration (when does this ship?)

    Scenario 2: Parts Worked, Parts Did Not

    The concept resonated but specific interactions confused users. Move to:

  • Identify the specific failure points from the Friday notes
  • Redesign those elements
  • Run a follow-up test (3 users) on the revised prototype
  • If it passes, proceed to detailed design

    Scenario 3: The Prototype Failed

    Users were confused, disinterested, or could not complete core tasks. Move to:

  • Document what you learned (this is valuable data, not wasted time)
  • Revisit the problem statement — was the problem real? Were you solving it for the right user?
  • Decide whether to run another sprint with a different approach or deprioritize the problem

    The Sprint Summary Document

    Within 3 days of the sprint, the facilitator should write a summary covering:

  • Sprint challenge and target
  • Solution sketches (photos)
  • Prototype screenshots
  • Key findings from user tests (organized by pattern strength)
  • Recommended next steps with owners and timelines

    Share this with stakeholders who were not in the sprint. It builds organizational support for the direction and creates a reference for future decisions.


    Key Takeaways

  • Sprint on high-stakes, uncertain problems — not on things you already know the answer to.
  • The Decider is non-negotiable — without someone who can make final calls, the sprint stalls.
  • Individual sketching beats group brainstorming — the process is designed to prevent groupthink.
  • Goldilocks prototype quality — realistic enough to test, rough enough to build in a day.
  • Five users uncover 85% of problems — you do not need a large sample to get actionable results.
  • Follow through — the sprint is worthless if the results are not acted on within 2 weeks.

    Tim Adair

    Strategic executive leader and author of all content on IdeaPlan. Background in product management, organizational development, and AI product strategy.

    Frequently Asked Questions

    How much does a design sprint cost?
    The direct cost is 5 days of salary for 5-7 people — roughly $15,000-$40,000 in loaded labor cost for a typical tech company. The indirect cost is the opportunity cost of pulling those people off their regular work for a week. The ROI comes from avoiding months of building the wrong thing. If the sprint invalidates a bad idea that would have taken a quarter to build, it just saved you $200K+.
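    For a rough sense of where that range comes from: the low end is about 5 people × 5 days × $600 per person per day in loaded cost, and the high end roughly 7 people × 5 days × $1,150 per person per day (the daily figures are illustrative, not benchmarks).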

    Can you run a design sprint in fewer than 5 days?
    Yes. AJ&Smart's 'Design Sprint 2.0' compresses the process into 4 days by combining Monday and Tuesday activities. Some teams run 3-day sprints for smaller problems. But compressing below 3 days eliminates the user testing day, which is the most valuable part. If you cut the sprint short, keep the Friday user tests.

    What happens after a design sprint?
    Three outcomes are possible: (1) The prototype validated the concept — move to detailed design and engineering. (2) The prototype partially validated — iterate on the specific elements that did not work and re-test. (3) The prototype failed — celebrate that you learned this in a week instead of a quarter, and decide whether to pivot or abandon the idea.