Test Assumptions, Get Validation, and Eliminate Business Risk
Validate, Early and Often
We're risk-averse: we run focused tests while building products to make sure we're providing value to our users. The fastest way to ensure we're building the right software is to prototype and test solutions with people before we build out fully fledged features.
Before building a new business, product, or feature, we ask ourselves what assumptions we're making about the product and what we want to learn from users. We then design a prototype around these assumptions to either validate or invalidate them, guided by an assumptions-test table: a table listing each hypothesis, the plan to test it, and the outcome we hope to see. This gives the team a clear indicator of whether we're moving in the right or wrong direction.
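As a purely hypothetical illustration (the product and rows below are invented, not from a real project), an assumptions-test table for a new invoicing feature might look like this:

```
| Assumption                                  | Test                                   | Expected outcome                              |
|---------------------------------------------|----------------------------------------|-----------------------------------------------|
| Freelancers struggle to track unpaid        | Interview five freelancers about their | Most describe chasing payments as a pain      |
| invoices                                    | current billing workflow               | point                                         |
| Users will sign up for automated payment    | Landing page with a "Get early access" | A meaningful share of visitors click the      |
| reminders                                   | button                                 | button                                        |
```

Each row ties a risky belief to a cheap test and a pre-agreed signal, so the team knows before running the test what result would count as validation.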
Based on the assumptions-test table, we formulate an interview script of the questions we want to ask and the actions we want our interviewees to take. The prototype will take shape based on those questions and actions.
A prototype might take the form of:
- Paper sketches
- InVision clickable mockups
- Static HTML & CSS
- A marketing page
- Facebook or Google Ads
- Phone calls
You'll notice that each of these takes little time to produce. We're looking to build the most focused part of the product or feature, and that might not involve software at all. The goal of these prototypes is to make sure that when we do build software, we build the right software.
Test the Prototype
There are several ways to test and learn from our prototype. We use both qualitative interviews and quantitative data to prove or disprove our hypotheses in our assumptions-test table. While we write our interview script, we refer back to the assumptions-test table to ensure we test our biggest assumptions.
For qualitative interviews, we bring in five to eight people and have them run through the prototype with us.
During the test, one person sits with the participant to facilitate the session. The rest of the team watches from another room so we don't overwhelm or scare the participant. We feel that it's essential for the whole team to watch the interviews as they happen, to build empathy and a shared understanding. We've found that watching the videos after the interviews doesn't have the same effect on teammates.
Alternatively, we might look at quantitative information, like open rates on emails, traffic to a particular website, or click-throughs on ads. These data points stem from our assumptions-test table; we don't gather data for its own sake. We decide what counts as success before we start collecting that information.
As a team, we sit down and review our learnings and takeaways from the tests.
- Are the problem and context validated?
- Is the user motivation validated?
- Does the user end up with their expected outcome?
Together, we determine what the next steps are for the product. This might take the form of adding new features, removing features, or starting another design sprint to dig deeper into new solutions. It's important to remember that if features or whole solutions are invalidated, we view it not as a failure, but as a positive learning experience.