The Startup Guide to Prototype User Testing 
Guest Blog on Mindtribe | March 2018

Illustration by Sucharita Jyothula

When we first started user testing at Moxxly — the startup I co-founded three years ago — we were so excited to get prototypes to our users that we would often (literally) run out the door before each test. While this enthusiasm will take you far, it can also lead to inconsistent results and a mix of data and user quotes that may not drive the design process forward.
Over time, I have found ways to combine the qualitative research approach I studied and taught at Stanford with the quantitative research skills that all the engineering and business folks on my team (I’m the only designer) seemed to value so much.
After speaking with design researchers at larger startups, sending a survey out to my team for their input, and running a gap analysis on our user research and testing process, I redesigned the way we approach user testing. I also developed seven principles that ensure we’re making the most of every user session and capturing valuable insights in a way that is digestible, relevant, and actionable.  All on a startup budget!
So if you’re short on time, constantly in a testing cycle, need to get clear direction from user testing, or (still!) trying to convince your team of this thing called user-centered design, read on!
These seven principles will get you and your team ready to make the most of in-person user testing with prototypes. 
Project yourself to the end of your round of user testing. How will you know, exactly, if the test was a success? What will you do if it is? More importantly, what will you do if it’s not? 
Since each round of testing ends in – spoiler alert! – a presentation to my team, I start my user testing with the presentation. What type of data will I be presenting? What questions will my team have? I build out the structure of the story before I even recruit users to help me design the test. 
This helps develop a hypothesis around what the test will achieve, what the success criteria are, and next steps — regardless of outcome. 
If you test as much as we do, you’re always testing! Before our user testing process redesign, some tests would bleed into each other and the results across variables would blur. No more!  
Now we make sure that each round of testing has clearly defined phases with a discrete beginning and end. The beginning of every round of testing starts with the question you’ve built the prototype to answer and ends with a presentation to the team with the answers to that question.  
Ok, so if you quiz anyone making prototypes for testing they will claim to know this one. But they will also probably be guilty of not actually doing it every time, so it’s worth repeating: 
A prototype is a question and when you’re testing, each prototype should only ask one question at a time. 
Reducing variables in prototypes, users, and administrators of the feedback session will help you and the rest of the product team make informed decisions.
There’s been plenty of discussion of the effects of workspace on productivity. I won’t go much into it here, but I will say that it helps to have a well-organized space, including a dedicated area (even if it’s just one bookshelf like ours) for user testing props. 
We like to keep a “bug-out beta testing bag” – complete with checklist – that’s set up for last-minute, grab-and-go user testing. Our bag of tricks may be different than yours – and yours may not have boobs printed on it – but here’s what you’ll find inside:
Printed consent forms and pen
2 prototypes
Laptop and charger
Mobile hotspot
Stopwatch app (I like MultiTimer)
Backup equipment
Clean up gear
Troubleshooting items
Her thank-you gift
Even though you’ll be ready for last-minute testing, don’t run out the door before the prototype has been tested and deemed user-ready. This means each prototype needs a testing protocol (write it down!) and success criteria. You don’t want to learn about a leaking prototype for the first time in front of a user (been there) or that you’ve left without a crucial component (done that).
The last thing you want is to be questioning the data you’re getting or blaming the user for bad test results. After you’ve built out the template of your presentation and designed the test, you’ll know exactly with whom you need to test your prototype. If you can get a statistically significant sample, go for it! Anything less than that, and you’ll have to stick very close to your target end user. At a minimum, test with 5 people. When recruiting, look beyond your company and your gene pool – testing with friends, family, or colleagues will bias your results in ways that will only hurt you later. 
Pro tip: If you have a multi-day test, randomize the testing across users to prevent unintended sequencing results. 
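To make the pro tip concrete, here is a minimal sketch (in Python, not Moxxly’s actual tooling) of giving each participant an independent random ordering of test conditions. The participant and prototype names are hypothetical placeholders:

```python
import random

def randomize_schedule(participants, conditions, seed=42):
    """Give each participant an independent random ordering of test
    conditions, to reduce unintended sequencing (order) effects."""
    rng = random.Random(seed)  # fixed seed so the schedule is reproducible
    return {p: rng.sample(conditions, k=len(conditions)) for p in participants}

# Hypothetical example: two prototypes tested over a multi-day round
schedule = randomize_schedule(
    participants=["user_1", "user_2", "user_3", "user_4", "user_5"],
    conditions=["prototype_A", "prototype_B"],
)
for user, order in schedule.items():
    print(user, "->", order)
```

Full counterbalancing (e.g. a Latin square) is stricter, but for small startup samples a per-user shuffle like this is often enough to keep any one ordering from dominating your results.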
With so many steps in the process, you’ll want to automate as much as possible to save your time and brain power for the analysis.  Look at your process from beginning to end with an eye towards automation. Everything from email communications and presentations, to data collection and quantitative analysis can be templatized or automated to save you time and effort.
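As one illustrative example of that kind of templatizing (a sketch, not a prescribed tool — the template text and field names are made up), Python’s standard-library `string.Template` can turn your scheduling email into a fill-in-the-blanks form:

```python
from string import Template

# Hypothetical recruiting-email template; every $field is a placeholder.
SESSION_EMAIL = Template(
    "Hi $name,\n\n"
    "Thanks for signing up to test our latest prototype! Your session is "
    "scheduled for $date at $location and should take about $minutes minutes.\n\n"
    "See you soon,\nThe team"
)

def render_emails(sessions):
    """Fill in the template once per scheduled session."""
    return [SESSION_EMAIL.substitute(**session) for session in sessions]

emails = render_emails([
    {"name": "Alex", "date": "March 12", "location": "our office", "minutes": "45"},
])
print(emails[0])
```

The same pattern extends to consent forms, reminder emails, and even the skeleton of your results spreadsheet — anything you write more than twice per round is a candidate.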
Design (and user testing) doesn’t happen in a vacuum. After your analysis, bring your findings back to the team. Or, better yet, bring them along once in a while and close the loop on how the product they’re designing / engineering / marketing is faring out in the wild and how it affects their work (i.e. what they can do to make the next prototype even better).
Make sure when you’re giving your end-of-testing presentation that you deliver your findings in their native language. Team of designers? Use frameworks and visuals. Presenting to engineers? Get ready to dive into the numbers. Sharing with the CEO? Tables and charts! No matter who you’re talking to, present clear conclusions and next steps. Refer back to your hypothesis and the successful / unsuccessful next steps you already outlined. Lastly, close it out by asking for feedback. Your team knows what they need to understand your work and make it relevant to them, so ask.
Moxxly’s seven principles of user testing are the result of adapting the process, swapping notes with other startup designers, and years of practice and iteration with real women. It’s a living document, undergoing continuous process improvements. If you’ve brought this process back to your team and have feedback for me, I’d love to hear it.
This article was originally published on the Mindtribe Blog.