Two weeks ago, Mojo Motors conducted usability tests on a new prototype website. Usability testing is when people (participants) use a product to help its creators determine what works and what doesn’t. It lets the creators (testers) watch people who are unfamiliar with a website or product basically mess around and complete a series of tasks or scenarios. These tests shed light on how someone in the real world will likely use the product. They also let the testers catch big mistakes that could “make or break” the website.
We wanted to find out if people understood that the Mojo Motors prototype can help car shoppers ‘Follow’ cars to track changes in price. We took participants through the entire process of shopping for a car, from signing up on our website to following cars to receiving price alert emails and finally contacting a dealership.
Keep reading to see how we conducted our usability tests and the awful stock pictures with little-to-no relevance to the subject matter. You can also click on one of the links after the jump to quickly find relevant information because this post is long.
Prep like a mofo
The tests were conducted by me, Max Katsarelas, and Ola Oladunni, Head of User Experience (neither of us is pictured, nor do we study with books on top of our heads). We ran tests for two days from 9AM to 6PM with some breaks in between. I read Steve Krug’s Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems before testing. It was by far one of the best resources I used when learning how to conduct a usability test and I definitely recommend it to anyone running one.
I also read the SVPG Blog written by Marty Cagan for advice on user testing. If you’re conducting more formal tests, I’d recommend Usability Testing Essentials by Carol M. Barnum. It’s pretty much a college textbook and reads like one too, but it’s the go-to if your company is looking to do very formalized testing. We considered using a company like uTest to help conduct the tests, but we wanted something super cheap and super fast.
Since we planned everything at the last minute, with only a week to get everything together and a $500 budget, our testing was informal. There was no double-sided glass. There was no minion jotting down notes, that is, if you don’t count me. It was simply two men in a room with a MacBook Pro, some refreshments, a consent/non-disclosure agreement (we combined the two – download it here) and a test participant.
The stuff you’ll need
Our tests were conducted at a day office we rented for $100 a day near Grand Central, which made it accessible to participants who work or live near New York City. The tests lasted about 30-45 minutes and each participant was given a $25 Amazon gift card. A gift for participating is a nice touch and pretty much required. We used Amazon cards because they are my fave. We had some soda and water, but none of the participants were thirsty. They probably expected a life-size butler statue to hold the refreshments. Should have known.
You’ll want a laptop with a built-in camera and an external mouse because some people might not be familiar with a trackpad. We installed a program called ScreenFlow to record the tests; it simultaneously records a participant’s desktop and their facial expressions. We had only planned on using the free demo, but to export videos without a watermark we gave in and paid $99. Having a notebook and pen handy also means you can jot down notes. Just try not to be distracting while you scribble away like that kid wearing skinny jeans at Starbucks.
Screen the participants
To find the participants, we ran an ad on Craigslist in “et cetera jobs.”
A screenshot of the ad can be seen below.
Craigslist gets a bad rap, but we got a ton of responses and lined up over 10 qualified participants that matched our criteria in only two days. Our hope was that 5 to 8 participants would show up, so it’s always good to book more participants than you might need. To get the right participants you’ll also want to screen each one.
Based on how the participants respond to your emails, you’ll know who will be the best fit. I called each participant and asked them questions about their favorite websites, what they do for a living and what kind of computer/phone they use. The questions are important, but it’s more important to find out how the person speaks.
Do they sound enthusiastic? Do you think they’ll be able to adequately talk aloud during the user tests? Being able to think out loud is imperative for successful user testing. You want participants who are able to articulate their opinions and back them up when prodded with questions like, “Why?” or coaxed to explain more. I organized all the scheduling and participant profiles in a Google document so Ola and other Mojo team members could see the progress.
The script
Ola and I both used a script before conducting the tests. In his book, Krug points out that a scripted introduction gives you, the tester, a chance to describe how the test will work and to reinforce that the participant is not being tested. The more comfortable the participant, the better feedback you’re likely to receive. During the testing, remember not to lead a participant on. Even though we wanted a participant to sign up or click the ‘Follow’ button, we tried not to provoke them to click.
We designed the prototype to take participants through four different scenarios, but before that, we asked the participant to show us how they search for a used car to buy online. The first scenario dropped a participant on a page from organic search. The second scenario dropped participants on the home page based on a word-of-mouth recommendation. The third and fourth scenarios took participants through the process of following cars, tracking price changes, getting price drop alerts and other member-only features of the website. Each scenario had its own set of questions and the flows we hoped a participant would take.
If participants ask a question about a specific feature or button, respond with a question like, “Is that what you think that does?” or “What do you think that might do?” You know how you’re not supposed to answer a question with a question? Forget that crap, this is a usability test. The goal, aside from trying to find out if participants understood what ‘Follow’ did, was to find out their impressions of each page and feature. It’s essential to make participants think through problems or their questions out loud.
Since we conducted the usability test over the course of two days, we were able to make changes to the prototype, the scenarios and the script between days. We made changes to copy on the website and moved the ‘Follow’ and ‘Share’ buttons. We found the changes caused radically different impressions, which will be integral to the final product’s design.
At the conclusion of the test, ask the participant for any feedback they might have. The participants in our usability tests were curious about Mojo Motors and asked whether our company was real or when the website was going to launch.
What to do after testing
You’re not done yet because – pun alert! – there’s still meat left on the bone. I watched each video again and noted the major takeaways, problems or observations the participants expressed. I then put all of that into a document with a summary of how the tests were conducted and profiles of each participant. I also noted the changes between the first and second day, which copy tested better and other points of feedback. This document allowed anyone on the team who wasn’t present for the testing to get a recap without having to sit through a day’s worth of testing footage. It can also be referenced when we conduct future usability tests.
Here was the outline I used:
- Summary of each scenario
- Differences between the prototype on the first and second day of testing
- List of major takeaways
- Profile of each participant and a summary of their test
In conclusion, if you read through this entire post, thank you. You deserve some sort of refreshment or gift…seriously. My first and only experience conducting a usability test was a massive success. Our participant show-up rate was much higher than the norm; according to Krug and Ola, only about 50% of participants actually show. I found it was much easier to get participants to share their opinions and talk through problems or questions out loud than I originally anticipated. I must be one heck of a pre-screener. Anyway, hopefully if you conduct a usability test you’ll experience the same kind of success, and if you have any questions, just drop me a line. I’d love to help any way I can.