There are a variety of online user testing platforms that provide quick and inexpensive access to remote observational research. This mode of user testing is fairly new to me, so I recently ran a small experiment to compare online vs. in-house user testing.
How I chose an online platform
Earlier this year I set out to research how other design firms in Brisbane were conducting user testing. From conversations with design professionals at the Brisbane Product Design Group meetups, I learned that UserTesting.com and Validately.com were the current favorites.
I settled on Validately as they were able to accommodate my request to run an unmoderated trial test with 10 participants.
Full disclosure: despite the “free trial” promised on their website, I actually had to sign up for a month ($99). However, Validately credited my account with $100, which covered the cost of sourcing 10 participants from their testing panel at $10 each.
How I structured the comparative test
To compare my online testing experience against a familiar baseline, I decided to use Validately to repeat a test I had run in-house in April. The two tasks were identical in both tests:
- Task 1: crop the image to show only the girl and the balloon
- Task 2: re-crop the image to show only the balloon
In the in-house test, I asked a couple of questions as the participants were completing Task 2:
- Question 1: What do you expect to happen when you click away (before applying the crop)?
- Question 2: What did you expect to happen when you re-crop the (cropped) image?
In the online test, I changed the second question. During the in-house sessions I had noticed that most participants found it hard to understand and needed an explanation of what it meant. As I was unable to find an easy way to express it as a survey question, I dropped it and replaced it with two new questions:
- New Question 2: Were you able to perform the tasks? If yes, were they intuitive?
- New Question 3: Do you have any suggestions for making this feature easier to use? Add your comments.
Setting the test up on Validately.com
Setting the test up on Validately.com was easy, as you will see from the following three screenshots:
In my haste, however, I did not do a dry run of the test with one or two colleagues, which taught me a valuable lesson: always run a live preview to iron out any kinks in your online test before launching it.
When conducting a test in person, you can make small adjustments on the fly to compensate for any deficiencies in your test design. Online testing does not provide this luxury. In my case, the two questions for Task 2 did not display until the participant had finished the task, which caused some confusion during the test sessions. I could easily have avoided this by running a live preview before launching the test.
The second lesson highlighted how much care needs to be taken when writing survey questions. As I analyzed the screen-capture videos from all 10 participants, I was amazed that although most (8 out of 10) participants struggled to find the feature we were testing, 9 of them rated it as “intuitive” to use. What the participants reported was in direct opposition to what they experienced, which makes a strong case for including observational research (whether online or in person) in your overall user-testing strategy.
(As an aside, the discrepancy above is likely tied to the psychology of “not wanting to get it wrong”. Listening to the audio from the test recordings, you can detect the changes in tone of voice and the tell-tale laughter that accompany participant discomfort. In an in-person test there is an opportunity to acknowledge this and dig deeper; in an unmoderated test, you can only recognize it and make allowances when analyzing the results.)
Analyzing the results on Validately.com
After launching the test on Validately, I had all the participant responses in my inbox within a few hours. To me, the low cost per participant and fast turnaround times are the biggest advantages of online user testing.
Analyzing results on Validately.com is a straightforward process. You can view participant videos, annotate them with time-stamped notes, and save video segments as separate clips. I found that the interface lacked some basic functionality for navigating through a result set; however, I was told that this will be addressed in the near future. The screenshot below shows a listing of my video clips (excerpts).
Comparison of in-house and online results
In terms of actual findings, the in-house and online tests produced similar results. The main thing we learned from both is that participants had a lot of trouble finding the feature we asked them to test. This was immediately clear from the video footage and supported by the average time taken to complete the task (in-house: 2:11 mins; online: 1:58 mins). For a simple task like image cropping, these results were not acceptable, and we translated that outcome into changes to the product’s design.
The results from the second part of the task were inconclusive. In-house, all participants stated that they expected the feature to cancel on click-away, whereas the online participants gave evenly split answers. It is difficult to tell whether the online participants found the question unclear or misunderstood it, or whether the in-house results were simply not representative of a larger sample.
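To get a feel for how much noise a sample of 10 can carry, here is a minimal back-of-the-envelope sketch in plain Python. The “true” proportions it loops over are hypothetical assumptions, not measured values; the script simply computes the binomial probability of a unanimous “cancel” answer and of an even split among 10 participants.

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k 'cancel' answers out of n participants,
    assuming each independently answers 'cancel' with probability p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n = 10  # participants per session, as in both of my tests
print("true p    P(10/10 cancel)    P(5/10 cancel)")
for p in (0.50, 0.60, 0.70, 0.75, 0.80, 0.90):  # hypothetical true proportions
    print(f"{p:.2f}      {binom_pmf(10, n, p):.3f}              {binom_pmf(5, n, p):.3f}")
```

Around a true proportion of 0.75, a unanimous result and an even split each have roughly a 1-in-17 chance, so two groups of 10 can quite plausibly land on opposite extremes.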
The verdict on online testing
Based on my comparison of running the same user test in-house and again online on Validately.com, my conclusions are:
- Online user testing is an extremely efficient way, in both time and cost, to validate hunches or hypotheses arising from observational research with a smaller participant set
- The structure of an online test is less forgiving than that of an in-person test, as there is no opportunity to make micro-adjustments if things are unclear
- It is a little harder to dig into the reasons underlying participant behavior in an unmoderated online test (moderated online sessions are an option I have not yet investigated)
Overall, I see online user testing as an important part of our overall testing strategy here at Ephox.
Do you have experience running online user tests? Please share it here so we can all learn from it!