What are the pros?
- In-depth feedback from real people
- See what roadblocks visitors face
- Serves a diverse set of goals
- It's a scalable process
What are the cons?
- Difficult to share results
- Costly, at about $50 per tester
- Time-consuming process
- Opinion of one tester
What data will be analyzed?
Videos of users
Each user test consists of one or more tasks. These are assignments given to the test subjects, phrased and set up in a way that allows the researchers to learn about the website. While the users complete each task, their on-screen actions, vocal comments and sometimes their face (through a webcam) are recorded for later reference. Tasks can range from the simple ('Surf to the page for product X') to the complex ('Order two matching items of clothing and complete the checkout').
Annotations
In order to structure the findings that surface during a user test, annotations can be very helpful. These written statements (often comments made by test subjects) can later serve as guidelines for making changes to the website or as A/B testing ideas. Some testing platforms, such as UserTesting.com, allow you to link annotations to specific points in time of a user testing video. For platforms that don't offer such linking, a plain text editor or shared document (using a service like Google Drive) will also do the trick.
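If you track annotations yourself in a plain document rather than in a platform, even a tiny script can keep them ordered by recording and timestamp for review. A minimal sketch; the structure and field names here are illustrative, not part of any testing platform's format:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    video_id: str  # which user testing recording the note belongs to
    seconds: int   # timestamp within that recording
    note: str      # the observation or test-subject quote

def timeline(annotations):
    """Sort annotations by recording and timestamp for easy review."""
    return sorted(annotations, key=lambda a: (a.video_id, a.seconds))

# Hypothetical notes from one session
notes = [
    Annotation("user-03", 245, "Couldn't find the checkout button"),
    Annotation("user-03", 90, "Hesitated on the size chart"),
]
for a in timeline(notes):
    print(f"{a.video_id} @ {a.seconds}s: {a.note}")
```

Sorting by timestamp makes it easy to jump back to the matching moment in the video, which is roughly what platforms with built-in annotation linking do for you.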
Video clips
Video clips are another useful feature that some user testing platforms offer. Such a clip contains a segment of the recording that shows a user performing (or failing to perform) a task. Clips can be saved for later reference, or emailed to other members of your team.
Questions after testing
User testing researchers also have the option to ask several questions of users after they have completed their tasks. By asking the test subject specific questions about their experience on the website, you can learn how you can improve the website further.
Questions in moderated sessions
If a moderated setting is used for the test, asking test subjects questions becomes easier and more interactive. For instance, if you see a user struggling with a particular task, you can ask them what they expected to happen once they clicked a given link.
Example questions for unmoderated sessions
- What frustrated you most about this site?
- If you had a magic wand, how would you improve this site?
- What did you like about the site?
How effective will this be?
Complexity of the tasks
The effectiveness of a given user testing session depends in part on the complexity of the tasks. For instance, if a task involves completing a website's checkout, some visitors will perceive it as complex. Task complexity also increases significantly when a single task involves multiple steps.
Lastly, if a task requires a given mindset (for instance, that of someone getting a mortgage online), it is wise to either select test subjects from the specific target audience, or ask users to empathize with that mindset as best they can.
Another aspect that can make or break a user test is the researcher. Their experience is a vital element in determining which tasks to assign and which questions to ask. Experience is also needed to decide whether to run the user test moderated or unmoderated, and remotely or in-house.
Getting valuable findings
Furthermore, even after the testing is completed, it is up to the researcher to distil the most valuable findings from the testing sessions.
The final aspect that will be discussed is the test subjects. Whether or not test subjects belong to the target audience can make a great difference in the effectiveness of a user test. Just imagine a geeky male programmer trying out a fashion design tool targeted at elderly women.
Having proper equipment (such as a good microphone) is vital when analyzing user tests. A computer that is up to the task and an up-to-date browser matter as well.
Level of experience
Test subjects who are more experienced with user testing often deliver better results. Because these users are trained to voice their thoughts and explain their thinking process, watching them complete the tasks often yields better findings.
Which findings will it deliver?
✓ Learn what roadblocks visitors face on your website
Once you see testers struggling, things you thought were crystal clear turn out to be "totally impossible" for them. Although each finding reflects the opinion of just one tester, some feedback can be applied immediately, while other findings serve as great A/B testing ideas.
✓ Get suggestions on how to improve the website
Test subjects often voice suggestions during the recorded sessions, and the task-plus-questions model also allows you to ask explicitly how the website could be improved. While these suggestions aren't always the most useful (they are often based on personal preference), they can sometimes provide valuable input.
Should the user test be moderated or unmoderated?
In this distinction, moderated user tests are those that are actively monitored by a researcher. Benefits of moderated user tests include, for instance, that they allow users to ask questions and enable the researcher to clarify a task that wasn't clear to a user.
Unmoderated user tests, on the other hand, do not have such a moderator present. They are built on the premise that the tasks and questions are clear, and that users should be able to deal with them on their own, just like normal website visitors would. Benefits of an unmoderated user test include lower costs and a fairer test (after all, there is no interference from the researcher).
Will I get better results from remote or in-house tests?
In-house tests allow you to see a test subject in person, and possibly even record their facial expressions using a webcam. They also make it easier to moderate a session, if a moderated session is desired. In addition, in-house user tests level out the software and hardware that testers are using.
However, remote tests allow users to test the website in their own environment, rather than facing a new environment, which could skew the results. It would also allow users to take a test whenever they feel like it, without being disturbed.
Can't I just watch a friend surf the website?
That would be a nice addition to user tests that are set up by professionals. Unfortunately, this tactic isn't scalable: after all, how many friends and colleagues does one person have?
Additionally, a friend or even a colleague probably already knows about the company and what it does, so the researcher cannot learn about a first impression of the website. Perhaps he or she has even visited the website before. All this should be taken into account when drawing conclusions from such an ad-hoc user testing session.