Moderated usability testing is a commonly used technique in user research. It involves watching users trying to complete relevant tasks on your product or service. It will show you how well your product or service works, and help you identify specific usability issues and insights.
In principle, moderated usability testing is a straightforward process. Typically, you get some participants, give them some tasks and often ask them to think aloud while they are solving the tasks. In practice, it is much more nuanced and complicated.
If done right, usability testing can give you high-value findings that help you identify and address issues with your product or service, and show you how it might work in the real world.
But, if not done right, research participants may not interact with your product or service in a natural way. This could produce unreliable or invalid findings, leading you to miss important problems or to chase false alarms and fix the wrong thing.
This blog post will give you some practical tips on how to carry out moderated think-aloud usability testing. It’s based on what I’ve learnt from academic research and from practical experience, both inside and outside government.
Put the participants at ease
This is something to think about before the testing starts. You’ll get the best results if participants feel comfortable and at ease, so never skip this step, even under time pressure.
- Try getting to know participants before you start. You could ask them, ‘how was your journey here?’ or ‘how has your day been so far?’
- Ask an unusual question first to get people thinking. Try ‘you’ve won the lottery and are travelling the world – where would you like to go first?’
- Make sure participants aren’t worried about offending you. Say ‘we didn’t design this, so please don’t worry that what you say might upset us.’
- Make sure participants do not feel they are being tested, which can cause anxiety and change their behaviour. Say ‘it’s not a test of you – we just want to understand how the service works for you’. Avoid phrases like ‘usability testing’ or ‘test’ – say ‘study’ or ‘research’ instead
Don’t steer participants
Remember that you need to keep the usability testing as objective as possible. You need to ensure that participants are telling you what is going through their minds spontaneously, not what they think you want to hear.
- Try using neutral instructions when introducing the purpose of the study – for example, say ‘we want you to say out loud everything that is going through your mind’, rather than presenting specific areas you are interested in
- Make sure your tasks do not contain task cues or specific actions that would either lead participants to success or bias their natural way of working. It’s useful to have tasks based on actual user goals. For example, if you want to understand how useful the ‘search’ function is, avoid using tasks like, ‘you want to search for survey software’, and use ‘you need to get something to conduct a survey’ instead
- Use neutral acknowledgements to show you’re listening, like ‘uh huh’ or ‘mmm’ (but don’t overdo it)
- Avoid using words like ‘yes’ and ‘right’, as this may make participants feel you agree with what they are doing or saying
- Make sure you don’t correct or challenge what people say or do
- Conceal your thoughts about participants’ approaches – do not say things like ‘that’s surprising’ or ‘I haven’t seen anyone do that before’
Be cautious about when and how to prompt
You should spend most of your time observing and listening, and be cautious about when and how to prompt. Each time you prompt, you risk unintentionally biasing participants or interrupting their natural workflow, so it’s important to know when a prompt will get you useful information.
- If participants fall silent, try giving them 15 to 20 seconds, then ask ‘what are you thinking now?’
- Avoid using the word ‘why’ to elicit reasons. Use ‘because…’ or ‘could you explain a bit more?’
- Do not complete participants’ sentences if their reply trails off. Instead, simply repeat the keyword with a questioning intonation. This also works well as a way to elicit an explanation
- Avoid stopping participants mid-task to examine a specific feature or area they missed
I find it very useful to prompt based on behaviour and/or comment triggers, and I’ll share this in detail in a follow-up blog post.
Be flexible
It’s important to follow a process for usability testing, but you also need to be flexible and respond to the participants and situations in front of you.
- If appropriate, try personalising the session according to each participant. For example, if a participant has failed several tasks in a row, try adding in an easy one, or if a participant accidentally succeeds in a task, try repeating it a bit later by changing the task slightly
- Change task details or customise questions to make them specific to each participant. For example, if the task is about visa requirements, you could tailor it to the country the participant is from or has recently visited to make the task more relatable for them
How do you do usability testing?
I hope this helps you when you next carry out usability testing. I’ll write several other blog posts to explore specific areas in more detail.
If you have any advice from your own experience, or want to know more about usability testing, please let us know in the comments.