When they plan user research, most user researchers start with an objective (e.g. testing the usability of a new online transaction), and probably some more specific research questions.
The next step is to choose appropriate methods, and design research sessions to answer the questions and achieve the objective.
However, if this isn’t done carefully, the different activities in the research sessions may not map clearly to the questions. As a result, user researchers can end up scrabbling around in their data trying to work out how to answer the questions and meet their objective.
To avoid this problem, I take a very structured approach to mapping research questions to methods.
In this blog post I’ll explain how I do that, using GOV.UK Verify as an example.
Assessing the user experience of certified companies
On GOV.UK Verify, we ran a multi-stage process to assess the quality of new certified companies. One of the stages involved lab-based usability testing.
To make sure the assessments were fair and effective, we assessed each prototype against a standard set of research questions. We then mapped out exactly how we would gather the data to answer each question.
In usability testing sessions, there are four key ways to gather data:
- observation
- interview
- analytics and data logging
- questionnaires and rating scales
For our work, we concentrated on the first two of these.
Example research questions
Here are three example research questions:
| Research question | Method of assessment | Questions/Observations |
| --- | --- | --- |
| Are all users able to create valid, usable access credentials that conform to the system requirements? | Observation | |
| Do users understand how to reuse their verified identity account? | Interview and observation | |
| Do all users know whether or not their identity has been verified? | Observation and interview | |
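The mapping above can be sketched as a simple data structure. This is a hypothetical illustration only, not part of the Verify assessment tooling: the shortened question wording, the method names, and the `unanswered` helper are my own, but it shows the idea of checking that every research question has at least one valid data-gathering method before you run a session.

```python
# The four key ways to gather data in usability testing sessions.
METHODS = {"observation", "interview", "analytics", "questionnaire"}

# Each research question is mapped to the methods used to answer it.
research_plan = {
    "Are all users able to create valid, usable access credentials?":
        {"observation"},
    "Do users understand how to reuse their verified identity account?":
        {"interview", "observation"},
    "Do all users know whether or not their identity has been verified?":
        {"observation", "interview"},
}

def unanswered(plan):
    """Return questions with no valid data-gathering method assigned."""
    return [question for question, methods in plan.items()
            if not (methods and methods <= METHODS)]

print(unanswered(research_plan))  # -> []
```

Running the check before fieldwork makes gaps visible: any question that comes back from `unanswered` has no planned place in the session to record observations, ask questions or log usage data.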
Make sure you answer your questions
For GOV.UK Verify this approach has really paid off. We’re in a contractual relationship with certified companies, so we need to be clear about exactly how we assess them.
Knowing exactly how we are answering each research question means we know the right places to record observations, ask questions and log usage data. And we get the right data to assess certified companies fairly and consistently.
Featured image from GOV.UK Verify.
4 comments
Comment by Natalie S
Thank you for this insight - it's a nice concise method to organise your goals and make sure you make the most of a session and I shall definitely borrow this. I particularly like the fact it's clear and easy to follow if you are handing over / taking on research.
Comment by Lorna Wall
Great - I'm glad it is helpful for you! Let us know how you get on.
Comment by Diana Richards
You might also consider doing qualitative analysis on the customer support e-mails the certified companies receive regarding the verification process.
Comment by Paul Bishop
Some useful pointers here - thank you.