https://userresearch.blog.gov.uk/2016/06/08/choosing-the-best-methods-to-answer-user-research-questions/

Choosing the best methods to answer user research questions

When they plan user research, most user researchers start with an objective (eg test the usability of a new online transaction), and probably some more specific research questions.

The next step is to choose appropriate methods, and design research sessions to answer the questions and achieve the objective.

However, if this isn’t done carefully, the different activities in the research sessions may not map clearly to the questions. As a result, user researchers can end up scrabbling around in their data trying to work out how to answer the questions and meet their objective.

To avoid this problem, I take a very structured approach to mapping research questions to methods.

In this blog post I’ll explain how I do that, using GOV.UK Verify as an example.

Assessing the user experience of certified companies

On GOV.UK Verify, we ran a multi-stage process to assess the quality of new certified companies. One of the stages involved lab-based usability testing.

Image showing GOV.UK Verify Certified Companies

To make sure the assessments were fair and effective, we assessed each prototype against a standard set of research questions. We then mapped out exactly how we would gather the data to answer each question.

In usability testing sessions, there are four key ways to gather data:

  • observation
  • interview
  • analytics and data logging
  • questionnaires and rating scales

For our work, we concentrated on the first two of these.

Example research questions

Here are three example research questions:

Research question: Are all users able to create valid, usable access credentials that conform to the system requirements?
Method of assessment: Observation

Observations:

  • observe the number of attempts needed to create these, and note down any specific issues experienced

Research question: Do users understand how to reuse their verified identity account?
Method of assessment: Interview and observation

Questions/Observations:

  • observe user behaviour during the re-use task: do they choose the correct option, do they remember which company they used, and are they able to sign in again?
  • interview (before re-use task): “What’s this account for? What can you use it for?”
  • interview (after re-use task): “What’s this account for? What can you use it for?”

Research question: Do all users know whether or not their identity has been verified?
Method of assessment: Observation and interview

Questions/Observations:

  • observe users for any spontaneous comments about being verified or not being verified
  • interview (at point of success/failure): “What has happened here?” “What does this mean?”
  • interview: “Talk me through the process you have been through so far”
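The mapping above is essentially a small data structure: each research question points to its methods of assessment and the specific prompts or observations that will answer it. As a purely illustrative sketch (not part of the original post, and with question and prompt text abbreviated), it could be expressed like this, which also makes it easy to check that no question has been left without a way of answering it:

```python
# Hypothetical sketch: the question -> method -> prompt mapping as a plain
# dictionary, so a research plan can be checked for unanswered questions.

RESEARCH_PLAN = {
    "Are all users able to create valid, usable access credentials?": {
        "methods": ["observation"],
        "prompts": ["Observe number of attempts; note any specific issues."],
    },
    "Do users understand how to reuse their verified identity account?": {
        "methods": ["interview", "observation"],
        "prompts": [
            "Observe re-use task: correct option chosen? Company remembered?",
            "Before and after the task: 'What's this account for?'",
        ],
    },
    "Do all users know whether or not their identity has been verified?": {
        "methods": ["observation", "interview"],
        "prompts": [
            "Note spontaneous comments about being verified or not.",
            "At point of success/failure: 'What has happened here?'",
        ],
    },
}


def unanswered_questions(plan):
    """Return research questions with no method or no prompt mapped to them."""
    return [q for q, m in plan.items() if not m["methods"] or not m["prompts"]]
```

Running `unanswered_questions(RESEARCH_PLAN)` before a round of sessions would flag any question that the session design cannot yet answer, which is the failure mode the post describes: data that doesn't map back to the questions.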

Make sure you answer your questions

For GOV.UK Verify this approach has really paid off. We’re in a contractual relationship with certified companies, and we need to be really clear about how we assess them.

Knowing exactly how we are answering each research question means we know the right places to record observations, ask questions and log usage data. And we get the right data to assess certified companies fairly and consistently.


Keep in touch. Sign up to email updates from this blog. Follow Lorna on Twitter.

Featured image from GOV.UK Verify.

3 comments

  1. Comment by Natalie S posted on

    Thank you for this insight - it's a nice concise method to organise your goals and make sure you make the most of a session and I shall definitely borrow this. I particularly like the fact it's clear and easy to follow if you are handing over / taking on research.

  2. Comment by Diana Richards posted on

    You might also consider doing qualitative analysis on the customer support e-mails the certified companies receive regarding the verification process.

