What you won’t learn from a prototype

Categories: Tips and techniques

Real users out on the street

I’m working as a user researcher on GOV.UK Verify, the new way for people to prove they are who they say they are when using government services online.

In this post, I’ll share why we’ve found it's important to augment lab research with talking to people using the live service.

A research lab environment has definite advantages

We do lab-based research at least once every 2 weeks.

The lab provides a controlled setting for conducting sessions. We’re able to easily observe how people interact with our prototype and take note of usability problems around particular areas or features.

Importantly, because our lab has a substantial observation suite, our project team is able to watch sessions in real time. This means that everyone in the project team gets to know our users first hand, not just the researchers.

But lab research does have limitations

Whilst the lab has many benefits, it’s not a true-to-life representation of the context in which users will complete a task.

Our service requires people to answer questions about their personal details, such as their bank account number. Because we’re researching a prototype and recording the session, we don’t use actual details. This makes it hard to get authentic responses about privacy issues.

If people feel uncomfortable about continuing the process, the research facilitator will aim to address these concerns. However, because we want to get feedback on all the latest updates to the prototype, we’ll encourage the participant to keep moving.

I’m very alert to the kind of questions asked. I was uncomfortable, and would've stopped immediately.

We try to overcome these things through good facilitation but, even better, we step away from our prototype and get out of the lab.

Combining quantitative and qualitative approaches

Lab research offered us insight into participants' conceptual and emotional responses to the prototype. We learned about issues which may cause users to drop out.

Analytics helped us understand where users of the live service were dropping out.

We wanted to build on this knowledge by speaking with live users - people who were experiencing the service in their natural context.

To access live users, we chose phone interviews, which were simple to set up, quick to run and allowed us to gather valuable qualitative insights.

Live research deepened our understanding of certain issues

Live-user research broadly supported what we’d discovered in the lab, but it also deepened our understanding of certain issues.

Time and context were important

In the lab, we set participants a task to complete, allowing them enough time to get it done. The lab is a quiet and calm environment, which frees the participant from any distractions.

With live users, we found that people were accessing the service in environments more demanding of their time and attention. People who failed to register complained of a lack of time and were accessing the service between other tasks.

It was the middle of work. I spent no more than 5 minutes on it.

In comparison, those who were successful deliberately set aside ample time.

I didn't know how long it would take. I set aside an hour, and it only took 10 mins.

Trust in certified companies

As part of proving their identity, our service requires people to choose a certified company.

In the lab, we learnt that people are more likely to pick a company they trust. If people don’t know or trust any of the offered companies, the facilitator will acknowledge these concerns and encourage the participant to continue.

As in the lab, live-service users who trusted a company chose it and were more successful at registering. However, those who trusted none of the companies were likely to abandon the process.

I don't know this company, and that's the only reason I left.

Stepping away from our prototype paid off

All research approaches are useful but, by using several different research methods, we were able to get a more in-depth and multi-dimensional view of our users' behaviour.

Work with your analytics experts, speak to live users by getting on the phones or into the field, and keep on working in the lab.

One last tip: catch live users straight off the service

Our participants had to go through data protection processes before we could speak to them, and we knew that the longer we left the interviews, the less likely people were to remember what they did.

We made every effort to speak to people as soon after using the service as possible.

This research helped to create the user interface which introduces the service, and the public beta went live on 14 October 2014.

Keep in touch. Sign up to email updates from this blog. Follow Paul on Twitter.

Sharing and comments

  1. Comment by Simon H posted on

    Hi Paul, this was an interesting read and I agree with you. The way we did it on Carer's Allowance was by observing our assisted digital pilot sessions. This was like lab research on steroids in a way! We also got an insight into people's emotional state. On Carer's Allowance people are applying for a benefit; they genuinely need the money they are applying for.

    Dealing with DWP made people anxious; they wanted to prove they were caring and telling the truth, and that really brought to life the issues people had with using the service. Difficulties that people can brush aside in the lab as not really mattering had people fretting in a real situation; they wanted and needed to get it right.

    I'd like to hear more about the logistics of this, if possible. How did you find the users, etc.?

    • Replies to Simon H

      Comment by Paul Chanthapanya posted on

      Hi Simon. Thanks for your comments.

      As part of private beta, we researched GOV.UK Verify with HMRC’s PAYE company car users. This involved setting up a research panel with HMRC, and we invited drivers to trial the new service.

      When users enrolled onto HMRC's service, they were also sent a second email inviting them to join the research panel. The invitation email contained a URL to the panel opt-in form, which passed their token through to the form. Once we saw their tokens being used on the service, we used analytics to identify specific groups to target, i.e. those who had abandoned the service, and contacted participants accordingly.
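      For readers curious about the mechanics, the token flow described above could be sketched roughly like this (the URL, helper names and token format are illustrative assumptions, not the actual implementation):

      ```python
      import secrets
      from urllib.parse import urlencode

      # Hypothetical opt-in form address, not the real one.
      OPT_IN_FORM = "https://research.example.gov.uk/opt-in"

      def invitation_link(token: str) -> str:
          """Build an invitation URL that carries the user's token to the opt-in form."""
          return f"{OPT_IN_FORM}?{urlencode({'token': token})}"

      def abandoned_users(tokens_seen_on_service: set, tokens_completed: set) -> set:
          """Tokens seen on the service but never completing it: the group to contact."""
          return tokens_seen_on_service - tokens_completed

      # A fresh token for each invited user.
      token = secrets.token_urlsafe(8)
      link = invitation_link(token)

      # Analytics-style matching: who started but abandoned?
      dropped = abandoned_users({"a1", "b2", "c3"}, {"b2"})
      # dropped == {"a1", "c3"}
      ```

      The key design point is that the token travels with the user from the invitation email into the opt-in form, so later analytics events can be matched back to consenting panel members.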

      Those wanting to take part could click through to the opt-in page, enter their name and contact details, and choose from several options to take part in the programme, i.e. online survey, telephone or face-to-face interview. All participants were offered a monetary incentive for helping out.