
https://userresearch.blog.gov.uk/2021/11/11/working-with-users-and-economists-to-evolve-the-service-manual/

Working with users and economists to evolve the Service Manual

Categories: Research stories, Service manual, User research

A laptop showing a remote workshop’s collaboration board on improving the Service Manual, with dozens of virtual sticky notes on it.

This is the second of 2 blog posts. In the first post, we gave an overview of the user research approach we are using to inform and evolve Service Manual guidance.

We shared how this approach underpinned the first rapid evaluative research the team conducted. This was to shape new data publishing guidance in relation to the imminent retirement of the performance platform.

This post is about the second and third tracks of user research and what we are planning next. 

In our team, exploratory research tracks often probe a problem space that is not yet well understood. Here, that means acknowledging the wider context of how people interact with the Service Manual and Service Standard. Understanding how these artefacts make a difference to users on a day-to-day basis means we can ensure the guidance stays relevant and useful.

Exploratory research to understand the value of the Manual and Standard

We collaborated with an economist at the Central Digital and Data Office (CDDO) to identify if we could measure the value of using the Service Manual and Service Standard.

We were keen to engage an economist in a joint research exploration so they could witness the stories and scenarios unfolding. Hearing first-hand how our end users experienced the Service Manual brought the value proposition to life for them.

Meanwhile, the team had been drafting a hypothesis model to represent the ways people use the Service Manual and Standard – and the roles they want them to perform. Our hope was that this close collaboration would enable us to interpret and articulate any new first-hand research evidence better. How did our products provide value? What benefits had they created? What impact had they had on service teams?

Our Service Manual usage survey had given us a group of people who were happy to be involved with further research. From this group, we selected participants with a range of roles and experience from local and central government departments and external suppliers.

Defining a research approach was tricky. We assumed it would be difficult to get stories of service failure through a direct line of research questioning. It would be equally unlikely that people would be able to quantify the benefits of using the Service Manual and Service Standard. 

Therefore, we opted to ask participants to focus on a recent situation when they used the Service Manual to help them to achieve something. We wanted to hear about the kinds of information people wanted and expected to get from the manual. How did the manual (and adhering to the standard) support them with everyday practice in the context of their role and remit?

Focusing on a scenario helped us get to these things more quickly, as both functional and emotional needs surfaced. It also made it easier to gather positive and negative experiences, so we could learn more about how the standard and assurance process are perceived.

We learnt that users of the Service Manual and Service Standard value: 

  • having one approach to follow that’s been tried and tested – this provides familiarity, security and confidence
  • the convenience of having service guidance collected in one place
  • the authority of the guidance and being able to use it to add weight to their decisions
  • that the manual and standard increase awareness and adoption of user-centred and agile practices 

Our participants also talked about a variety of functional and emotional benefits from using the Service Manual and Standard.

These translated into 2 overarching areas of economic benefits.

  1. Time savings – streamlining business processes and providing a consistent digital development environment. This stops teams pursuing less efficient development methods, while delivering a similar or better service at a lower cost.
  2. Improvement of digital services – increasing capability in digital service teams, and the uptake of user-centred design and other best practices. Expanding the use of a user-centred design approach to service development reduces the risk of service failure, where a service does not meet users’ needs, and so reduces the potential for wasteful expenditure on low quality services.

How do we ‘measure’ the economic benefits of the Service Manual?

The economic benefits are being considered in light of the overall service assessment and assurance process. Several factors outside of Service Manual use could affect the accuracy of any measurement.

The crux of the approach is to compare teams who use the Service Manual and consult the Standard with those who do not. Over a set period, you could measure the time between a service assessment and the service going live, the productivity of new starters, and the outcomes of services that failed on user-needs-related points in spend controls or service assessments and then revisited their approach.
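As a purely illustrative sketch of that kind of comparison, the short Python snippet below compares hypothetical groups of teams on time to go live and on passing a user-needs-related assessment point. All column names and figures are invented for illustration and are not drawn from any real assessment data.

import pandas as pd

# Hypothetical records: one row per service, whether the team used the
# Service Manual, the weeks between its service assessment and going live,
# and whether it passed the user-needs-related assessment point.
# Every figure here is made up for illustration only.
services = pd.DataFrame({
    "used_service_manual": [True, True, True, False, False, False],
    "weeks_assessment_to_live": [10, 12, 9, 16, 14, 18],
    "passed_user_needs_point": [True, True, True, False, True, False],
})

# Compare the two groups: average delivery time and user-needs pass rate.
summary = services.groupby("used_service_manual").agg(
    mean_weeks_to_live=("weeks_assessment_to_live", "mean"),
    user_needs_pass_rate=("passed_user_needs_point", "mean"),
)
print(summary)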

Overall, this exploratory research helped us understand that the Service Manual makes best practices more common, which raises the quality of digital projects and leads to better outcomes for citizens who use digital government services.

Collaborative workshops to gather input from users

In a more targeted effort to engage with all our user groups, we used workshops to hold structured, collaborative discussions with users of the Service Manual. This also helped us start to develop how community contributions to guidance could work in practice.

We know there is no substitute for getting regular exposure to user feedback first-hand. We ran a Services Week session called ‘Tell us what to improve about the Service Manual’ to do exactly that.

We listened to what the attendees said they’d like to improve about the Service Manual – which has helped to validate and prioritise our backlog – and we also asked how they’d like to work with us to help make those improvements. This has set an initial direction for our contribution model.

Services Week also gave us an opportunity to hear from service teams about work they’ve done to build services at rapid speed in response to coronavirus (COVID-19) and Brexit. These teams had to overcome unprecedented challenges, so we were keen to capture how they worked and what they learnt to inform new guidance to help teams in similar circumstances in the future. 

More than 60 people from across the public sector attended a meetup to scope the first version of guidance on ‘making services in an emergency’. The meetup opened with invited presentations, followed by a practical workshop; the insights gathered were then synthesised into a rough draft of guidance.

The draft is being shared with relevant service teams to invite feedback before it is iterated further and published. This is the start of a proactive, two-way relationship of collaborating on guidance, one we hope to repeat on other topics.

Using research data to understand user needs and journeys

These areas of focus have contributed to a rich body of raw research data which we will be actively drawing from for our work on the Manual and Standard. We have learned a lot as a team and these insights are informing guidance and our roadmap.

In the coming months, we’ll be working with our users and service communities to define how community contributions might work, using what we have learnt so far as a starting point. 

We are also continuing to proactively identify opportunities for collaboration across the standards and assurance teams at CDDO to plan projects like the creation of personas and mapping of user journeys through assurance. 

If you are interested, join the #servicemanual channel on the cross-government Slack to hear more about our ongoing work.


1 comment

  1. Comment by Humza Mullick posted on

    One of the key concepts that I have discovered is the notion of 'continuous user research'. UR does not stop once we have reached data saturation. It continues as the project evolves and aims to address questions that wouldn't primarily have been thought about!