Government services should be usable by as many people as possible, including those who are disabled. It’s our sixth design principle.
Impairments come in many forms - cognitive, motor, visual and physical. The Web Content Accessibility Guidelines (WCAG) give good advice on how to make sure your service is accessible to disabled people.
On the Passport exemplar, our developers tested with screen readers and keyboard-only access, we had accessibility audits done, and we did research with a range of user groups with access needs.
As part of our research, we ran two lab sessions focused on people with visual impairments. The groups turned out to include a range of impairments, from some sight loss to full blindness, and the participants used a range of assistive technologies - from screen magnifiers through to screen readers.
I wanted to write up some of what we learned from these two sessions.
Setting up properly takes time
We asked users to bring their own devices, because users of assistive technologies often have custom setups, and we wanted them to use equipment they were familiar with. If you make them use unfamiliar devices, you're really testing how well they cope with a new device, not how well they cope with your service.
Normally our lab sessions last an hour, with 15 minutes in between. That wasn't enough time for these users; not because they're necessarily slower, but because their devices brought their own challenges when connecting to our lab equipment and wifi. Booting a machine can take time, and one of the devices spent 20 minutes installing a Windows update. Several of the users had brought a tablet as a backup in case their main machine didn't work.
Another reason to use their own devices: my colleague John Waterworth notes that users of assistive technologies are often reluctant to upgrade their machines, for fear of something breaking. This way we get to see the real-world devices people actually use, not just perfectly configured ones.
First lesson: allow more time for the sessions. An hour and a half would have worked better.
Amazing variety of methods and devices used
There was a great variety of device types, from those using a Mac or a PC to those who used iPhones or iPads exclusively.
Many of our research participants used a screen reader: a software application that reads out what's on screen. It can read out not just the content but also the structure of the page, and it's very flexible and configurable. Users can have it read the page in order, read out just the links, or do things like navigate by headings.
In the lab sessions we saw varied proficiencies and ways of using screen readers: some users had them read out the page so quickly that we couldn’t understand half of what was said; others relied on keyboard shortcuts; while some moved more cautiously through pages.
It seems obvious, but it's important to bear in mind: there's no single solution 'for screen readers'; everything on the page has to work in concert to make it accessible. For example, good heading structure helps both the users who navigate by headings and those who read the page from start to finish.
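To make that concrete, here's a minimal sketch of the kind of heading hierarchy that gives screen reader users a navigable outline. It's hypothetical, not the actual GOV.UK markup, and the page content is invented for illustration:

```html
<!-- Hypothetical page outline, not actual GOV.UK markup.
     A screen reader can list these headings, so users can jump
     straight to a section or read the page top to bottom. -->
<main>
  <h1>Renew your passport</h1>
  <h2>What you'll need</h2>
  <p>Your old passport and a digital photo.</p>
  <h2>How long it takes</h2>
  <p>Allow several weeks for your new passport to arrive.</p>
</main>
```

Because the levels nest logically (one h1, then h2s), the same structure serves users navigating by headings and users reading linearly.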
Consistency is crucial
The range of users reinforced some of the broader design decisions we've made on GOV.UK: content that's clear, concise and easy to read, and markup that's semantic and logical. Those are things we strive for across GOV.UK, and their importance was borne out in these sessions.
Consistency in page design proved really important. All of our pages basically have a heading, some text or a question, and a 'continue' button. As users moved from page to page they quickly became familiar with how the pages were structured, which made them more confident about moving through the service. If the page layout had changed frequently, that would have hugely affected their confidence.
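That repeated page shape might be sketched like this. Again, this is a hypothetical illustration rather than the service's real markup; the question, hint text and URL are invented:

```html
<!-- Hypothetical transaction page following the repeated pattern:
     a heading, a question with hint text, then a 'Continue' button. -->
<main>
  <form method="post" action="/next-step">
    <h1>
      <label for="date-of-birth">What is your date of birth?</label>
    </h1>
    <p id="date-of-birth-hint">For example, 31 3 1980</p>
    <input id="date-of-birth" name="date-of-birth"
           aria-describedby="date-of-birth-hint" type="text">
    <button type="submit">Continue</button>
  </form>
</main>
```

Because every page follows the same shape, a screen reader user landing on a new page already knows where the question and the continue button will be.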
The pattern of our confirmation pages was important too. The rhythm of label, answer, change button, repeated down the page, meant users could listen to the information while tabbing through and quickly correct any mistakes they found.
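One sketch of such a confirmation page, as hypothetical markup rather than the service's own (the 'visually-hidden' class is an assumed utility for text that only screen readers announce):

```html
<!-- Hypothetical check-your-answers page: each row repeats the same
     label / answer / change-link rhythm, so users tabbing through
     with a screen reader hear a predictable pattern per row. -->
<main>
  <h1>Check your answers</h1>
  <dl>
    <div>
      <dt>Name</dt>
      <dd>Jane Smith</dd>
      <dd><a href="/name">Change <span class="visually-hidden">name</span></a></dd>
    </div>
    <div>
      <dt>Date of birth</dt>
      <dd>31 3 1980</dd>
      <dd><a href="/date-of-birth">Change <span class="visually-hidden">date of birth</span></a></dd>
    </div>
  </dl>
  <button type="submit">Continue</button>
</main>
```

The visually hidden text gives each 'Change' link a unique accessible name, so a screen reader user skimming a list of links knows which answer each one edits.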
For us, the sessions were hugely valuable, both for making improvements and for validating some of the decisions we made at the outset of building the service.
A related benefit: sessions like these are great for increasing empathy for your users.