
How we meet accessibility needs
How does Opencast meet accessibility needs for our clients? In our new short video, two of our user-centred design (UCD) practitioners offer expert advice.
In our work for government in particular, there is a need for services and products to be accessible – whether through accessibility regulation or technology-specific requirements such as the Technology Code of Practice, which requires new tech to be accessible and inclusive for all users. We work closely with central government delivery teams to help them meet their responsibilities and solve problems to deliver accessible outcomes.
In the same context, we can also point to personas that highlight common barriers users with disabilities and impairments face when accessing digital services, along with tips for designing services everyone can use.
Our team of user researchers, informed by strong values around inclusion, helps to design products and services that include otherwise excluded groups.
But, in a world of hybrid working where much of the research we run is remote, we have found that our researchers are getting less in-person research experience. We understand that this lack of experience can affect their ability to deliver practical accessibility testing with disabled users – which is a critical element in delivering accessible products and services.
There is no real substitute for getting hands-on experience with users. Being able to moderate a user research session with a participant who uses a screen reader is valuable, and much less daunting after you have done it several times.
We knew that there was limited in-person practical research training available, and that’s why we decided to develop the training ourselves.
As researchers’ confidence in conducting inclusive research has grown, so has the number of sessions on our projects that include disabled people.
Research with disabled people is particularly important in the healthcare space. Adoption of new healthcare technologies can be tricky – with inaccessible technologies and a lack of trust in systems both potential barriers to necessary healthcare services.
In-person research is more expensive, more time-consuming and takes more organisation – so there is often a significant burden on the user researcher to prove that it is worthwhile. If a user researcher is not confident in the value of in-person research, they are not going to advocate strongly for it, and that will likely affect the amount of in-person work they do.
We knew that our user researchers had an experience gap in running in-person accessibility testing – testing digital products directly with users of assistive technology. We also knew that user researchers who hadn’t experienced the value of in-person accessibility testing were less likely to champion it as an essential part of the user-centred design (UCD) process, and that it could easily fall out of scope on research projects.
Within disability-inclusive research, the more time spent involving disabled people, the better the understanding of those people and the more professional, adaptive and nuanced the approach will be.
From conversations with our user researchers, we understood their gaps in knowledge on accessibility research – their questions were around recruitment, planning, moderation and analysis. When we looked around for available training we found plenty on design and coding for accessibility – you can find any number of courses about including alt text in the design of a website or the differences between screen readers.
But nothing existed on what we needed around the practicalities of working with disabled people. We needed something more practical and nuanced.
This training covered when and how to involve disabled people during design research, and how to approach sampling for recruitment – focusing on disabilities that interact with digital services. We decided that, for accessibility testing, we should concentrate on disability groups of particular interest.
We understood that researchers may not be familiar with how to tell when a problem with a screen reader is an issue with the way the participant is using the technology as opposed to a technical problem with the digital product. A good grounding in theory and best practice would allow our researchers to go into the next phase of research prepared with a toolbox of knowledge and techniques.
Armed with this knowledge, we put it into practice. Working with recruitment agencies and charity contacts, we recruited users with a variety of disabilities, including visual impairments and mobility disabilities, as well as people who are deaf or hard of hearing.
Our user research community spans the whole of the UK so we held our research sessions in our London and Newcastle hubs (with future training scheduled to happen in Manchester). We created a research plan focussing on access to healthcare.
Healthcare was a useful focus for our training. Health is a universal experience – especially for disabled people, who can have much more interaction with healthcare services than most.
Opencast has also made healthcare a key priority for future client work and has recently expanded its healthcare offer. Our digital expertise is helping healthcare organisations to give patients and the people who support them the services they need. We want to consider digital transformation in healthcare, who that affects and how.
In Opencast’s recent report on patient-centred healthcare, we point out how important it is to recruit a diverse group of people for research – especially those who may face exclusion: “Recognising that people face different barriers and exploring those blockers is essential to plan services that improve access for all.”
A group of 16 researchers and 16 participants took part in the research across two locations. We set up a testing space with cameras, contacted interpreters and negotiated with building staff. The researchers had a discussion guide focused on two symptom-checker websites, and one user researcher conducted the testing while the other took notes.
Following the research, researchers collected their notes together and used Word as a low-fidelity analysis tool to explore how to do co-analysis when those around you might not have access to either virtual or physical whiteboarding tools. This helped our researchers get a full picture of an accessible end-to-end research process.
Researchers have been able to advocate for the value of this research to stakeholders. We have also found that practical sessions are a good way for us to remember the nuts and bolts of conducting in-person research. We hit our learning targets from the research, but also discovered unexpected benefits like the value of simply working closely with other user researchers.
Our research sessions also helped to confirm much of what we know about disabled people’s barriers to healthcare.
In one specific example, we talked to a Polish British Sign Language (BSL) user who used the BSL 999 service whenever he had a health problem. As other options for getting timely and accessible routes into healthcare were unavailable to him, this was the only way he could access healthcare.
Our final thought on this: to create inclusive products that work for everyone, it is important to be able to talk to a wide range of people with a range of needs. That can only happen if people are equipped and experienced in this work.
Inclusive research is central to the work Opencast user researchers do. Being able to deeply engage with disabled people during our training, as well as our work, means we can put those values into practice.