Since defining our core values last year, we’ve been making continual efforts to ensure that they don’t simply become words on a marketing site, but that they underpin everything we do.
Today, I want to do the same for the second of these principles, which is to “design inclusively”.
What does this principle mean?
This principle is one of several linked to our core value of integrity, which we define as reducing harm downstream. Designing inclusively means keeping different user needs at the forefront of our minds to minimise the risk of a poor user experience at best, and harm at worst.
This means that we need to:
- Be aware of how power and privilege prevent participation in online spaces.
- Work to subvert that by actively supporting accessibility and inclusivity.
Designing inclusively is not a “niche” concern. Accessibility is usually discussed in relation to serving the needs of users with disabilities, and too often addressed as an afterthought. But improving the usability of a product doesn’t just benefit people with disabilities (though that is of course an entirely worthy goal in itself).
What tends to spring to mind when we think of disability is a permanent disability, such as colour blindness. But as Dave Redfern points out in this post on Inclusive Web Design, we can be more nuanced in our understanding of the term. If we understand the definition of disability as something causing some degree of limitation, then we can recognise disability as being permanent, temporary or situational.
To borrow an example from the aforementioned post, having an amputated arm would be a permanent disability, while a broken arm would be a temporary disability, and being a new parent holding a baby would be situational. In all these cases, the person in question has the use of just one of their arms. So solving the problem for one use case benefits all these users.
And there are even more inclusivity considerations to take into account when designing software, such as the user’s level of computer literacy, their access to hardware and the internet, their language, education, culture, and so on.
This principle in practice
Part of designer Debs’ impact on the team over the last couple of months has been to encourage us to be more consistent about writing user stories, which focuses our attention on the users’ needs over simply imagining what exciting features we could build!
User stories look something like this:
As a user who is hearing-impaired, I want to be able to access a transcript of spoken audio so that I can understand the content in an audio clip.
This is one of many user stories we’ve written related to making Vito accessible for users who are hearing-impaired, have low vision, use a screen-reader, or use a keyboard only, for instance. For the remainder of this post, I’ll be focussing on this user story as an example of our “design inclusively” principle in practice.
Something we implemented early in Vito, for use on our own demo events, was the ability to link out to a URL where live captions could be displayed. This link appears underneath the video and opens up in a new tab. Here’s what it looks like:
Since this is simply a link field, you can link out to whatever website the captions will be displayed on. For instance, our captioning provider of choice uses a technology called StreamText, which generates a URL where captions appear in real time, and where the colours, font and text size are customisable according to the user’s needs.
This feature works well, but we wouldn’t consider it optimal. We built it knowing that shortly afterwards we’d want to deliver a solution integrated within the Vito hub interface, which brings us to what we’re currently working on.
Vito uses a media player element called Plyr to display livestreams and pre-recorded videos in the hub. We selected this player because it’s highly customisable and has a strong focus on accessibility.
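As a rough sketch of how this kind of setup works (the element IDs, file names and options here are illustrative, not Vito’s actual markup), a Plyr player with a captions track can be wired up like this:

```html
<!-- Illustrative markup only, not Vito's actual implementation -->
<video id="player" playsinline controls>
  <source src="talk.mp4" type="video/mp4" />
  <!-- kind, srclang and label let Plyr list this track in its captions menu -->
  <track kind="captions" label="English" srclang="en" src="talk.vtt" default />
</video>

<script>
  // captions.active turns captions on by default; viewers can toggle them off
  const player = new Plyr('#player', {
    captions: { active: true, language: 'en' },
  });
</script>
```

Because the captions are rendered by the player itself, they inherit the player’s styling and sit directly over the video, rather than living on a separate page.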
WebVTT captions are supported, meaning that subtitles can be displayed as an overlay on the video, without the need to open a link in a new tab. Here’s a mockup to give a sense of what this could look like:
We’re currently working on adding the ability for an organiser to upload a video and an accompanying .vtt file in the Vito back-end. One limitation of this feature is that it will only work for pre-recorded videos for now, because you obviously need to upload the transcript file in full ahead of time.
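For context, WebVTT is a plain-text format: a `WEBVTT` header followed by timed cues, each with a start and end timestamp and the caption text. A minimal (illustrative) example:

```
WEBVTT

00:00:01.000 --> 00:00:04.000
Welcome, everyone, and thanks for joining us.

00:00:04.500 --> 00:00:08.000
Today we're going to talk about inclusive design.
```

Because every cue is timestamped against the video, the whole file has to exist before playback starts, which is why this approach suits pre-recorded content rather than livestreams.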
In a future product sprint we’ll be looking at how we can iterate on both of these implementations to bring live captions inside the Vito hub interface. In the meantime, it will at least be possible to display captions for both pre-recorded videos and live-streamed content (even though the latter is displayed in a separate tab).
In a future post we’ll share details about how to generate captions for pre-recorded videos, using a handy trick that saves a lot of time over transcribing everything manually, but gives you more control than simply using AI-generated captions. So stay tuned for that!
How captions impact inclusivity
A 2006 UK television access services study by Ofcom found that 7.5 million people (18% of the population) turned on subtitles when watching TV. But of that 7.5 million, only 1.4 million identified as deaf or hard of hearing.
Captions can obviously be critical for deaf people and hard of hearing people, but they can also aid general comprehension in many scenarios related to both content and context, for example when:
- The audio quality is poor or there’s lots of background noise
- The speaker is speaking very quickly or quietly, or otherwise has hard-to-understand diction
- The content contains lots of unfamiliar terms
- The viewer’s primary language is different to the language in the video
- The viewer is watching in a sound-sensitive environment and is unable or unwilling to turn the volume up
- The viewer’s speakers or headphones are not functioning properly
- The viewer finds it easier to understand or focus when they can see the words written down
On that last point, there is plenty of scientific and anecdotal evidence that closed captions can be helpful for people on the autism spectrum, people with auditory processing disorder, people with various forms of ADHD (although some report that subtitles can actually be more distracting), and people with dyslexia, among others.
I include this information to highlight how a focus on inclusive design can benefit users far beyond any single group.
As always, this is an ongoing process. We’re a small and fairly homogenous team, and we appreciate there’s a need to do more user research with diverse groups. We acknowledge our privilege and the biases we no doubt carry that we might not even be aware of, and we’re committed to working to mitigate these.
Our aim is to continually examine and challenge our own assumptions to build software that makes all our users feel actively supported, and our principle of “design inclusively” keeps this at the forefront of our minds.
In a future post, I’ll be speaking about the third of our Design Principles: Protect users’ privacy.