So you’ve spent a bunch of time and money to create a supremely awesome app. It arrived to great fanfare, and now you have thousands or millions of happy users. Everything in the world is good.
But what you may or may not know is that among those thousands or millions of users, there might be more than a few pissed off ones. “But everyone loves my app!” you proclaim, blissfully unaware that there are users out there who really want to use it, but can’t.
The people I’m talking about are users of assistive technology, and specifically where it pertains to this post, the blind and visually impaired variety. There are millions of them in the US alone, they love technology, and they want your app to work for them as well as it does for everyone else.
There are many good resources out there that cover the technical details of improving the accessibility of apps (a few can be found here and here); this post is not that. Rather, it focuses on the problem from the user side, explaining how the technology works and how you can use it to find out whether your app is a model accessibility citizen or an offender that needs improvement. It should be a helpful read for technical and non-technical product professionals alike.
Apple describes VoiceOver as:
VoiceOver is a gesture-based screen reader that lets you enjoy using iPhone even if you don’t see the screen. With VoiceOver enabled, just triple-click the Home button to access it wherever you are in iOS. Hear a description of everything happening on your screen, from battery level to who’s calling to which app your finger is on. You can also adjust the speaking rate and pitch to suit you.
If you’re not familiar with VoiceOver, you should know that it is a very different way of interacting with a touch device. Think about how much of a visual process interacting with a touch screen is, and it shouldn’t be surprising that for a blind person to perform those same interactions requires a very different type of interface. Instead of a belabored explanation of how VoiceOver works, you should just try it right now. After all, if you have an iPhone or iPad, you already have VoiceOver on your device, it’s simply a matter of turning it on.
Enabling VoiceOver is a snap, just do the following:
- First, open the Settings app from your iOS home screen.
- On the first screen tap “General”.
- Next find and tap “Accessibility”.
- Now tap “VoiceOver”.
- You’re now on the VoiceOver settings screen; turn it on by flipping the “VoiceOver” switch to the on position.
Great, now VoiceOver should be running. You’ll know for sure if you hear a synthesized voice speaking. Now that you’re in VoiceOver mode, there are a few basic gestures you should know about:
- Tapping once anywhere on the screen will move the VoiceOver cursor to that UI element, but notably, will not activate it.
- Once focused on an element, double tapping anywhere on the screen will activate that element. For example, to tap a button, you must first tap once on it to focus the cursor, then double tap anywhere on screen to “tap” the button. This holds true for the vast majority of tappable elements, including turning switches on and off, focusing text fields, or selecting rows in a table.
- Since blind users can’t see the tappable visual elements, VoiceOver includes an alternate navigation method. Swipe left or right anywhere on the screen to advance through all of the accessible elements.
If you want more help getting familiar with the above and other VoiceOver gestures, you can tap on “VoiceOver Practice” on that same VoiceOver settings screen to go deeper. Once you get the hang of it, start exploring other parts of iOS with VoiceOver enabled. All of Apple’s own apps have very good support for accessibility, so start with those apps (Mail, News, Phone, etc). Keep in mind that many VoiceOver users navigate via left/right swipe and can’t see the screen, so try navigating this way without looking at the screen.
Pro Tip: iOS has a very handy mechanism for quickly toggling accessibility features. To activate it, on the Accessibility settings screen (Settings > General > Accessibility), scroll all the way down and tap on “Accessibility Shortcut”. On this screen you can select which functions you’d like to enable for the shortcut; for now, just choose VoiceOver. Now whenever you click the home button three times (devices with a home button), or click the sleep/wake button three times (devices without a home button), VoiceOver will be toggled on or off. This shortcut comes in very handy if you’re a sighted user who doesn’t normally use VoiceOver but wants to quickly turn it on to test an app.
Creating a test plan
Speaking of testing apps, it’s time to get into the most important part of this post: how do you figure out how well your app works for a VoiceOver user? You might think the right approach would be to turn on VoiceOver and just brute force the hell out of every screen, and while that may yield fruit, it’s far from the most effective way to approach this problem. If you come at this scattershot, you’ll end up dealing with issues that few users care about and missing much more important ones.
So in that vein, step number one is to determine what the most important flows in your app are, and more specifically, which ones you should be testing with VoiceOver. If you built your app using a product-focused development approach, you may already have these flows documented. If that’s the case, then you’re already ahead of the curve: you have a list to start with. If not, you have a bit more upfront work to do. In any event, keep the following tips in mind as you create your test plan:
- Focus on the things a user would do every time they open the app. In an email app for example, it would be reasonable to expect a user to browse various lists of messages, view messages in their entirety, and compose new messages.
- Make sure to address flows that might happen less frequently, but deal with sensitive data or precise input. Example: subscribing to the premium version of an app is something a user would only do once, but it involves handing over money, so it has to work every single time.
- Pay close attention to large portions of your app that use custom UI. iOS provides a very rich set of “out of the box” UI components (see the Apple Human Interface Guidelines for an exhaustive look) that provide default support for VoiceOver and will work well with little to no extra effort. UI that strays from these components however, is less likely to work well, and should be prioritized in testing. Most of Apple’s own apps stick to stock components, so if you see something in an app that you’ve never seen in one of Apple’s own, it’s a good bet that it’s custom UI.
How to test
Now that you have a test plan in hand, the only thing left is to enable VoiceOver and run through it. Turn VoiceOver on if it’s not already active, and dive into your app. As you test each feature, it’s a good idea to keep a running list of issues; a note-taking app is fine for this, or you can open tickets in your favorite issue tracker.
Below is a list of additional things to keep in mind as you test. Focus on these things, and you’ll get a good chunk of the way to a very accessible app.
Navigate linearly

By default, the VoiceOver cursor starts at the upper left whenever you navigate to a new screen. Most VoiceOver users navigate linearly, swiping left and right to move among the various on-screen elements. Because you’ve been conditioned for years to point and tap, you’ll likely tend toward this behavior, tapping on elements as if you weren’t using VoiceOver, then double tapping to activate them.
Resist this urge, and stick to linear navigation. It’s entirely possible to have a screen full of accessibility-labeled elements where, because of one implementation detail or another, certain elements are stranded: discoverable by VoiceOver only if the user drags a finger around the screen or taps randomly. Many users will never find these elements, so while they might be accurately tagged, they are in fact useless in this context.
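For the developers reading along: one way to fix stranded elements is to set a container’s `accessibilityElements` array explicitly, which defines the exact order VoiceOver visits on left/right swipes. Here’s a minimal sketch; the cell and its subview names are hypothetical:

```swift
import UIKit

// Hypothetical table cell from a mail-style app.
class MessageCell: UITableViewCell {
    let senderLabel = UILabel()
    let subjectLabel = UILabel()
    let replyButton = UIButton(type: .system)

    func configureAccessibility() {
        // Explicitly define the linear swipe order so no
        // subview is stranded outside VoiceOver navigation.
        accessibilityElements = [senderLabel, subjectLabel, replyButton]
    }
}
```

With the array set, a left/right swipe walks exactly these three elements in this order, regardless of their positions on screen.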
Element labeling

The most glaring accessibility problem plaguing most apps is a lack of accurate element labeling. Any accessible element can have a label, visible only to VoiceOver, that determines what the system announces when that element is focused. Many default iOS elements have sensible defaults that make this a non-issue (labels, text fields, buttons with text, etc.), but for many other elements, the default behavior is insufficient.
The use of iconography, for example, is a very frequent problem area. Buttons with icon-only content will, by default, be announced as the name of the image asset, which is not helpful in most cases. Image views, typically used to render images that aren’t controls, are likewise problematic if not labeled. As of iOS 11, the system will try to use image recognition in these cases, but it’s best not to leave things to chance; better to provide your own accurate labels.
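If you find such issues, the fix on the development side is usually a one-liner per element via `accessibilityLabel`. A minimal sketch (the button, image view, and asset names are hypothetical):

```swift
import UIKit

// An icon-only button: without a label, VoiceOver may announce
// the asset name, e.g. "pencil-icon".
let composeButton = UIButton(type: .system)
composeButton.setImage(UIImage(named: "pencil-icon"), for: .normal)
composeButton.accessibilityLabel = "Compose"

// An image view that carries real content: opt it into
// accessibility and describe what it shows.
let avatarImageView = UIImageView(image: UIImage(named: "avatar-jane"))
avatarImageView.isAccessibilityElement = true
avatarImageView.accessibilityLabel = "Profile photo of Jane"
```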
The general rule here: as you navigate from element to element, close your eyes and listen to the VoiceOver announcement. If the spoken text doesn’t describe what you see well enough for a blind person to understand it, note that element for fixing.
Missing or unnecessary elements
As with element labeling, the system has default behavior for which elements are treated as “accessible” (i.e. VoiceOver will land on them when navigating) and which ones are ignored entirely. This results in two possible issues.
Firstly, it is possible for elements to not be marked “accessible” and thus be ignored by VoiceOver. Often this ends up being the case for custom UI controls, views, or navigation elements. As you test, these issues will be painfully obvious, as they will likely prevent you from accomplishing something you can plainly see is possible, but not while VoiceOver is active.
Alternatively, it’s possible for elements to be visible to VoiceOver when they really shouldn’t be. Typically these are UI decorations or adornments implemented as images. If a particular element doesn’t add any value to the use case of a given screen, it should not be marked as accessible to VoiceOver.
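Both directions come down to the same switch, `isAccessibilityElement`. Here’s a hedged sketch of each case; the custom rating view and the decorative image are hypothetical examples:

```swift
import UIKit

// Case 1: a custom-drawn control is invisible to VoiceOver by
// default, so opt in explicitly and describe it.
class StarRatingView: UIView {
    var rating = 3

    override func didMoveToSuperview() {
        super.didMoveToSuperview()
        isAccessibilityElement = true
        accessibilityLabel = "Rating"
        accessibilityValue = "\(rating) of 5 stars"
        accessibilityTraits = .adjustable
    }
}

// Case 2: a purely decorative image adds nothing for a blind
// user, so hide it from VoiceOver entirely.
let dividerImageView = UIImageView(image: UIImage(named: "ornamental-divider"))
dividerImageView.isAccessibilityElement = false
```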
Transient or dynamic elements
Another class of UI elements commonly found in apps is transient or dynamic elements. These typically appear on screen as the result of some action, remain for a period of time, and disappear when appropriate. Animated progress indicators and in-app notifications that appear at the top or bottom of the screen are great examples.
In some cases these elements are navigable by VoiceOver while on screen, but without some effort on the app developer’s part, VoiceOver won’t focus on them, so a user likely won’t notice they are present. As you test, take special note of any elements that appear and disappear dynamically; if VoiceOver doesn’t automatically address them in some way, they represent issues that should be noted and fixed.
Once a commonly overlooked aspect of user experience, accessibility is quickly becoming critical if you want to create an app that reaches as many people as possible. The good news is that the technology and tooling available to produce fantastic accessibility experiences get better with every new release of iOS.
Hopefully this post has made it easier for you to get a good picture of where your app stands when it comes to accessibility. If you have any questions or feedback, please drop it in the comments section or get me on Twitter: @nickbona.
Originally published at Velocity Raptor.