For iOS Developers: Taking Your Accessibility from Good to Great

Member of the AppleVis Blog Team

What does it mean to make a truly accessible app? How can you go beyond meeting minimum accessibility standards and make something that VoiceOver users will find intuitive and enjoy using? In this post, I want to try to answer those questions from a user's perspective. I will not be focusing on the code you should use; instead, I am trying to describe what makes an app easy to use for me, as a VoiceOver user, and I encourage members of the community to add their own views in the comments.

Please note that I am assuming you already understand the basics: why accessibility is important, what VoiceOver is, and some of the most common gestures for navigating around the screen. If you do not already know this, please refer to Apple's iOS accessibility page.

My focus here is on iOS, but some of the general principles will apply to other platforms. This post is almost exclusively about accessibility for blind users, but if you are a user with another disability, or with some remaining vision, I encourage you to leave a comment explaining what developers can do to make their apps easier to use for you.

How to Test Your App with VoiceOver

You probably already know that, to test accessibility, you’ll need to turn on VoiceOver and navigate around your app. But to make your experience even closer to that of a blind user, after turning on VoiceOver, turn on Screen Curtain with a three-finger triple-tap. Now you won't be able to see the screen, so you'll need to rely on what VoiceOver is telling you. You can then notice how the user experience compares with your normal, visual use of the app. Can you find everything easily? Is it obvious what all the controls do?

Stick to Standards

Some developers want to invent an entirely new set of gestures or controls for their app, or even make it self-voicing. If you do this, your app won’t be accessible to all users because if your text is not spoken by VoiceOver, it won’t be sent to a Braille display, so deaf-blind users, and those who prefer Braille for any other reason, won’t have access to your app. It will also usually make the app unnecessarily difficult to use, as your users will need to remember the gestures for your app as well as VoiceOver's gestures. If you use the same standards as other app developers, with controls that will be familiar to users from other apps, such as standard back buttons in the usual place, your app will have far less of a learning curve.

In most cases, you’ll be able to make an interface that’s both accessible and visually appealing, easy to use for blind and sighted users alike, so there’s no need to create a special accessibility mode. But if, for any reason, you need to design your app in such a way that the experience for VoiceOver users will be significantly different than for sighted users, you’ll make it easier to learn if this is explained in your documentation. Audio editors Hokusai and Ferrite both have editing controls that appear only when VoiceOver is turned on, but the manual for each has a chapter on using the app with VoiceOver.

Use Concise but Informative Labels

If you've read anything about VoiceOver accessibility, you'll already know how important it is to label your buttons and other controls. But what makes a good label? The best labels state exactly what the control does, giving as much information as possible in as few words as possible. Consider using “New document” rather than “Create a new document”, or “Options” instead of “Open options menu”. Shorter labels increase productivity for Braille users, who have very limited space on their displays. Verbose or unclear labels may also make your app difficult to use for people with cognitive disabilities.

It’s more important for your labels to say what the control does than to give a physical description of it; it doesn't help me to know that my VoiceOver cursor is on a black circle, if I don't know what will happen if I tap on it.
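For developers who want a concrete starting point, here is a minimal UIKit sketch of the idea above; the button and its icon are hypothetical:

```swift
import UIKit

// A hypothetical toolbar button whose icon alone tells VoiceOver nothing useful.
let newDocumentButton = UIButton(type: .system)
newDocumentButton.setImage(UIImage(systemName: "doc.badge.plus"), for: .normal)

// A short, action-focused label: what the control does, not what it looks like.
// Without this, VoiceOver might fall back on the image name or say nothing helpful.
newDocumentButton.accessibilityLabel = "New document"
```

Note that the label describes the action ("New document"), not the appearance ("black circle with a plus sign").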

Weather Gods, which won a well-deserved AppleVis Golden Apple Award for best iOS app in 2017, provides a very good example of how labels can provide a lot of information. As the user touches or flicks through the controls of this app, VoiceOver reports detailed information for the day or type of weather associated with that control, and the information is presented in a sensible order, with the most important information first. Order matters because not everyone will always want to hear all the information you give them. If you have VoiceOver speak the essentials first, then the user can choose whether to listen to only that, and move on to the next control, or listen to the whole label until it is finished. What if you can’t decide on the best order for your information? The RSS feed reader Lire provides one possible solution: it has an accessibility setting allowing the user to change what information is spoken for each article in a list, and in what order it is spoken.
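The ordering principle can be applied directly when you assemble a label from several pieces of information. This is a sketch with invented example data, not Weather Gods' actual implementation:

```swift
import UIKit

// A hypothetical list row summarising an item. Put the essential
// information first, so users can flick to the next element as soon
// as they've heard enough.
let cell = UITableViewCell()
let condition = "Thunderstorms"   // most important: lead with it
let high = "High 24 degrees"
let low = "Low 16 degrees"
let chanceOfRain = "80 percent chance of rain"

cell.accessibilityLabel = [condition, high, low, chanceOfRain]
    .joined(separator: ", ")
```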

As an example of poor button labelling, consider the Kindle app: the control that, before the last major update, was quite sensibly called "Return to book" is now labelled "book actions menu, exit button". This is confusing because it would be quite easy to hear "book actions menu" and think it will open a new menu, or hear "exit button" and think it will close the book. Compared to the old label, it has more words but less information. The label "Return to book" tells me exactly where I’ll be when I tap on it, but "book actions menu, exit button" doesn't tell me whether I will land in the book or in my library.

Take Advantage of the Rotor

iOS allows developers to add rotor actions to controls, so that VoiceOver users can flick up and down to select the action to be performed. When used well, this can make app navigation very efficient. When a message has focus in Mail, for example, flicking up and down moves through a menu of options including deleting, flagging, or archiving the current email.

If you make good use of the rotor, your users will spend less time flicking, tapping and finding controls. The Dropbox app used to have rotor actions for options such as moving or deleting files. In the latest update, the rotor actions have been removed and these tasks are now performed via a button next to each file labelled "actions file actions". The problem with this, aside from the poor button labelling (we don't need to hear the word 'actions' twice), is that it takes twice as many flicks to move through a long list of files.
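In UIKit, these rotor actions are exposed through custom accessibility actions. Here is a hedged sketch of how a file row might offer them; the class and action handlers are illustrative, not taken from any real app:

```swift
import UIKit

// A hypothetical table cell representing a file. With custom actions set,
// VoiceOver users flick up or down on the row to choose an action, then
// double-tap to perform it — no separate "actions" button required.
class FileCell: UITableViewCell {
    func configureAccessibilityActions() {
        isAccessibilityElement = true
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Move") { _ in
                // Move the file here…
                return true   // true tells VoiceOver the action succeeded
            },
            UIAccessibilityCustomAction(name: "Delete") { _ in
                // Delete the file here…
                return true
            }
        ]
    }
}
```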

Be Aware of Direct Touch

In some cases, you might want part of the screen to respond in the same way it would if VoiceOver was turned off, so that VoiceOver users can use the same gestures as their sighted counterparts, such as tapping something once and having it instantly activated. This would be useful, for instance, in a music creation app. When a musical keyboard is on screen, your users will want to tap on one of the keys and have the note played instantly. For this, you can enable direct touch on that part of the screen. Be careful with this feature, though, because it disables all of the normal VoiceOver gestures. Using direct touch on most or all of the screen without good reason will be confusing at best, and may make the app appear totally inaccessible at worst, as it may seem to be unresponsive.
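Direct touch is opted into per view via an accessibility trait. A minimal sketch, assuming a hypothetical piano-keyboard view; note the trait is applied only to this one view, so the rest of the screen keeps standard VoiceOver gestures:

```swift
import UIKit

// A hypothetical musical keyboard that should respond to taps instantly,
// even with VoiceOver running.
class PianoKeyboardView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Keyboard"
        // Touches inside this view bypass VoiceOver and reach the view
        // directly — so a tap on a key plays the note immediately.
        accessibilityTraits = .allowsDirectInteraction
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```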

The Magic Tap

VoiceOver users have access to a two finger double-tap gesture, known as the magic tap. What this gesture does depends on the current context, but it usually starts or stops something, such as pausing or resuming audio playback, or answering or ending a call. As a developer, you can specify what the magic tap will do within your app. The best way to use this gesture will depend on the purpose of your app, but it might start or stop recording, take a picture, report important status information, post an update to a social network, or whatever your users are most likely to want to access quickly.
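Handling the magic tap is a one-method override. This sketch assumes a hypothetical recording screen; the announcement strings and recorder logic are placeholders:

```swift
import UIKit

// A hypothetical recorder screen where the magic tap (two-finger
// double-tap) toggles recording.
class RecorderViewController: UIViewController {
    var isRecording = false

    override func accessibilityPerformMagicTap() -> Bool {
        isRecording.toggle()
        // Start or stop the actual recorder here…
        UIAccessibility.post(
            notification: .announcement,
            argument: isRecording ? "Recording started" : "Recording stopped"
        )
        return true   // true tells VoiceOver the gesture was handled here
    }
}
```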


I have tried to describe a few steps you can take to make your iOS apps even more accessible. Of course, different users have different needs and preferences and so may disagree on some of the details, and I encourage VoiceOver users reading this to add their own tips and suggestions in the comments. As mentioned, one way to accommodate people’s differing preferences is to have accessibility-specific settings within the app. But if you’re not sure how to tackle a particular accessibility challenge, please come and join us on AppleVis, and you’ll find a community of users who will be happy to test your app and provide feedback.



Good article, but ...

This is a good article that I agree with for the most part, especially on the parts that deal with the magic tap, rotor actions and having concise labels for buttons. I will add to this that if additional information for a button is required, VoiceOver hints can be used effectively. Below is an explanation of this from the Apple Developer website:

The hint attribute describes the results of performing an action on a control or view. You should provide a hint only when the results of an action are not obvious from the element’s label.

For example, if you provide a Play button in your application, the context in which the button appears should make it easy for users to understand what happens when they tap it. However, if you allow users to play a song by tapping the song title in a list, you might want to provide a hint that describes this result. The reason is that the label of the list item describes the item itself (in this case, the song title), not what happens when a user taps it.
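Following Apple's song-list example, the label and hint in code might look like this; the song title is of course invented:

```swift
import UIKit

// A hypothetical row in a song list. The label describes the item itself;
// the hint describes what happens when the user activates it.
let songCell = UITableViewCell()
songCell.accessibilityLabel = "Here Comes the Sun"
songCell.accessibilityHint = "Plays the song."
```

VoiceOver reads the hint after a pause, and users who find hints redundant can turn them off in settings, which is why essential information belongs in the label, not the hint.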

However, I do take issue with the following paragraph:

Some developers want to invent an entirely new set of gestures or controls for their app, or even make it self-voicing. If you do this, your app won’t be accessible to all users because if your text is not spoken by VoiceOver, it won’t be sent to a Braille display, so deaf-blind users, and those who prefer Braille for any other reason, won’t have access to your app.

Actually, regardless of whether you use VoiceOver to provide speech output or use the TTS in the iPhone, the text will be sent to a Braille display. I tested this in iOS 11.2.1. I do agree, however, that this method has severe limitations for Braille display users, but it still has its advantages, as you will read below.

It will also usually make the app unnecessarily difficult to use, as your users will need to remember the gestures for your app as well as VoiceOver's gestures.

I think "unnecessarily difficult" is an exaggeration. As long as the gestures are intuitive, they should be easy enough to pick up. (A good way to test intuitiveness is to ask for beta testers.) In fact, many people learn VoiceOver gestures through games that employ direct touch. Direct touch also offers advantages in many situations not described in the article, such as extremely detailed audio games and some typing apps.

Finally, I would like to point developers to another resource that also discusses creating very accessible apps.

testing testing 1 2

I agree with your post. I think a dev should try & simulate being blind.

But I also know that they can only do so much with that. I believe in having people with disabilities test your app, and giving them a way to give feedback & responding to it. Weather Gods does an excellent job with this. I think that devs should be willing to be responsive to feedback. I also think that an app is more accessible when that's taken into account right off the bat. Again, Weather Gods is a fabulous example of this. I think that if an app starts out inaccessible & then has accessibility built in later, it can be a pain point. I know that depends on various factors, and every app is different. I'm sorry if I didn't make any sense, but that's the best I can explain it.