Apple has extensive guidelines on best practices for supporting assistive technologies in iOS apps, but this is my cheat sheet.
When designing your app, make sure you review your designs in non-ideal conditions. Take your designs outside, or review them in a brightly lit environment.
Go into Settings > General > Accessibility and try using your app (or prototype) with various assistive features turned on.
If you use standard UIKit elements you get a lot for free, so you might be surprised how much of your app is accessible without you having to do anything at all.
If you have a button that uses an icon instead of a text label, text burnt into an image, or a custom component you have built yourself, then you’ll need to provide at least a label (and maybe a hint).
A label is the equivalent of an ‘alt’ attribute in HTML. A hint is similar, but is only spoken if the person keeps focus on the element without performing an action; it’s a useful way to provide additional detail.
Read more: Apple.com Crafting Useful Labels and Hints.
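For example, an icon-only button can be given both in a couple of lines. This is a minimal sketch; the button purpose and image name are hypothetical:

```swift
import UIKit

// Hypothetical icon-only "compose" button with no visible text.
let composeButton = UIButton(type: .system)
composeButton.setImage(UIImage(named: "pencil-icon"), for: .normal)

// The label says what the element is; the hint says what it does.
composeButton.accessibilityLabel = "Compose"
composeButton.accessibilityHint = "Creates a new message."
```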
Your app displays the characters “10/4”. What do these mean? The 10th of April? The 4th of October? The formula 10 ÷ 4, or its result, 2.5?
With sight, we can quickly interpret formatted data from context, but VoiceOver can struggle in some situations and will end up reading the characters verbatim: “ten slash four”.
You can avoid this by providing a label with less ambiguous text.
NSFormatter and its subclasses provide a bunch of methods that can help with creating appropriate labels for dates, times and numbers.
Read more: NSFormatter on NSHipster.
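As a sketch of the idea, you might show a compact date on screen while giving VoiceOver a fully spelled-out version (the `dateLabel` here is a hypothetical UILabel in your view):

```swift
import UIKit

let date = DateComponents(calendar: .current, year: 2016, month: 4, day: 10).date!
let dateLabel = UILabel()

// What is shown on screen: short and ambiguous, e.g. "4/10/16".
let shortFormatter = DateFormatter()
shortFormatter.dateStyle = .short
dateLabel.text = shortFormatter.string(from: date)

// What VoiceOver speaks: unambiguous, e.g. "April 10, 2016".
let spokenFormatter = DateFormatter()
spokenFormatter.dateStyle = .long
dateLabel.accessibilityLabel = spokenFormatter.string(from: date)
```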
If a table cell has many components, people need to tap each element to have its label read aloud. For cells with a lot of components this is not only tedious; small elements can also make it difficult to hit the right target area.
Instead, you can add a label (and hint) to the table cell itself that speaks for the entire content of the cell.
Table cells (and collection cells) are the structural foundation of most apps, so this simple addition could make your app a lot easier to use for people relying on VoiceOver.
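A minimal sketch of the technique, assuming a hypothetical cell with three separate labels:

```swift
import UIKit

// Hypothetical cell showing a contact's name, unread count and timestamp.
class ContactCell: UITableViewCell {
    let nameLabel = UILabel()
    let unreadLabel = UILabel()
    let timeLabel = UILabel()

    func configureAccessibility() {
        // Treat the whole cell as a single accessible element…
        isAccessibilityElement = true
        // …and speak one combined label instead of three fragments.
        accessibilityLabel = [nameLabel.text, unreadLabel.text, timeLabel.text]
            .compactMap { $0 }
            .joined(separator: ", ")
    }
}
```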
There is no native checkbox component in UIKit so developers make do with using the selected state of buttons or table cells to simulate this behavior.
Make sure that, when VoiceOver is enabled, it accurately reports the state of your element. If it doesn’t, you can add a trait to the element (in much the same way a label or hint is added) so that the selected state is spoken aloud when the person is using it.
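Here is a sketch of how that might look for a button simulating a checkbox (this uses the OptionSet-style `accessibilityTraits` API from newer Swift versions; the button and helper are hypothetical):

```swift
import UIKit

// Hypothetical button acting as a checkbox.
let checkboxButton = UIButton(type: .custom)

func setChecked(_ checked: Bool) {
    checkboxButton.isSelected = checked
    // Keep the accessibility traits in sync with the visual state
    // so VoiceOver announces "selected" for checked items.
    if checked {
        checkboxButton.accessibilityTraits.insert(.selected)
    } else {
        checkboxButton.accessibilityTraits.remove(.selected)
    }
}
```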
Accessibility isn’t about checking off requirements, and an app that meets a checklist of requirements may not actually be any easier for people to use.
For example, focusing on the W3C WAI guidelines for an iOS app may divert resources that would have been better spent on implementing native features that iOS offers³.
If you’re really serious about accessibility, then the best approach would be to let people who use assistive technologies try it out for themselves. Search for a local meetup or community group and see if they can help give you feedback, and don’t forget to ask them to show you their favorite apps!
Even when you might not expect it would be possible: with VoiceOver enabled, the Camera app will tell you when a face is detected in the frame to help you compose pictures. The Photos app reads metadata about the photo (“bright photo of person taken on …”) ↩
To be fair to Google, the Android team also provide a lot of good tools, but Apple is still miles ahead. ↩
No disrespect to the W3C, who have long promoted accessible technologies, but many of the standards don’t necessarily make sense for iOS. For example, the WAI text-contrast guidelines were developed for HTML that will be presented on hardware of unknown specifications (different screen calibration, color profile) and with an expectation of screens that show an equivalent of 72dpi. Retina iPhones, on the other hand, have 326dpi, and we can expect consistency across hardware. I reached out to the Apple Accessibility Division for their opinion, and their recommendation for contrast is to check that there is at least a 50% difference between text foreground and background when converted to greyscale. ↩
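That greyscale check could be sketched like this (a rough interpretation, assuming the standard luminance weights for the RGB-to-greyscale conversion):

```swift
import UIKit

// Convert a color to a greyscale value in 0…1 using the common
// luminance weights (an assumption; Apple's exact method isn't specified).
func greyscaleValue(of color: UIColor) -> CGFloat {
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    color.getRed(&r, green: &g, blue: &b, alpha: &a)
    return 0.299 * r + 0.587 * g + 0.114 * b
}

// At least a 50% difference between foreground and background.
func hasSufficientContrast(text: UIColor, background: UIColor) -> Bool {
    return abs(greyscaleValue(of: text) - greyscaleValue(of: background)) >= 0.5
}
```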