What feels trite to say year after year after year is also true.
Many of the new functionalities Apple announced this week at the company’s annual WWDC keynote have serious ramifications for accessibility. Study Apple carefully long enough and it’s not hard to understand why; not only is this a reflection of their institutional commitment to the disability community, it also underscores the idea that accessibility, conceptually and pragmatically, is not a domain solely for disabled people. Although accessibility software should (and always will) prioritize people with disabilities first and foremost, you needn’t have a disability to reap benefits from larger text on your iPhone. Accessibility is inclusive of everyone, regardless of ability.
Following last month’s unveiling of new discrete accessibility features, Apple on Monday showed off a slew of mainstream, marquee features spanning its five operating systems—iOS, iPadOS, watchOS, macOS, and tvOS—that are eminently useful as de facto accessibility features, whether you’re disabled or not.
Live Text in Photos. Apple describes Live Text as a feature that “intelligently unlocks rich and useful information in images” like a phone number or email address that you can act on straight away. It’s yet another example of leveraging artificial intelligence and machine learning, paired with data detection technology, to augment reality in truly helpful ways. One way is, as ever, accessibility.
There are two prime benefits here. First, it’s often difficult for someone who’s Blind or low vision to read a sign or handwritten note in a photo, even zoomed in all the way. There’s enough pixelation (in addition to small font or illegible writing) in the photo that it makes parsing a number or email address difficult if not impossible. Hence, the grunt work Live Text does eliminates that visual friction; what’s more, the text can be read aloud by VoiceOver or enlarged with Dynamic Type, since Live Text works at a system level, on device.
Second, the contextual menu to call or email someone, for example, is a boon in terms of reducing cognitive load. A person with atypical cognition, for instance, doesn’t need to copy the phone number, find the Phone app, paste the number, and hit Dial. All they need to do is tap Call right from the photo and the system initiates the call. More than immediacy or convenience, Live Text’s data detectors consolidate a multi-step process into a single step. All told, the concept is almost exactly the value proposition of Shortcuts in iOS.
Conversation Boost in AirPods. One of the many new features slated for AirPods later this year, Conversation Boost is a close cousin of the Live Listen feature that debuted a few years ago. Where Live Listen helps direct sound in environments like restaurants and lecture halls, Conversation Boost amplifies the voice of someone talking to you. Apple says the feature was designed “for people with hearing challenges” but of course it can benefit anyone.
Apple is rightfully treading carefully with the hearing augmentation features in AirPods—their usual disclaimer is to consult an audiologist and get a bona fide hearing aid if need be—but these features are nonetheless impressive. AirPods essentially are computers for your ears that optionally give you “bionic ears” with just a tap.
Digital Keys in Wallet. Later this year, Apple will add support for digital identification cards and house keys in the Wallet app. The big picture idea is to have your iPhone act as your wallet and key(s), meaning it obviates the need to carry around a physical wallet and/or keychain.
From an accessibility perspective, this enhancement is notable because of the friction associated with finding and using said items. To wit, a disabled person waiting to pass through security at the airport has to fumble around in their bag or pocket for their ID; this may be cumbersome depending on the level of their fine-motor skills. (Not to mention the cognitive load of remembering where you last put it.) The same goes for house keys—having to reach into a bag or pocket for your keys, find the right one, and successfully insert it into the lock can be the motor equivalent of climbing Kilimanjaro. Both scenarios are examples of seemingly mundane tasks that most take for granted, when the reality is they are far from mindless or easy for many people.
Thus, digital representations of IDs and keys are not only more modern, they can be infinitely more accessible. TSA checkpoints can be extremely stressful for disabled people as it is—I speak from experience—so having to just wave your phone at the official is much better than hurriedly and clumsily trying to find your ID in your wallet or bag. And the experience is even more accessible if you have an Apple Watch.
Login with Biometrics on tvOS. New in tvOS this year is the ability to log into streaming apps with Face ID or Touch ID. This should be much more accessible (and convenient) than having to visit some website or scan a QR code. Even for an abled person, there’s a lot of cognitive load associated with signing into Netflix or another service when setting up a new or restored Apple TV, not to mention the motor movement of typing or pasting in passwords. The fact that you can authenticate using your iPhone is smart—and long overdue.
Spatial Audio in Apple Music. While Spatial Audio is not an accessibility feature per se, it has an accessibility component worth mentioning.
Many users with some degree of hearing loss, myself included, most likely won’t be able to appreciate the fullness of the sound Spatial Audio is supposed to deliver. Likewise, the feature is a no-go for those with sensory conditions associated with sound and/or vibrations.
This is nothing Apple needs to accommodate—you can disable Spatial Audio in Settings and not use it—but it nonetheless merits mention.