A product’s success depends on the degree to which it can touch people’s lives. With each shiny new Apple product comes a host of accessibility features that are worth discussing on International Day of Persons with Disabilities.
In fact, Apple has released a powerful new short film that puts into perspective the potential of accessible technology across the company's products. It plays out to the soundtrack "I Am the Greatest" by Spinifex Gum with Marliya Choir, which sets lyrics from the speeches of Muhammad Ali. The features are presented through seven people with disabilities who go about celebrating life with a little help from technology such as AssistiveTouch and Door Detection on iPhone, Sound Recognition notifications on Apple Watch, and Speak Selection on iPad, crushing barriers with confidence. Let’s look at a few accessibility solutions from Apple.
The Accessibility Menu
The iOS accessibility menu, found under Settings > Accessibility, has four categories: Vision (features that help you see the screen better or have what’s on it described aloud), Physical and Motor (features that let you control the iPhone using your voice or physical movements), Hearing (features that help you hear your iPhone better and create visual aids to accompany sounds) and General (miscellaneous features).
Vision accessibility features
VoiceOver on iPhone: Enabling the feature makes your phone read aloud any text you tap on your screen (or a description, where text may not apply) and adds touch gestures for certain functions. Through the VoiceOver settings screen you can connect and configure an external Braille display. This year, the VoiceOver screen reader gained support for more than 20 additional languages, including Bengali, Bulgarian, Catalan, Ukrainian and Vietnamese.
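For developers, VoiceOver works best when apps describe their controls explicitly. A minimal sketch of how that is done with UIKit's accessibility properties (the view controller and button names here are illustrative, not from any real app):

```swift
import UIKit

// Hypothetical screen showing how an app describes a control to VoiceOver.
final class PlaybackViewController: UIViewController {
    private let playButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        // VoiceOver speaks the label when the user touches the button,
        // and the hint after a short pause.
        playButton.isAccessibilityElement = true
        playButton.accessibilityLabel = "Play"
        playButton.accessibilityHint = "Starts audio playback"
        playButton.accessibilityTraits = .button
        view.addSubview(playButton)
    }
}
```

Apps that set these labels are what allow VoiceOver to read "a description thereof" for elements that carry no visible text, such as icon-only buttons.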
Door Detection: The Magnifier app, Apple’s digital magnifying glass for enlarging objects in the camera’s view, has been upgraded in iOS 16. New functions help people who are blind or have low vision use their iPhones to detect doors and people nearby, as well as identify and describe objects and surroundings. Door Detection can help users locate a door upon arriving at a new destination, understand how far they are from it, and describe door attributes, such as whether it is open or closed and, when closed, whether it can be opened by pushing, turning a knob, or pulling a handle. It can also read signs and symbols around the door, like the room number at an office or the presence of an accessible entrance symbol. The feature combines the power of LiDAR, the camera, and on-device machine learning.
Zoom: Magnifies the entire screen or separate windows. In the short video we see it being used with Logic on Mac.
Mobility accessibility features
AssistiveTouch on iPhone: Helps users with upper body limb differences control their devices. On iPhone and iPad, AssistiveTouch shows an onscreen button for adjusting volume, taking a screenshot, locking the device or simulating a shake, all without touching a physical button. It also allows users to control their Apple Watch with gestures such as a pinch or a clench, without having to tap the screen.
Voice Control on iPhone: The feature lets users speak voice commands to navigate and control their devices. Those with physical and motor disabilities can also use facial expressions to control their iPhone, iPad or Mac.
Alternate pointer controls on Mac, including head tracking and facial expressions: Facial expressions such as sticking out your tongue, raising your eyebrows, scrunching your nose or blinking can simulate pointer actions, like clicking with a mouse or trackpad.
Hearing accessibility features
Sound Recognition on iPhone, with an Apple Watch notification: The accessibility feature lets your device listen for certain important sounds and notify you when it detects one. The list can include alarms (fire, siren, smoke), animals (cat, dog), household sounds (appliances, car horn, doorbell, door knock, glass breaking, kettle, water running) and people (baby crying, coughing, shouting).
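Apple exposes the same kind of on-device sound classification to developers through the SoundAnalysis framework. A minimal sketch of listening to the microphone with the built-in classifier (error handling and audio-session setup trimmed for brevity; this illustrates the public API, not Apple's internal Sound Recognition implementation):

```swift
import AVFoundation
import SoundAnalysis

// Streams microphone audio through Apple's built-in sound classifier,
// which covers everyday sounds like sirens, dog barks and glass breaking.
final class SoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        // .version1 selects the built-in classifier shipped with the OS.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }

    // Called each time the classifier produces a result for a window of audio.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}
```

Because classification runs entirely on-device, no audio leaves the phone, which is also what lets Sound Recognition work offline.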
Cognitive accessibility features
Spoken Content on iPad: Have the iPad speak the entire screen, selected text and typing feedback aloud. The feature is also available on iPhone.
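The text-to-speech capability behind features like Spoken Content is available to developers through AVFoundation. A minimal sketch (the sentence spoken here is just an example):

```swift
import AVFoundation

// Speaks a string aloud using the system speech synthesizer.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(
    string: "Accessibility settings are under Settings, then Accessibility."
)
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
utterance.rate = AVSpeechUtteranceDefaultSpeechRate
synthesizer.speak(utterance)
```

The voice and speaking rate are adjustable, which mirrors the controls users see in the Spoken Content settings.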