Apple is once again celebrating Global Accessibility Awareness Day (GAAD) by offering a preview of upcoming software features designed to enhance cognitive, vision, hearing, and mobility accessibility. These features, scheduled for release later this year, aim to empower individuals with disabilities and make it easier for them to interact with technology and the physical world.
Among the upcoming features, Point and Speak will be a welcome addition for individuals with vision disabilities. The feature, joining People Detection, Door Detection, and Image Descriptions in the Magnifier app, relies on the LiDAR Scanner available on select iPhone and iPad models. Combining input from the Camera app, the LiDAR Scanner, and on-device machine learning, Point and Speak announces the text on buttons as users move their finger across them. For example, while using a household appliance such as a microwave, Point and Speak will read aloud the text on each button as the user's finger moves across the keypad.
Point and Speak may sound familiar to those who have used or heard of the recently released VizLens app, which offers similar functionality called 'Live Camera Interaction.' However, Point and Speak distinguishes itself by using the LiDAR Scanner to more accurately determine the position of the user's finger relative to the object being pointed at. This improved accuracy could make for a more precise and reliable experience, but we will have to wait and see whether it translates into any meaningful difference in performance between Point and Speak and VizLens.
It is worth noting that Point and Speak currently focuses solely on text recognition and does not identify the graphical symbols typically found on appliance controls. Nor does it identify objects themselves when pointed at. Point and Speak is also unlikely to be the best choice for real-time reading of text on items like food packaging, or for longer text passages.
Point and Speak will be available on iPhone and iPad devices with the LiDAR Scanner in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese, and Ukrainian.
As Apple continues to expand the capabilities of its Magnifier app, the question arises whether these advancements hint at inclusion in the highly anticipated mixed-reality headset expected to be announced next month at Apple's Worldwide Developers Conference (WWDC).
In addition to Point and Speak, Apple has revealed other improvements for individuals with low vision and for VoiceOver users. Users with low vision will benefit from easier Text Size adjustment across Mac apps, including Finder, Messages, Mail, Calendar, and Notes, allowing them to set text to a size that works for them.
VoiceOver users will enjoy more natural and expressive Siri voices, even at higher rates of speech feedback. Users will also be able to customize the rate at which Siri speaks, with options ranging from 0.8x to 2x, allowing VoiceOver users to tailor the listening experience to their needs.
Notably, there is no mention of new features or enhancements specifically for Braille users. The number of new features for blind and low vision users also appears smaller than in previous previews, raising the question of whether Apple has prioritized addressing longstanding issues over introducing new capabilities.
Other features previewed by Apple include Assistive Access, which provides a customized experience for various apps, including Phone, FaceTime, Messages, Camera, Photos, and Music, to lighten cognitive load. These apps feature high-contrast buttons and large text labels for improved accessibility. The feature also offers tools for trusted supporters to personalize the experience to the individual's needs. For instance, Messages includes an emoji-only keyboard and the ability to record video messages for users who prefer visual communication. Users and their supporters can choose between a visually oriented, grid-based Home Screen layout and a text-focused, row-based layout.
The new features also include innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. For example, Live Speech on iPhone, iPad, and Mac allows users to type what they want to say and have it spoken aloud during phone and FaceTime calls as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversations with family, friends, and colleagues. Live Speech is designed to support the millions of people globally who are unable to speak or who have lost their speech over time.
Another speech accessibility feature is Personal Voice, a simple and secure way to create a voice that sounds like the user. Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
Apple says that "for users at risk of losing their ability to speak—such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability—Personal Voice is a simple and secure way to create a voice that sounds like them."
"At the end of the day, the most important thing is being able to communicate with friends and family," said Philip Green, board member and ALS advocate at the Team Gleason nonprofit who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018. "If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world—and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary."
Personal Voice can be created using iPhone, iPad, and Mac with Apple silicon and will be available in English.
It does not currently appear that VoiceOver users will be able to use a Personal Voice as a custom text-to-speech (TTS) voice for VoiceOver. This is an intriguing area for potential development, and we encourage Apple to explore the possibility.
Additional features previewed by Apple include:
- Deaf or hard of hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.
- Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like do, due, and dew. Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
- Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
- Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
More Apple Celebrations of Global Accessibility Awareness Day:
- SignTime will launch in Germany, Italy, Spain, and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters. The service is already available for customers in the U.S., Canada, U.K., France, Australia, and Japan.
- Select Apple Store locations around the world are offering informative sessions throughout the week to help customers discover accessibility features, and Apple Carnegie Library will feature a Today at Apple session with sign language performer and interpreter Justina Miles. And with group reservations — available year-round — Apple Store locations are a place where community groups can learn about accessibility features together.
- Shortcuts adds Remember This, which helps users with cognitive disabilities create a visual diary in Notes for easy reference and reflection.
- This week, Apple Podcasts will offer a collection of shows about the impact of accessible technology; the Apple TV app is featuring movies and series curated by notable storytellers from the disability community; Apple Books will spotlight Being Heumann: An Unrepentant Memoir of a Disability Rights Activist, the memoir by disability rights pioneer Judith Heumann; and Apple Music will feature cross-genre American Sign Language (ASL) music videos.
- This week in Apple Fitness+, trainer Jamie-Ray Hartshorne incorporates ASL while highlighting accessibility features available to users, part of an ongoing effort to make fitness more accessible to all. These include Audio Hints, short descriptive verbal cues that support users who are blind or have low vision, and Time to Walk and Time to Run episodes that become “Time to Walk or Push” and “Time to Run or Push” for wheelchair users. Additionally, Fitness+ trainers incorporate ASL into every workout and meditation, all videos include closed captioning in six languages, and trainers demonstrate modifications in workouts so users at different levels can join in.
- The App Store will spotlight three disability community leaders — Aloysius Gan, Jordyn Zimmerman, and Bradley Heaven — each of whom will share their experiences as nonspeaking individuals and the transformative effects of augmentative and alternative communication (AAC) apps in their lives.
"At Apple, we've always believed that the best technology is technology built for everyone," said Tim Cook, Apple's CEO. "Today, we're excited to share incredible new features that build on our long history of making technology accessible so that everyone has the opportunity to create, communicate, and do what they love."
Apple has not provided specific release dates for the upcoming features; however, they are anticipated to be included in the next versions of its operating systems, such as iOS 17, iPadOS 17, macOS 14, watchOS 10, and tvOS 17, which are expected to launch this fall.
We'd love to hear your thoughts on the upcoming accessibility features from Apple! Feel free to share your opinions with us by leaving a comment below.