iPhone will soon let users with speech disabilities talk in their own voice
Apple has announced new features for users with cognitive, vision, and hearing disabilities ahead of Global Accessibility Awareness Day. Key features coming to iPhones include "Assistive Access," "Personal Voice," and "Point and Speak in Magnifier." Apple is also rolling out additional software features, curated collections, and more for select regions. The company says the new tools draw on advances in hardware and software, including on-device machine learning, to ensure user privacy.
The most significant feature is Personal Voice, an advance speech accessibility tool for users at risk of losing their ability to speak, such as those recently diagnosed with ALS (amyotrophic lateral sclerosis) or other conditions that can progressively affect speech. The tool lets users talk in a voice that sounds like their own through the iPhone. In a blog post, Apple explains: "Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users' information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones."
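For a sense of how this fits together in code, the Swift sketch below is a rough illustration, assuming the iOS 17 speech synthesis hooks (`AVSpeechSynthesizer.requestPersonalVoiceAuthorization` and the `isPersonalVoice` voice trait), not the user-facing setup flow described above: it asks permission to use a Personal Voice and, if granted, speaks a typed phrase with it, roughly mirroring the Live Speech integration.

```swift
import AVFoundation

// Keep the synthesizer alive; deallocating it mid-utterance stops playback.
let synthesizer = AVSpeechSynthesizer()

/// Speaks a typed phrase aloud, preferring the user's Personal Voice
/// when one exists and the app has been granted access to it.
func speakTypedPhrase(_ text: String) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        let utterance = AVSpeechUtterance(string: text)
        if status == .authorized {
            // Personal Voices appear alongside the system voices,
            // flagged with the .isPersonalVoice trait.
            utterance.voice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }
        }
        // Falls back to the default system voice when no Personal
        // Voice is available or access was denied.
        synthesizer.speak(utterance)
    }
}

// Example: a Live Speech-style typed phrase spoken during a call.
speakTypedPhrase("I'll be there in ten minutes.")
```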
In addition to Personal Voice, Apple is bringing Live Speech to iPhone, iPad, and Mac for users who are unable to speak or have a speech disability. Users can type what they want to say and have it spoken aloud during FaceTime and phone calls, as well as in in-person conversations.

Assistive Access, meanwhile, is designed for users with cognitive disabilities. The tool distills apps to their essential features, reducing cognitive load and helping users pick the option they need. For example, for users who prefer to communicate visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and their trusted supporters can also choose between a more visual, grid-based layout for the Home Screen and apps, or a row-based layout for users who prefer text.
Assistive Access on iPhone and iPad offers a simple interface with high-contrast buttons and large text labels. For iPhones with a LiDAR scanner, a new Point and Speak mode in Magnifier will let users who are blind or have low vision interact with physical objects that carry several text labels. Apple says Point and Speak combines input from the camera, the LiDAR scanner, and on-device machine learning to announce the text on each button as users move their finger across a keypad, such as the keypad on a household appliance (a simplified code sketch of the idea appears at the end of this article).

Beyond the new tools, Apple will launch SignTime in Germany, Italy, Spain, and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters. Select Apple Store locations worldwide will also offer information sessions throughout the week to help customers discover accessibility features.
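Point and Speak itself is a built-in Magnifier mode rather than a developer API, but the recognize-then-speak pipeline Apple describes can be approximated with public frameworks. The Swift sketch below is a rough illustration, not Apple's implementation: it recognizes printed text in a camera frame on-device with Vision's `VNRecognizeTextRequest` and reads the result aloud with `AVSpeechSynthesizer`, omitting the LiDAR-based finger tracking entirely.

```swift
import Vision
import AVFoundation

let speaker = AVSpeechSynthesizer()

/// Recognizes printed text in a camera frame on-device and reads it aloud.
/// A real finger-tracking flow would first crop the frame to the region
/// under the user's fingertip; that step is omitted here.
func announceText(in frame: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Keep the top candidate for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        speaker.speak(AVSpeechUtterance(string: lines.joined(separator: ", ")))
    }
    request.recognitionLevel = .accurate      // favor accuracy over latency
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])           // all recognition runs on-device
}
```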