Apple announces new accessibility features, including Eye Tracking, Vocal Shortcuts and more
6 min read · May 16, 2024
Apple today announced new accessibility features that will be available later this year, including Eye Tracking, Music Haptics and Vocal Shortcuts.
- Eye Tracking requires no additional hardware, and all data is stored on the device for privacy and security.
- Music Haptics provides haptic feedback for music through the Taptic Engine.
- Vocal Shortcuts lets users create custom voice commands to launch shortcuts and complete tasks.
- Vehicle Motion Cues helps reduce motion sickness during car rides.
- CarPlay gains Voice Control, Sound Recognition, and visual aids to enhance ease of use.
- visionOS and other assistive features further enhance the accessibility experience for Apple devices.
Key Highlights
- Eye Tracking: Powered by artificial intelligence, Eye Tracking gives users a built-in option to navigate iPad and iPhone using only their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control the feature is kept securely on the device and is not shared with Apple.
- Featured: Eye Tracking gives users with physical disabilities a new way to control their iPad or iPhone with their eyes.
- Technical Implementation: Uses the front-facing camera for setup and calibration; on-device machine learning keeps the data on the device and ensures it is not shared with Apple.
- Application scope: This feature works across iPadOS and iOS apps and requires no additional hardware or accessories. Users can navigate app elements and activate them with Dwell Control, accessing additional functions such as physical buttons, swipes, and other gestures using only their eyes.
- Music Haptics: Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations to the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog and will be available as an API for developers to make music more accessible in their apps.
- Featured: Music Haptics offers users who are deaf or hard of hearing a new way to experience music through the iPhone’s Taptic Engine.
- Technical Implementation: Taptic Engine generates taps, textures, and vibration feedback during music playback.
- Application scope: This feature is available for millions of songs in Apple Music and will be available as an API for developers to add the tactile experience of music to their apps.
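Apple has not yet published the Music Haptics developer API mentioned above. As a rough illustration of how haptic feedback is generated on iPhone today, the existing Core Haptics framework can play a transient “tap” of the kind the Taptic Engine produces; the function below uses standard Core Haptics calls, but its use as a stand-in for music haptics is an assumption:

```swift
import CoreHaptics

// Illustrative sketch only: play one transient haptic "tap" with Core Haptics.
// The announced Music Haptics API may look different; this merely shows how
// taps and textures are produced by the Taptic Engine today.
func playTap() throws {
    let engine = try CHHapticEngine()   // requires a device with a Taptic Engine
    try engine.start()

    // A short, sharp transient event, similar to a single tap in a pattern.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

A music-synchronized experience would schedule many such events at times derived from the audio, which is presumably what the Music Haptics API will handle for developers automatically.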
- Vocal Shortcuts: With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks.
- Featured: Users can create custom utterances that Siri understands to launch shortcuts and complete complex tasks.
- Technical Implementation: On-device machine learning recognizes the speech patterns of users with conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke.
- Application Scope: This feature builds on functionality introduced in iOS 17 to provide a new level of customization and control for users who are nonspeaking or have limited speech.
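Vocal Shortcuts triggers shortcuts the user already has; apps expose such actions to the Shortcuts system through the App Intents framework. A minimal sketch of an intent that a custom utterance could launch follows; the intent name and behavior are hypothetical, not part of Apple’s announcement:

```swift
import AppIntents

// Hypothetical example: an app action exposed to the Shortcuts system.
// A user could assign a custom utterance in Vocal Shortcuts to run it.
struct OpenReadingListIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Reading List"

    func perform() async throws -> some IntentResult {
        // Navigate to the app's reading-list screen here.
        return .result()
    }
}
```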
- Vehicle Motion Cues: Vehicle Motion Cues is a new experience for iPhone and iPad designed to help reduce motion sickness for passengers in moving vehicles. Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using their iPhone or iPad while riding in a car. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion, helping to reduce the sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to display automatically on iPhone, or turned on and off in Control Center.
- Featured: Vehicle Motion Cues helps alleviate motion sickness for passengers using iPhone or iPad in a moving vehicle by displaying animated dots on the edges of the screen.
- Technical Implementation: Uses the device’s built-in sensors to recognize vehicle motion and animate the cues accordingly.
- Application: Users can set this feature to display automatically or turn it on and off in Control Center.
- CarPlay Update: Upcoming CarPlay accessibility features include Voice Control, Color Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps using only their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. For colorblind users, Color Filters make the CarPlay interface visually easier to use, along with additional visual aids such as Bold Text and Large Text.
- Features: CarPlay adds Voice Control, Color Filters, and Sound Recognition. Sound Recognition notifies users who are deaf or hard of hearing of car horns and sirens.
- Technical Implementation:
- Voice Control: Users can navigate CarPlay and control apps using only their voice.
- Color Filters: Help colorblind users use the CarPlay interface more easily.
- Sound Recognition: Notifies users who are deaf or hard of hearing of car horns and sirens.
- Applications: These features enhance convenience and safety while driving and riding in the car.
- visionOS Update
- Featured: visionOS will introduce systemwide Live Captions to help users who are deaf or hard of hearing follow along with spoken dialogue in live conversations and in audio from apps.
- Technical implementation: Captions can be repositioned using the window bar in Apple Immersive Video, and visionOS will support additional Made for iPhone hearing devices and cochlear implant processors.
- Application scope: These features add to the accessibility experience of visionOS devices.
- Other Accessibility Updates
- VoiceOver: Adds new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts.
- Magnifier: Adds a new Reader Mode and the option to easily launch Detection Mode with the Action button.
- Braille Users: Introduces quick controls and text editing in Braille Screen Input, with support for multi-line braille and Japanese braille input.
- Low Vision Users: The Hover Typing feature displays larger text when typing in a text field, in the user’s preferred font and color.
- Personal Voice: Will be available in Mandarin Chinese; users at risk of losing their ability to speak can create a Personal Voice using shortened phrases.
- Live Speech: Includes categories and simultaneous compatibility with Live Captions.
- Virtual Trackpad: Lets users control their device using a small region of the screen as a resizable trackpad.
- Switch Control: Adds the option to use the iPhone and iPad cameras to recognize finger tap gestures as switches.
- Voice Control: Adds support for custom vocabularies and complex words.