Apple Announces AI-Powered Eye Tracking, Music Haptics and Other New Accessibility Features

On Wednesday, Apple announced several new accessibility features for iPhone and iPad. The company regularly introduces such features to make its devices easier to use for people with disabilities. This year, it is introducing Eye Tracking, which lets users control their device with eye movements alone; Music Haptics, which lets users feel music through vibration; and Vocal Shortcuts, which lets users trigger tasks with custom sounds.

The features were announced in a post on Apple's Newsroom. Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives, said: "Every year we open new avenues for accessibility. These new features will impact the lives of a wide range of users by providing new ways to communicate, control their devices and move through the world."

First, the Eye Tracking feature gives users a built-in option to control their iPhone or iPad with eye movements alone. Powered by artificial intelligence (AI), it uses the front-facing camera, which is calibrated to the user's eyes, and on-device machine learning (ML) to track gaze, so that people with physical disabilities can navigate the device. Apple says the data used to set up and control the feature stays on the device and is not accessible to the company.

Music Haptics is another new feature, offering a new way for users who are deaf or hard of hearing to experience music. On iPhone, it uses the Taptic Engine to play taps, textures and refined vibrations synchronized to the audio. Apple says the feature works across millions of songs in the Apple Music catalog, and it will also be available as an API so developers can make music in their own apps more accessible.

Next, Vocal Shortcuts for iPhone and iPad is designed to help people with speech disorders. It lets users assign custom utterances that Siri can recognize to launch shortcuts and perform tasks. Additionally, a new feature called Vehicle Motion Cues displays animated dots along the edges of the screen to reduce the sensory conflict between what a person sees and what they feel. Citing research, Apple said this conflict is a leading cause of motion sickness, and the feature may help reduce its symptoms.

Beyond that, CarPlay is gaining Voice Control, Sound Recognition and Color Filters to assist users with various disabilities, and Apple's newest product, the Vision Pro, is getting systemwide Live Captions for users who are deaf or hard of hearing.
