Apple's new eye tracking

Apple has just announced a slew of new accessibility features coming soon to iPhones and iPads. The most remarkable of these is eye tracking. Apple says the feature will use AI to let people with physical disabilities navigate iOS and iPadOS more easily using simple eye gestures alone.

This accessibility feature is likely to debut in iOS and iPadOS 18. However, Apple will only say that it is coming to its devices “later this year.”

The eye tracking feature “uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on the device, and isn’t shared with Apple.” 

The company says eye tracking is designed to work across iOS and iPadOS apps without requiring any extra hardware or accessories.

And that’s not all.

The new Music Haptics feature will let those who are deaf or hard of hearing “experience music on their iPhone” by producing a range of vibrations, taps, and other effects in rhythm with millions of tracks on Apple Music. Apple says developers will also be able to add the feature to their own apps through a new API.
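Apple hasn't published that API yet, so it's unclear what it will look like. But the general idea of playing tap-like effects in rhythm with music is something third-party apps can already approximate with the existing Core Haptics framework. The sketch below is an illustration using Core Haptics, not the forthcoming Music Haptics API; the class name, beat timings, and intensity values are invented for the example.

```swift
import CoreHaptics

// Illustrative sketch only: this uses the existing Core Haptics framework
// to show the general idea of rhythm-synced taps. It is not the new
// Music Haptics API, which Apple has not yet documented.
final class BeatHapticsPlayer {
    private var engine: CHHapticEngine?

    init?() {
        // Haptics are unavailable on some hardware (e.g. most iPads today).
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return nil }
        engine = try? CHHapticEngine()
    }

    /// Plays one sharp transient tap per beat timestamp (in seconds).
    func playTaps(atBeats beats: [TimeInterval]) throws {
        guard let engine else { return }
        let events = beats.map { time in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
                ],
                relativeTime: time
            )
        }
        try engine.start()
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }
}

// Hypothetical usage: a tap on every beat of a 120 BPM track for two bars
// (8 beats, one every half second).
let beats = Array(stride(from: 0.0, to: 4.0, by: 0.5))
try? BeatHapticsPlayer()?.playTaps(atBeats: beats)
```

Whatever form the real API takes, Apple's description suggests it will handle the hard part this sketch glosses over: deriving the beat and texture data from the track itself, rather than requiring developers to supply timings by hand.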

The company’s full press release has much more information on all the accessibility capabilities that will be coming to its devices in the next couple of months.

References to AI and machine learning appear throughout the announcement, offering yet more confirmation that iOS 18, iPadOS 18, and the company's other software platforms will go heavy on AI-powered features. Apple is reportedly in discussions with both OpenAI and Google about collaborating on some generative AI functionality.
