
Apple set to bring eye tracking control to iPad, iPhone

The iPad isn’t named for the eyes, but later this year, that’s exactly how you’ll be able to control an iPad or even an iPhone.

Even though tablets are made to be touched, prodded, and generally used with the fingers, not everyone can use these gadgets with their hands. So as part of Global Accessibility Awareness Day recently, Apple has been talking up what’s coming to the iPad, and it could be interesting for everyone.

Specifically, it’s about how you should soon be able to control a phone or tablet using only your eyes. And that’s not all.

As part of Apple’s ever-evolving efforts in accessibility, the company plans to roll out several ways to improve how people use iOS and iPadOS that won’t necessarily involve touch, even if the devices were initially built around touch.

Eye tracking will use AI via on-device machine learning to watch how long your gaze lingers on parts of the screen, with “Dwell Control” allowing on-screen elements to be triggered simply by looking at them. Apple says this will also extend to swipes and gestures, with physical button presses also able to be triggered using eye control in the software.
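Apple hasn’t said exactly how Dwell Control decides when a glance becomes a press, but the basic idea of a dwell timer is simple enough to sketch. The Swift snippet below is purely illustrative, with a made-up GazeDwellTracker type and a one-second threshold we’ve assumed, rather than anything from Apple’s implementation:

```swift
import Foundation
import CoreGraphics

// Conceptual sketch only: fire an action when a gaze point stays inside an
// element's frame for a set duration. The type name and threshold are
// hypothetical; Apple hasn't published how Dwell Control actually works.
final class GazeDwellTracker {
    private let dwellThreshold: TimeInterval
    private var dwellStart: Date?
    private var currentTarget: CGRect?

    init(dwellThreshold: TimeInterval = 1.0) {
        self.dwellThreshold = dwellThreshold
    }

    /// Call with each new gaze sample; returns true once the dwell completes.
    func update(gazePoint: CGPoint, over elementFrame: CGRect, at now: Date = Date()) -> Bool {
        guard elementFrame.contains(gazePoint) else {
            // Gaze left the element: reset the timer.
            dwellStart = nil
            currentTarget = nil
            return false
        }
        if currentTarget != elementFrame {
            // Gaze moved to a new element: start timing from scratch.
            currentTarget = elementFrame
            dwellStart = now
            return false
        }
        guard let start = dwellStart else { return false }
        return now.timeIntervalSince(start) >= dwellThreshold
    }
}

// Example: feeding a simulated gaze sample to the tracker.
let tracker = GazeDwellTracker(dwellThreshold: 1.0)
let button = CGRect(x: 100, y: 100, width: 80, height: 44)
let triggered = tracker.update(gazePoint: CGPoint(x: 120, y: 120), over: button)
print(triggered) // false until the gaze has dwelled long enough
```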

Also coming is something called “Vocal Shortcuts”, a concept that will let Siri understand sounds being uttered to launch apps and features. The feature has been designed for people with conditions that affect speech, including cerebral palsy and stroke, and will use AI to help improve how people can use their phone and tablet.

For those who can’t hear, Apple will support a feature called “Music Haptics”, which will play taps, vibrating textures, and refined vibrations to match music played through Apple Music. We’re not quite sure how this will roll out on individual tracks, but Apple says it will work across the millions of songs available on Apple Music, and an API will be made available for developers to use the technology in their apps.
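We haven’t seen the Music Haptics API itself yet, so take the snippet below as a rough sketch rather than the real thing: it uses Apple’s existing Core Haptics framework to play a few tap-style pulses, which at least shows the kind of output developers can already produce on an iPhone’s Taptic Engine.

```swift
import CoreHaptics

// Illustrative only: Apple hasn't detailed the Music Haptics API, but Core
// Haptics already lets apps play tap-style patterns of the sort described.
func playSimpleTapPattern() throws {
    // Check the device actually has a haptics-capable engine.
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Three sharp transient taps, spaced a quarter of a second apart.
    let events = (0..<3).map { index in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: TimeInterval(index) * 0.25
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

How Apple maps an entire song to patterns like these is the part it hasn’t explained yet.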

These are just some of the ways Apple will be improving accessibility features, with upgrades to Braille Screen Input, as well as Hover Typing, which shows larger text when typing in a text field, among others.

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Senior Director of Global Accessibility Policy and Initiatives at Apple.

“These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world,” she said.

The announcement also covers using phones in cars, but not necessarily for greater accessibility. Rather, it’s to help reduce motion sickness, with a new feature on the way that will overlay animated dots on the screen corresponding to the motion of the vehicle.

Using the sensors inside an iPhone and iPad, Apple’s Vehicle Motion Cues will animate dots moving left, right, and forward to help reduce sensory conflict, and potentially cut back on motion sickness for passengers using their devices.
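Apple hasn’t detailed how Vehicle Motion Cues turns sensor readings into those dots, but the raw motion data is already available to developers through Core Motion. As a rough, hypothetical sketch of reading that signal (not Apple’s implementation):

```swift
import CoreMotion

// Conceptual sketch: reading the motion data an iPhone already exposes, the
// kind of signal Vehicle Motion Cues could draw on.
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0  // 30 samples a second
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // User acceleration strips out gravity, leaving movement such as
        // braking, cornering, and accelerating (plus any hand shake).
        let accel = motion.userAcceleration
        print("x: \(accel.x), y: \(accel.y), z: \(accel.z)")
        // An app could map sustained acceleration along these axes to how far
        // on-screen dots drift, depending on how the device is being held.
    }
}
```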

Many of these features are on the way later this year, and while there’s no release date yet, we suspect we’ll hear more about these and other announcements at WWDC, which is just around the corner.
