Users Can Now Navigate iPads and iPhones Using Just Their Eye Movements

Apple has announced a range of accessibility features coming to iPhones and iPads later this year, but what caught our eye is the new Eye Tracking functionality.

This technology uses the front-facing camera to track eye movements and translate them into on-screen controls. After a simple calibration process, users can navigate apps, select items, and perform gestures solely with their eyes. 

Apple says it prioritizes user privacy with Eye Tracking: all data processing happens entirely on the device, so user information stays there. The feature is said to work seamlessly across existing iPadOS and iOS apps, with no additional equipment needed.
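
Because Eye Tracking is handled by the system’s accessibility layer and works with existing apps, developers shouldn’t need to write anything Eye Tracking–specific. As a rough, hypothetical sketch (the view and labels below are made up), a screen built from standard SwiftUI controls already exposes the accessibility elements that assistive features like this can target:

    import SwiftUI

    // Hypothetical example: nothing here is specific to Eye Tracking.
    // Standard controls expose accessibility elements that system
    // assistive features can select and activate.
    struct PlaylistView: View {
        @State private var isShuffling = false

        var body: some View {
            List {
                Toggle("Shuffle", isOn: $isShuffling)
                Button("Play") {
                    // Start playback (placeholder).
                }
                .accessibilityLabel("Play playlist") // Clearer label for assistive features.
            }
        }
    }

The takeaway is simply that apps already using standard, accessible controls should inherit system features like this without extra work.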

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

Beyond Eye Tracking, Apple announced several other accessibility features:

  • Music Haptics: Designed for users who are deaf or hard of hearing, this feature uses the Taptic Engine in iPhones to translate music into vibrations (a rough conceptual sketch follows this list).
  • Vocal Shortcuts: Users can create custom voice commands to trigger specific actions on their devices.
  • Vehicle Motion Cues: Arguably the coolest of the additional features, it aims to reduce motion sickness by displaying animated dots on the screen that react to the vehicle’s movement.

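For those curious about the mechanics of Music Haptics, Apple hasn’t published implementation details, but the general idea of driving the Taptic Engine from audio can be sketched with the public Core Haptics framework. The class and loudness values below are hypothetical; this is an illustration of the concept, not Apple’s implementation:

    import CoreHaptics

    // Rough sketch: map per-beat loudness samples (hypothetical input)
    // to transient haptic taps on the Taptic Engine via Core Haptics.
    final class MusicHapticsSketch {
        private var engine: CHHapticEngine?

        init() throws {
            // Haptics require supporting hardware (e.g. an iPhone with a Taptic Engine).
            guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
            engine = try CHHapticEngine()
            try engine?.start()
        }

        // Plays one tap per sample, 100 ms apart, scaled by loudness (0...1).
        func play(loudnessSamples: [Float]) throws {
            guard let engine else { return }
            let events = loudnessSamples.enumerated().map { index, loudness in
                CHHapticEvent(
                    eventType: .hapticTransient,
                    parameters: [
                        CHHapticEventParameter(parameterID: .hapticIntensity, value: loudness),
                        CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
                    ],
                    relativeTime: Double(index) * 0.1
                )
            }
            let pattern = try CHHapticPattern(events: events, parameters: [])
            try engine.makePlayer(with: pattern).start(atTime: 0)
        }
    }

The real feature is presumably far more sophisticated, but this shows the kind of Taptic Engine control that is already available to apps.
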
Notably, visionOS will receive systemwide Live Captions. Meanwhile, CarPlay is getting Voice Control, Color Filters, and Sound Recognition.

More here.
