Eye Tracking allows you to navigate iOS 18 hands-free on your iPhone


Credits: Apple

Apple’s Eye Tracking technology introduces a new way for iPhone and iPad users to interact with their devices. Unveiled in May alongside several other accessibility features coming to iOS 18 and iPadOS 18, it makes its debut in iOS 18 and shows real potential for many users.

Eye Tracking is a key functionality in the Apple Vision Pro, used to track the user’s gaze for selecting screen elements, confirmed with a finger pinch. The iOS 18 version adapts this concept for smaller screens.

This Eye Tracking feature aims to assist users with physical disabilities who have difficulty interacting with a touchscreen. It enables users to navigate iOS or iPadOS using just their eyes.

Eye Tracking determines where the user is looking on the screen without extra equipment, utilizing the front-facing camera and on-device machine learning.

Set-up and Calibration

Setting up Eye Tracking is quick, taking about a minute. To enable it, open the Settings app, go to Accessibility, then Physical and Motor, and select Eye Tracking.

Once enabled, users follow an on-screen dot with their eyes to complete the calibration. Afterward, a black dot appears on the screen, acting as a pointer that follows the user’s gaze, replacing the need for touch input.

Dwell Control and Customizing Control

Enabling Eye Tracking also activates Dwell Control, which selects the item under the pointer when you hold a steady gaze on it for a few seconds. Dwell Control can be turned off or customized in the AssistiveTouch settings.
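Apple hasn’t published how Dwell Control is implemented, but the behavior described above maps to a simple pattern: start a timer when the gaze settles, and fire a selection if it stays within a small radius for the dwell duration. Here is a minimal sketch of that idea; the function name, sample rate, and thresholds are illustrative assumptions, not Apple’s actual values.

```python
import math

def dwell_select(gaze_points, dwell_time=2.0, radius=40.0, hz=30):
    """Return the sample index at which a dwell selection fires, or None.

    gaze_points: list of (x, y) screen coordinates sampled at `hz` Hz.
    A selection fires once the gaze has stayed within `radius` pixels of
    the point where the dwell started for `dwell_time` seconds.
    (Illustrative sketch only; not Apple's implementation.)
    """
    needed = int(dwell_time * hz)  # samples required to confirm a dwell
    anchor, count = None, 0
    for i, (x, y) in enumerate(gaze_points):
        if anchor and math.hypot(x - anchor[0], y - anchor[1]) <= radius:
            count += 1
            if count >= needed:
                return i  # gaze held steady long enough: select here
        else:
            anchor, count = (x, y), 1  # gaze moved: restart the dwell timer
    return None
```

The key design point is that any large eye movement resets the timer, which is why a steady gaze is required to trigger a selection.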

To further aid users, the Smoothing setting can adjust the pointer’s sensitivity to eye movements. This prevents the cursor from moving excessively with brief eye movements and helps users focus on their desired selection more accurately.
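Apple doesn’t document how Smoothing works internally, but a common way to damp a jittery pointer is an exponential moving average over the raw gaze samples. The sketch below illustrates the trade-off the setting controls; the function name and default weight are assumptions for illustration.

```python
def smooth_gaze(samples, smoothing=0.8):
    """Exponentially smooth raw gaze samples into pointer positions.

    `smoothing` in [0, 1): higher values damp jitter more, but make the
    pointer lag further behind quick eye movements.
    (Illustrative sketch only; not Apple's implementation.)
    """
    smoothed, prev = [], None
    for x, y in samples:
        if prev is None:
            prev = (x, y)  # first sample: nothing to average yet
        else:
            # Blend the previous pointer position with the new raw sample.
            prev = (smoothing * prev[0] + (1 - smoothing) * x,
                    smoothing * prev[1] + (1 - smoothing) * y)
        smoothed.append(prev)
    return smoothed
```

With a high weight, a brief glance barely moves the pointer, which is exactly the behavior the Smoothing setting is meant to tune.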

There’s also an option called Snap to Item, which pulls the pointer toward the nearest selectable user interface element. This simplifies menu selections by reducing the need for pinpoint precision.
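Snap-to-target behavior like this is typically implemented by comparing the pointer position against the centers of nearby controls. The following is a minimal sketch under that assumption; the function name and snap distance are hypothetical, not taken from Apple.

```python
import math

def snap_to_item(gaze, targets, max_snap=80.0):
    """Snap the gaze pointer to the nearest selectable element.

    targets: list of (cx, cy) centers of on-screen controls. Returns the
    nearest center if it lies within `max_snap` pixels of the gaze point,
    otherwise the raw gaze position.
    (Illustrative sketch only; not Apple's implementation.)
    """
    if not targets:
        return gaze
    nearest = min(targets,
                  key=lambda t: math.hypot(gaze[0] - t[0], gaze[1] - t[1]))
    dist = math.hypot(gaze[0] - nearest[0], gaze[1] - nearest[1])
    return nearest if dist <= max_snap else gaze
```

Capping the snap distance matters: without it, the pointer would jump to a control even when the user is deliberately looking at empty screen space.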

Selecting the on-screen dot expands it to reveal additional options, such as Dwell Control adjustments and shortcuts to Notification Center and Control Center. This makes key parts of iOS 18 easier to reach with a control method that demands more effort than touch.

Tricky but Useful

The feature is still in its early beta phase, and its accuracy isn’t perfect, but the pointer generally aligns with where I’m looking on the display. Occasionally the cursor overshoots, but correcting it by adjusting my gaze isn’t too difficult.

The position of your head relative to the iPhone is important, as any movement from the calibrated position can affect gaze detection. Similarly, moving the iPhone can cause issues if it’s handheld.

For optimal use, position yourself about 1.5 to 2 feet away from the display during calibration. The iPhone should ideally be on a stand and kept stationary.

The concept shows promise for users with mobility issues. For others who typically use touch to interact with their iPhone, it may be less useful but remains an interesting idea.