Owners of the Apple Vision Pro will soon have the option of scrolling through apps using their eyes, without lifting a finger.
The Apple Vision Pro relies primarily on hand gestures for control, with users selecting items and making changes through small finger movements. Apple now appears to be working on ways to handle some of those interactions without any gestures at all.
According to a Bloomberg report on Wednesday, Apple is working on a feature that builds on the Apple Vision Pro's existing eye-tracking functionality. Allegedly being tested for possible inclusion in visionOS 3, it would let users scroll through an app simply by moving their eyes.
The Apple Vision Pro already uses eye tracking to determine what a user is looking at, with a pinching hand gesture used to select the item in focus. Eye-based scrolling seems like a natural progression of that functionality, and could be a boon for users who don't want to keep raising and lowering their hands to interact with an app.
Sources cited in the report say Apple will make the functionality available across its own app collection. Developers will also be able to use the feature in their visionOS apps.
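Apple hasn't said how developers would adopt the feature, but the basic idea is easy to picture: when a gaze lingers near the top or bottom of a scrollable area, the content nudges in that direction. The sketch below is a purely hypothetical illustration of that logic in plain Swift, not Apple's API, with the normalized gaze position treated as an assumed input.

```swift
import Foundation

/// Conceptual sketch of edge-based gaze scrolling.
/// `gazeY` is a normalized vertical gaze position within the scroll
/// region (0.0 = top edge, 1.0 = bottom edge) — a hypothetical input,
/// not a value provided by any current visionOS API.
struct GazeScroller {
    /// Fraction of the view height near each edge that triggers scrolling.
    var edgeZone: Double = 0.15
    /// Maximum scroll speed in points per second.
    var maxSpeed: Double = 600

    /// Returns a scroll velocity in points per second; positive scrolls content down.
    func velocity(forGazeY gazeY: Double) -> Double {
        if gazeY < edgeZone {
            // Looking near the top: scroll up, faster the closer the gaze is to the edge.
            return -maxSpeed * (1 - gazeY / edgeZone)
        } else if gazeY > 1 - edgeZone {
            // Looking near the bottom: scroll down.
            return maxSpeed * ((gazeY - (1 - edgeZone)) / edgeZone)
        }
        return 0
    }
}

// Example: a gaze resting 5% from the bottom edge scrolls at about two-thirds of max speed.
let scroller = GazeScroller()
print(scroller.velocity(forGazeY: 0.95))  // ≈ 400
```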
The Apple Vision Pro is not the only device with eye-tracking functions. In June 2024, Apple introduced eye-tracking features to iOS 18 and iPadOS 18 as an accessibility feature, using the front-facing camera.
In that iteration, Dwell Control automatically selects an item once the user has rested their gaze on a selectable element for a set period. Smoothing and Snap-to-Item settings are also configurable to help with hands-free navigation.
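For a rough sense of how dwell-based selection works, here is a minimal sketch in the spirit of Dwell Control: an element counts as selected once the gaze has stayed on it for a configurable interval. The element identifiers and update loop are illustrative assumptions, not Apple's implementation.

```swift
import Foundation

/// Conceptual sketch of dwell-based selection: an element is "selected"
/// once the gaze has rested on it for `dwellInterval` seconds.
final class DwellSelector {
    /// How long the gaze must rest on one element before it is selected.
    let dwellInterval: TimeInterval
    private var currentElement: String?
    private var dwellStart: Date?

    init(dwellInterval: TimeInterval = 1.0) {
        self.dwellInterval = dwellInterval
    }

    /// Feed the element currently under the user's gaze (or nil for none).
    /// Returns the element's identifier when the dwell threshold is reached.
    func update(gazeOn element: String?, at now: Date = Date()) -> String? {
        guard element == currentElement else {
            // Gaze moved to a different element: restart the dwell timer.
            currentElement = element
            dwellStart = element == nil ? nil : now
            return nil
        }
        guard let start = dwellStart, let element else { return nil }
        if now.timeIntervalSince(start) >= dwellInterval {
            dwellStart = nil  // Avoid re-selecting until the gaze moves away.
            return element
        }
        return nil
    }
}

// Example: after about a second of steady gaze, "Open Photos" is selected.
let selector = DwellSelector()
let t0 = Date()
_ = selector.update(gazeOn: "Open Photos", at: t0)
print(selector.update(gazeOn: "Open Photos", at: t0.addingTimeInterval(1.2)) ?? "not yet")
```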
The first opportunity for the feature to make an appearance would be WWDC, with the keynote set to air on June 9.