Why Hand Tracking Could Be the Best Feature of Apple’s AR/VR Headset


  • Apple’s upcoming AR device will use hand- and eye-tracking instead of requiring controllers.
  • Hand tracking allows for much more intuitive, natural interactions.
  • Humans are already trained to point, wave, and so on.




Imagine not being able to touch or manipulate anything directly and instead having to interact with the world by pressing buttons and wiggling your thumb to control everything. That’s AR and VR without hand tracking.


Apple is reportedly working on adding hand and eye tracking to its upcoming augmented reality/virtual reality (AR/VR) headset. That would set it apart from existing headsets, which require the user to grab a controller and interact with the virtual world as if they were using a game console. That's not a problem for VR games, but when it comes to overlaying virtual data and objects onto the real world, it's a huge hindrance. Not only that, but hand tracking can do a lot that game controllers cannot, just like using your hands in the real world.


“Let’s say you are watching a movie on an HMD [head-mounted display], and you want to fast forward. It is a lot easier to do that with a flick of a wrist than looking around to figure out where to ‘click.’ Your remote that loves getting lost in the couch is now built into your hands,” Eric Alexander, founder and CEO of Soundscape VR, told Lifewire via email.




Apple Advantage

Apple doesn’t tend to come up with entirely new products; it just does existing products better than anyone before. The portable computer was more of a luggable machine until Apple’s PowerBook helped popularize the clamshell laptop layout we still use today. Likewise, there were touchscreen phones before the iPhone, tablets before the iPad, MP3 players before the iPod, and so on.



All of which is to say that any headset from Apple should be a big leap over currently available headsets, and part of that leap, according to Apple reporter Mark Gurman’s inside sources, will be the control system. The headset will have an array of external cameras to track the user’s hands as well as the outside world, plus internal sensors to track their eyes.


Together, these should make for intuitive interactions based on familiar human actions like reaching, grabbing, and looking at things.


“To make XR [extended reality] feel seamless and real, the most intuitive way is to use our oldest and most intuitive controller, the human hand. A hand allows users to interact with virtual environments using gestures that come naturally, making XR more immersive and engaging. Users can perform tasks and manipulate objects in a way that feels familiar to them, including moving, rotating, and scaling. When users are able to accurately and easily manipulate virtual, matterless objects, it can significantly enhance the overall experience or interaction with computers,” Damir First, CCO and cofounder of AR company Matterless, told Lifewire via email.
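Developers don’t have to wait for the headset to try this kind of interaction. As a rough sketch only (this is not Apple’s unannounced headset control system), the example below uses Apple’s existing Vision framework, which can already detect hand poses in a camera frame, to recognize a simple pinch gesture; the single-hand limit, confidence cutoff, and 0.05 distance threshold are arbitrary choices made for illustration.

```swift
import Foundation
import Vision
import CoreVideo

/// Detects a simple "pinch" gesture (thumb tip close to index-finger tip)
/// in a single camera frame using Vision's hand-pose request.
/// Illustrative sketch only; thresholds are arbitrary.
func detectPinch(in pixelBuffer: CVPixelBuffer) -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1  // track only one hand for simplicity

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .up,
                                        options: [:])
    do {
        try handler.perform([request])
        guard let hand = request.results?.first else { return false }

        // Vision reports joint locations in normalized image coordinates (0...1).
        let thumbTip = try hand.recognizedPoint(.thumbTip)
        let indexTip = try hand.recognizedPoint(.indexTip)

        // Ignore low-confidence detections.
        guard thumbTip.confidence > 0.5, indexTip.confidence > 0.5 else { return false }

        // Call it a pinch when the two fingertips are very close together.
        let distance = hypot(thumbTip.location.x - indexTip.location.x,
                             thumbTip.location.y - indexTip.location.y)
        return distance < 0.05  // arbitrary threshold for this example
    } catch {
        return false
    }
}
```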


One of Apple’s strengths is user interaction. Even today, Mac trackpads are still the best around, and the iPhone and iPad’s gesture-based touch interface (copied by pretty much every other phone maker) is phenomenal. Watch anyone using a phone the next time you’re on the metro, and you’ll see that people don’t even think about their interactions. One imagines that any AR gesture-based interface will be similarly natural; this is the kind of thing Apple is very good at.






Off Hand

But hand tracking can’t do everything. Just as you wouldn’t want to replace your car’s steering wheel with hand-wavy gestures, for gaming, sometimes nothing beats a dedicated game controller.


“Tactile buttons, triggers, and joysticks are incredibly useful for many actions where you need a fast response time, and haptic feedback from controllers is another element of immersion. With hand tracking, you lose all of that,” says Alexander.


The same goes for typing. “This is useful for portability, but no serious typist is going to abandon the ergonomics of a traditional keyboard for typing in the air. Generally, hand tracking technologies are a trade-off; they offer increased portability,” says Alexander.


But that’s fine. We don’t have to give up any of that, and meanwhile, hand tracking can enable entirely new interactions. New for computers, anyway, but old for humans. Much of our communication is nonverbal: we point, give a thumbs up, or communicate via sign language. Hand tracking could incorporate all of that into AR, so we wouldn’t even have to learn anything new, which would make it far more accessible for everyone.


