Less pinching, more tapping, swiping, and typing.
Rudimentary hand tracking arrived for the Meta Quest in December of 2019. But Meta isn’t satisfied with its “abstract” pinch gesture—it wants you to touch objects in VR, similar to how you might touch a screen or keyboard in real life. Now, Meta is putting this idea to the test.
The Quest v50 software update, which is currently rolling out to users, offers an experimental “Direct Touch” hand-tracking feature. The idea is pretty simple: Direct Touch allows you to navigate Meta Quest apps and menus using traditional touch gestures.
So, instead of pinching at the air like a weird fleshy lobster, you can now “tap,” “swipe,” and “type” in mid-air, which makes it easier to scroll through browser pages or menus. Unfortunately, app support is extremely limited, so we’re not sure how games or other apps will utilize Direct Touch.
There’s just one problem: touchscreens are already a bit loosey-goosey and require a decent amount of coordination. Direct Touch mimics touchscreen controls but can’t provide any physical feedback. Since you aren’t actually touching anything, your hand-eye coordination needs to be pretty spot on.
Direct Touch tries to address this conundrum by outlining your hands and highlighting UI elements that you “touch.” But as The Verge notes, the experience is a bit awkward and somewhat buggy. Also, unsurprisingly, your arms get tired pretty quickly.
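For the curious, here’s a rough idea of what that highlight-plus-threshold approach looks like under the hood. The sketch below is purely illustrative Python, not Meta’s code (which isn’t public; real Quest apps would typically build on Meta’s Interaction SDK instead). Every name and distance value in it is a hypothetical stand-in, showing how a direct-touch button can substitute hover highlighting and a press threshold for the physical feedback a touchscreen provides.

```python
from dataclasses import dataclass

# Illustrative sketch only -- not Meta's Direct Touch implementation.
# All names and threshold values below are hypothetical.

@dataclass
class Panel:
    """A flat UI panel: a point on its surface plus a unit normal facing the user."""
    origin: tuple[float, float, float]
    normal: tuple[float, float, float]

HOVER_MARGIN = 0.03    # meters in front of the surface where highlighting kicks in
PRESS_DEPTH = 0.005    # meters past the surface needed to register a "tap"
RELEASE_DEPTH = 0.015  # meters of retreat needed to end the press (hysteresis)

def signed_distance(panel: Panel, fingertip: tuple[float, float, float]) -> float:
    """Fingertip-to-plane distance; goes negative once the finger pokes through."""
    return sum((f - o) * n for f, o, n in zip(fingertip, panel.origin, panel.normal))

class DirectTouchButton:
    def __init__(self, panel: Panel):
        self.panel = panel
        self.pressed = False

    def update(self, fingertip: tuple[float, float, float]) -> str:
        """Call once per tracking frame with the index fingertip's position."""
        d = signed_distance(self.panel, fingertip)
        if not self.pressed and d < -PRESS_DEPTH:
            self.pressed = True
            return "tap"       # finger pushed past the surface
        if self.pressed and d > RELEASE_DEPTH:
            self.pressed = False
            return "release"   # finger pulled back out
        if not self.pressed and d < HOVER_MARGIN:
            return "hover"     # highlight the element -- the only feedback available
        return "idle"

# Example: a panel half a meter in front of the user, facing them along +z.
button = DirectTouchButton(Panel(origin=(0.0, 1.2, -0.5), normal=(0.0, 0.0, 1.0)))
print(button.update((0.0, 1.2, -0.48)))  # "hover" -- 2 cm shy of the surface
print(button.update((0.0, 1.2, -0.51)))  # "tap" -- 1 cm past it
```

The asymmetric press and release thresholds keep a trembling fingertip from rapid-firing taps, and the hover state drives exactly the kind of outline-and-highlight cue described above.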
You can enable Direct Touch from the Experimental Settings tab after updating to Quest v50. Note that this feature is only available on the Quest 2 and Quest Pro headsets.
Source: Meta