The haptic gloves can apparently recreate the feelings of texture, pressure, and vibration when interacting with a digital environment – although the company says it is still in the “early stages of research”.
Meta’s eventual aim is to pair the gloves with a VR headset to simulate experiences such as playing in a concert or a game of poker, and the gloves may eventually work with augmented reality glasses.
Facebook has been working on its smart eyewear, called Project Aria, for some time, but the project appears to have been delayed past its 2021 launch date.
“The value of hands to solving the interaction problem in AR and VR is immense,” said Meta Reality Labs research director Sean Keller.
“We use our hands to communicate with others, to learn about the world, and to take action within it. We can take advantage of a lifetime of motor learning if we can bring full hand presence into AR and VR.
“People could touch, feel, and manipulate virtual objects just like real objects — all without having to learn a new way of interacting with the world.”
To do this, Meta will need to combine auditory, visual, and haptic information to ‘trick’ the brain into believing the virtual world is real.
Meta says the gloves will eventually need to be “stylish, comfortable, affordable, durable, and fully customizable”. In practice, that is a difficult challenge to overcome.
The gloves are made with hundreds of tiny actuators – small motors – that work in sync, but current actuators are too big, expensive, and hot to be practical. Meta could theoretically replace them with soft actuators that change shape as the wearer moves, but these do not yet exist.
The company is currently researching and developing the technology, with an emphasis on reducing weight and increasing speed, as well as on creating software that can accurately simulate the physics of the real world.
“If I pick up a cube, I already have assumptions about the type of material it is and how heavy it might be,” Meta research scientist Jess Hartcher-O’Brien says.
“I grasp it, I verify the material, so I’m combining the visual cues about its material properties and the haptic feedback that’s coming just from that first moment of impact. When I go to manipulate the object, my brain recognizes frictional forces and inertia and can work out how dense or heavy this object is. My visual system is updating based on how it sees my arm move. Proprioception tells me where my arm is in space, how quickly it’s moving, and what my muscles are doing.”
This is also where technology such as hand-tracking comes in – something already built into the Oculus headsets – to deliver information to the correct area. In the future, it is possible that Meta could render a ‘haptic click’ with virtual buttons or ‘haptic emoji handshakes’ for meeting people users know in the metaverse.
Meta CEO Mark Zuckerberg has consistently pushed the metaverse as the future of Facebook, especially in light of a number of scandals over the harms caused by the app and others like it, such as Instagram.
Mr Zuckerberg has said that an “embodied internet” will be focused on “engag[ing] more naturally” with the behaviours we already exhibit – such as reaching for our smartphones immediately upon waking up.
Whether or not the company can effectively manage the metaverse – or at least, its part of it – remains to be seen. In a leaked memo Andrew Bosworth, Meta’s CTO, has said that the company’s products should have “almost Disney levels of safety” but that virtual reality can often be a “toxic environment” and may push “mainstream customers from the medium entirely”.
However, he also said that moderating users’ speech and behaviour “at any meaningful scale is practically impossible”.