As physical and digital worlds combine, human gestures become the new UI for the metaverse
For decades, screens have been a barrier between the digital and physical worlds. More recently, we’ve edged into new territory as browser- and app-based camera applications have enabled pass-through AR experiences. The physical and the digital are merging, and we expect this trend will continue and likely accelerate. As we forge ahead, one key component of this new, so-called ‘phygital’ existence will be digital experiences that respond to human gestures.
Arm movements and voice commands will replace the click of a mouse or the tap of a screen. By combining gesture-based technology like body tracking and voice recognition with browser-based AR (WebAR), we can interact with digital content as naturally as we move through the real world.
As the metaverse takes shape, here are a few ways body-responsive AR is helping us get “phygital”:
Gesture Responsive WebAR
Thanks to the Geenee AR full-body and face tracking SDK, you can now use your body movements to prompt AR interactions. We debuted the technology with experiences that let you wear 3D digital garments or become your avatar in real-time WebAR. Potential applications are already taking shape in AR fashion and gaming.
Since the launch of the SDK, we’ve introduced viral social experiences, including a voxelized Snoop that mirrors the user’s movements and a particle sphere that changes based on the user’s arm gestures.
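To give a feel for how a gesture trigger works under the hood, here’s a minimal TypeScript sketch. The `onPoseFrame` and `triggerEffect` hooks are hypothetical stand-ins, not the actual Geenee SDK API; the point is the pattern: read per-frame keypoints, test a simple spatial condition, and fire an effect on the transition.

```ts
// Minimal sketch of gesture-triggered WebAR logic. The tracker and effect
// interfaces below are hypothetical -- consult your SDK docs for the real API.
interface Keypoint {
  name: string;   // e.g. "leftWrist", "leftShoulder"
  x: number;      // normalized [0, 1], origin at top-left
  y: number;
  score: number;  // detection confidence
}

// Hypothetical subscription to per-frame pose results.
declare function onPoseFrame(cb: (keypoints: Keypoint[]) => void): void;

// Hypothetical effect trigger (e.g. kick off a particle animation).
declare function triggerEffect(name: string): void;

const MIN_SCORE = 0.5;
let armWasRaised = false;

onPoseFrame((keypoints) => {
  const byName = new Map(keypoints.map((k) => [k.name, k]));
  const wrist = byName.get("leftWrist");
  const shoulder = byName.get("leftShoulder");
  if (!wrist || !shoulder) return;
  if (wrist.score < MIN_SCORE || shoulder.score < MIN_SCORE) return;

  // Screen y grows downward, so a raised wrist sits *above* the shoulder.
  const armIsRaised = wrist.y < shoulder.y;

  // Fire once on the rising edge rather than on every frame.
  if (armIsRaised && !armWasRaised) {
    triggerEffect("particle-burst");
  }
  armWasRaised = armIsRaised;
});
```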
AudioActive AR for the Metaverse
WebAR paired with sound-responsive technology is a powerful combination. To showcase this capability, we created an augmented reality pup that responds to voice commands, including “sit,” “shake,” “roll over,” and “play dead” (demo here).
Because we’re relying on the Web Speech API to power voice recognition, the dog can easily be programmed to respond to multiple languages. Wanna build your own virtual pet? You’re only minutes away. Sign up for the Geenee WebAR Builder and follow the instructions we’ve outlined here.
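Since the Web Speech API is a standard browser interface, a command loop like the one the pup uses can be sketched in a few lines. Here, `playTrick` is a hypothetical hook into your AR scene; the rest is the real browser API (Chrome ships it as `webkitSpeechRecognition`).

```ts
// Hypothetical hook that plays the matching animation in the AR scene.
declare function playTrick(name: "sit" | "shake" | "rollOver" | "playDead"): void;

const COMMANDS: Record<string, Parameters<typeof playTrick>[0]> = {
  "sit": "sit",
  "shake": "shake",
  "roll over": "rollOver",
  "play dead": "playDead",
};

const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

const recognition = new SpeechRecognitionImpl();
recognition.lang = "en-US";     // swap e.g. "de-DE" to listen in German
recognition.continuous = true;  // keep listening between commands
recognition.interimResults = false;

recognition.onresult = (event: any) => {
  const result = event.results[event.results.length - 1];
  const transcript = result[0].transcript.trim().toLowerCase();
  for (const phrase of Object.keys(COMMANDS)) {
    if (transcript.includes(phrase)) {
      playTrick(COMMANDS[phrase]);
      break;
    }
  }
};

recognition.start();
```

Swapping `recognition.lang` and the command phrases is all it takes to support another language, which is why multilingual pets come almost for free.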
Apply AudioActive AR to gameplay, and let the user control the avatar by voice. Other potential use cases include interactive marketing campaigns and live events featuring AR visuals that respond to music, sound effects, or voice.
Geenee AR Pup in Action
Full-Facial Tracking for AR Avatars
Applying full-facial tracking to WebAR is good for more than social selfie filters. It also allows your avatar to mirror your expressions and convey emotions in real time. You can even appear as your NFT avatar in video chats for work or when catching up with friends, while still emoting as you naturally would.
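In practice, expression mirroring usually comes down to copying blendshape weights onto an avatar’s morph targets each frame. Here’s a minimal sketch using three.js; the `onFaceFrame` callback and the ARKit-style blendshape names are assumptions standing in for whatever the face-tracking SDK actually emits.

```ts
import * as THREE from "three";

// Hypothetical per-frame face-tracking result: blendshape weights in [0, 1],
// keyed by ARKit-style names ("jawOpen", "mouthSmileLeft", ...). The real
// SDK callback will differ -- this only illustrates the mapping step.
type BlendshapeWeights = Record<string, number>;
declare function onFaceFrame(cb: (weights: BlendshapeWeights) => void): void;

// Bind an avatar head mesh whose morph targets share the blendshape names.
function bindAvatarFace(head: THREE.Mesh): void {
  const dict = head.morphTargetDictionary;
  const influences = head.morphTargetInfluences;
  if (!dict || !influences) return;

  onFaceFrame((weights) => {
    for (const [name, weight] of Object.entries(weights)) {
      const index = dict[name];
      if (index !== undefined) {
        influences[index] = weight; // avatar mirrors the expression
      }
    }
  });
}
```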
Facial tracking adds an extra layer of realism to virtual learning environments. In classroom settings, it can boost accessibility by enabling hearing-impaired students who read lips to participate actively. Reflecting human gestures in the metaverse can also help instructors convey a more authentic connection with their subject matter as they lecture.
“Phygital” Interactions Recorded in the Metaverse
Imagine if your “phygital” interactions could be stored on the blockchain. Perhaps the dance you choreographed with your NFT Snoop needs to be memorialized, or your AR dog learned a new trick. In a Web3 world, we may opt to keep a record of our human gestures in the metaverse, covering everything from gameplay to generative artwork, and allowing us to effect real changes to the digital assets we hold in our wallets.
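To make that concrete, here’s a speculative sketch of what “memorializing” a gesture might look like with ethers.js: hash the recorded keyframes and write the digest to a registry contract. The `GestureRegistry` contract and its `recordGesture` method are entirely hypothetical; any real deployment would define its own schema.

```ts
import { ethers } from "ethers";

// Speculative sketch: fingerprint a recorded gesture sequence and log it
// on-chain. The registry contract and its recordGesture(bytes32) method
// are hypothetical -- this only illustrates the shape of the idea.
const REGISTRY_ABI = ["function recordGesture(bytes32 gestureHash)"];
const REGISTRY_ADDRESS = "0x0000000000000000000000000000000000000000"; // placeholder

async function memorializeGesture(keyframes: number[][]): Promise<void> {
  // Hash the gesture data; only the 32-byte digest goes on-chain,
  // keeping gas costs independent of the recording's length.
  const gestureHash = ethers.keccak256(
    ethers.toUtf8Bytes(JSON.stringify(keyframes))
  );

  const provider = new ethers.BrowserProvider((window as any).ethereum);
  const signer = await provider.getSigner();
  const registry = new ethers.Contract(REGISTRY_ADDRESS, REGISTRY_ABI, signer);

  const tx = await registry.recordGesture(gestureHash);
  await tx.wait(); // confirmed: the gesture is memorialized
}
```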
That opens the door to customization entirely unique to each NFT, including generative changes driven by movement, voice, or other actions. When it comes to gesture-controlled AR, creative applications are limited only by the imagination. The metaverse is still loading, and unlocking a touchless user interface will be core to its development.
Like what you read? With Geenee, you can publish interactive augmented reality to the web within minutes. Sign up and start building today.