The AR/VR world never has a dull moment, and thanks to the gesture recognition technology of Swedish computer vision startup ManoMotion, we may soon see our actual hands interact with 3D objects through our smartphone cameras.
Using ManoMotion’s recently released (June 1st, 2017) software development kit (SDK), developers can now add hand tracking to their applications, allowing users to swipe, click, tap, and grab objects in 3D space with “real-life intuitiveness.” The best part: no extra hardware needed.
The best AR/VR/MR glasses have one thing in common: they allow users to engage with digital content in the most immersive way imaginable. From the Microsoft HoloLens to the Atheer AiR Glasses, the industry’s potential is limited only by a developer’s imagination. Capable of tracking the 27 degrees of freedom (DOF) of the human hand, ManoMotion’s patent-pending technology is now available to developers, who simply download the SDK from the company’s website.
Integrating ManoMotion’s intuitive hand-gesture tracking technology into VR, AR, or MR applications or IoT products makes the following possible:
1. People seeing their actual hands and moving objects in VR/AR/MR space
2. Objects being manipulated with the right hand or the left hand
3. The movement or the pose of the hand being determined
4. Dynamic gestures, such as swipes and clicks, being understood for the manipulation of menus, displays, etc.
5. A set of predefined gestures, such as point, push, pinch, swipe and grab, being accessed and utilized.
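To make the capabilities above concrete, here is a minimal, entirely hypothetical sketch of how an app might route the predefined gestures (capability 5) to its own actions, with the acting hand (capability 2) passed along. These class and method names are illustrative assumptions for the sketch, not ManoMotion’s actual SDK API.

```python
# Hypothetical gesture-dispatch layer on top of a gesture-recognition SDK.
# All names here (Gesture, GestureController) are illustrative assumptions,
# not part of ManoMotion's real API.

from enum import Enum, auto

class Gesture(Enum):
    # The predefined gestures listed above.
    POINT = auto()
    PUSH = auto()
    PINCH = auto()
    SWIPE = auto()
    GRAB = auto()

class GestureController:
    """Maps recognized gestures to app-level actions (menus, object grabs)."""

    def __init__(self):
        self._handlers = {}

    def on(self, gesture, handler):
        # Register a callback for a gesture; multiple handlers are allowed.
        self._handlers.setdefault(gesture, []).append(handler)

    def dispatch(self, gesture, hand="right"):
        # `hand` records which hand performed the gesture ("left"/"right"),
        # mirroring capability 2 above. Returns each handler's result.
        return [handler(hand) for handler in self._handlers.get(gesture, [])]

controller = GestureController()
controller.on(Gesture.SWIPE, lambda hand: f"menu scrolled by {hand} hand")
controller.on(Gesture.GRAB, lambda hand: f"object grabbed with {hand} hand")

print(controller.dispatch(Gesture.SWIPE, hand="left"))
print(controller.dispatch(Gesture.GRAB))
```

In a real integration, the SDK’s per-frame tracking output would drive `dispatch` instead of the manual calls shown here.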
According to co-founder and CEO Daniel Carlman, ManoMotion’s entire mission is to make technology more interactive. “We can understand the dynamic gesture,” explains Carlman. “How much you’re grabbing or pushing something. And depth.” Using the camera already built into a smartphone, laptop, or tablet, ManoMotion’s technology tracks bare hands in real-time AR/VR apps without the need for a controller.
Social Media Integration
With hopes of integrating its technology into the augmented reality platforms coming to Facebook and Snapchat, ManoMotion aims to spearhead the gesture-based, controller-free movement alongside the social media giants. The hand tracking is already low-latency (less than 10 ms of lag on iOS; just over 17 ms on a Galaxy S6), and the newer the device, the lower the latency.
So what makes ManoMotion’s SDK so attractive to developers? For one, the gesture-based technology doesn’t require extra hardware or rely on third-party integration. Secondly, the hand tracking has a small footprint: it doesn’t hog the CPU, compromise memory, or drain the battery.
As a recent New Atlas article put it perfectly, “to be completely intuitive, tracking tech needs to be so small that it gets out of the way and allows human instinct to roam uninhibited.”
ManoMotion is a computer vision software company founded in 2015. Based in Stockholm, Sweden, with a sales and marketing office in Palo Alto, California, ManoMotion has a long-term vision of bringing unparalleled intuition to human-machine interaction through gesture technology. The company has developed a core technology framework that achieves precise hand tracking and gesture recognition in 3D space using only a 2D camera – available on any smart device!