The Oculus Rift engages your head, but what engages your body? The trouble with such an immersive VR headset is that conventional input methods fall flat. Who wants to explore virtual reality with their eyes while their hands stay behind in the real world, tethered to a mouse and keyboard? It’s like going on a safari where you’re not allowed to leave the Jeep. In other words, no fun. At Iris, we continue to iterate through input paradigms until we find the perfect way to navigate a 3D space without assuming prior experience with 3D navigation, such as video games. It’s been a challenge to find a device that is simple to use yet capable enough to integrate into the full functionality of our software; however, we may have found a happy medium in the Myo armband by Thalmic Labs. The Myo fits around your forearm just below the elbow and uses electromyographic (EMG) sensors to detect what your muscles are doing (and, by extension, what your fingers are doing). Check out a demonstration here.
The Myo can detect five gestures by default: a fist, fingers spread, a twist, and a wave in either direction. In addition to gesture detection, it streams orientation and motion data from a built-in inertial measurement unit, allowing us to measure the angle of the user’s arm and how fast it’s moving in any direction.
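To make that orientation data useful in an app, the quaternion the armband reports has to be turned into arm angles. Here is a minimal sketch of that conversion; the function name and the assumption that the device reports a unit quaternion are ours, not the Myo SDK’s.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit orientation quaternion (as an armband's IMU might
    report) into roll, pitch, and yaw angles in radians."""
    # Roll: rotation about the x-axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: rotation about the y-axis (clamped to avoid asin domain errors)
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # Yaw: rotation about the z-axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```

Sampling these angles each frame (and differencing them over time) gives both the arm’s attitude and how fast it is turning.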
During a recent hack day at the Iris offices, we integrated the Myo into an architectural demo by creating a “hand” in VR that lets the user move and rotate furniture. The fist gesture triggers a grab action that binds the grabbed furniture to the rotational and positional changes of the user’s arm until they let go. See the prototype in action here.
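The grab-and-bind logic above can be sketched in a few lines: on a fist, record the object’s offset from the hand; while the fist is held, move the object with the arm; on release, leave it where it is. The class and method names here are illustrative (rotation is omitted for brevity), not our actual demo code or the Myo SDK.

```python
class Furniture:
    """A movable object with a 3-D position, stored as a tuple."""
    def __init__(self, position):
        self.position = position

class GrabController:
    """Binds an object to the hand while a 'fist' pose is held."""
    def __init__(self):
        self.held_object = None
        self.offset = None  # object position relative to the hand at grab time

    def on_pose(self, pose, hand_position, nearest_object):
        if pose == "fist" and self.held_object is None:
            # Grab: remember where the object sits relative to the hand.
            self.held_object = nearest_object
            self.offset = tuple(o - h for o, h in
                                zip(nearest_object.position, hand_position))
        elif pose != "fist" and self.held_object is not None:
            # Release: the object stays wherever the arm left it.
            self.held_object = None
            self.offset = None

    def update(self, hand_position):
        # While grabbed, the object follows the arm's movement each frame.
        if self.held_object is not None:
            self.held_object.position = tuple(h + o for h, o in
                                              zip(hand_position, self.offset))
```

For example, grabbing a chair with the hand at the origin and then moving the arm half a meter drags the chair by the same half meter, offset preserved.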
Since creating the furniture prototype, we’ve been brainstorming other ways to apply the Myo to our experiences, including training our own gestures and using it as a full navigation tool. While there may not be a dedicated VR input solution yet, the Myo is a serious contender, and we’re looking forward to seeing how the possibilities expand as the platform matures.