At the Designer of Things conference in San Francisco last week, I had the opportunity to chat with Chris Goodine, developer evangelist for Thalmic Labs, about the company's innovative wearable control device, Myo.
Myo is an armband that monitors electrical signals in the wearer's arm muscles and, together with an onboard motion sensor, tracks roll, pitch, and yaw (rotation about the x, y, and z axes) to follow motion in three dimensions. Thalmic Labs started shipping Myo to developers in July and plans to begin deliveries to other customers shortly.
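Two of those three angles can be estimated from a three-axis accelerometer alone, by treating gravity as a reference direction; yaw needs a gyroscope or magnetometer. A minimal sketch of the idea, not Thalmic's actual firmware:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch (in radians) from the gravity vector
    as measured by a 3-axis accelerometer. Yaw cannot be recovered
    from gravity alone, since rotating about the vertical axis
    leaves the gravity reading unchanged."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

# Arm at rest, gravity entirely along z: both angles are zero.
roll, pitch = roll_pitch_from_accel(0.0, 0.0, 1.0)
print(roll, pitch)
```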
Myo, like Apple's iPhone, Leap Motion, and Microsoft Kinect, represents an effort to redefine how we interact with computers in the mobile era. The mouse just doesn't work when you're on the move. Touch and voice interaction have filled the gap, but gestures have potential too, particularly when conditions inhibit touch or voice -- wet, noisy, or dirty environments, for example.
"We see this opening up a number of applications that just aren't possible with current technologies," said Goodine.
Yet the initial use cases being contemplated -- advancing slides during a presentation and controlling media on a television -- can already be handled with current technologies. Presenters have long made do with handheld remotes on stage.
Dealing with media being sent from a laptop or tablet to a TV presents a more complicated scenario. Myo's gesture tracking offers some advantages as a way to interact with a computing device that's across the room, but those advantages may not be compelling enough to displace traditional handheld remotes. Traditional remotes have their own advantages, like familiarity and months of battery life; Myo can run for two days on a charge, though it may last longer because it powers down when not in use.
"It's pretty natural," Goodine said. "The device itself can understand large movements through the motion sensor. And using the muscle sensor, it can actually understand hand positions."
At a developer-oriented presentation, Goodine used Myo to advance projected slides with a hand gesture.
During the presentation, attendees wrote a script in the Lua programming language to map gestures read by Myo to keyboard commands on a computer connected through Bluetooth. This allowed participants to play Tetris on the computer using hand movements.
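The workshop script itself was written in Lua against Myo's scripting interface, but the pattern is simply a dispatch table from recognized poses to key events. A sketch of that pattern in Python -- the pose names and the `send_key` stand-in are illustrative, not Thalmic's actual API:

```python
# Hypothetical pose names and key-event sender; Myo's real scripting
# interface is Lua-based and its identifiers may differ.
POSE_TO_KEY = {
    "wave_left":      "left_arrow",   # move the Tetris piece left
    "wave_right":     "right_arrow",  # move the piece right
    "fist":           "down_arrow",   # soft drop
    "fingers_spread": "up_arrow",     # rotate the piece
}

def send_key(key):
    # Stand-in for OS-level key injection over the Bluetooth link.
    print(f"key pressed: {key}")

def on_pose(pose):
    """Called whenever the armband reports a newly recognized pose;
    unmapped poses are ignored rather than raising an error."""
    key = POSE_TO_KEY.get(pose)
    if key is not None:
        send_key(key)

on_pose("fist")  # -> key pressed: down_arrow
```

Keeping the mapping in a plain table makes it easy to rebind gestures per application, which is essentially what the workshop attendees were doing.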
The results were only partially satisfying. Unlike a mouse click or mouse movement, gestures can be ambiguous. Myo sometimes failed to register specific gestures, or I failed to move in the way Myo expected. Either way, there were times when Myo didn't get the message.
This is a problem Thalmic Labs will have to overcome to give Myo a shot at becoming a mass-market input device. An input device that's even occasionally unreliable will frustrate users.
There's an additional issue: Myo provides orientation and acceleration vector data, but not the position of the wearer's arm relative to other objects. It can't, for example, tell you whether you're pointing at a specific object. A system like Microsoft's Kinect or Google's Project Tango, which maps multiple objects in 3D space, can, making it suitable for a wider range of applications.
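The limitation is fundamental: recovering position from acceleration means integrating twice, so even a tiny, constant sensor bias produces a position error that grows quadratically with time. A quick illustration with synthetic numbers, not Myo data:

```python
def position_error(bias, dt, steps):
    """Double-integrate a constant accelerometer bias (m/s^2) at a
    fixed sample interval dt (s) to show how the inferred position
    drifts even when the device is perfectly still."""
    v = 0.0  # accumulated velocity error (m/s)
    p = 0.0  # accumulated position error (m)
    for _ in range(steps):
        v += bias * dt
        p += v * dt
    return p

# A 0.01 m/s^2 bias, sampled at 100 Hz for 10 seconds: the inferred
# position is already off by roughly half a meter.
print(position_error(0.01, 0.01, 1000))  # ~0.5
```

This is why absolute pointing requires an external reference -- a depth camera mapping the room, as Kinect and Project Tango do -- rather than inertial sensing alone.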
The major downside of systems like Kinect, said Goodine, is the lack of mobility. "Your digital content is coming everywhere you go, and if you want to have gesture control, which we think is one of the more natural, intuitive ways of interacting, [you need to be mobile]," he said.
Goodine says Myo has potential for heads-up displays, augmented and virtual reality systems, and environments with Internet-connected devices. Undoubtedly the device does have potential. Now it's just a matter of identifying what Myo does better than anything else.