A new method of gesture tracking relies on a smartwatch’s accelerometer

Kyle Wiggers
Digital Trends

Short of yelling a command, tapping a screen, rotating a dial, pressing a button, or flicking your wrist, there are not many ways to interact with the average smartwatch. Sure, you could gesticulate wildly until you managed to send a text message, but that method is subject to failure, ridicule, and usually both. Luckily, scientists have discovered a novel means of measuring smartwatch gestures accurately and, perhaps more importantly, with hardware smartwatches already contain — an accelerometer.

Researchers at Carnegie Mellon University managed to boost a motion-tracking accelerometer’s sample rate to 4kHz from the standard 100Hz, an alteration that dramatically increased its gesture-tracking precision. The next step, programming gestures, was comparatively trivial — the scientists settled on 18 unique hand, wrist, and arm motions that could perform actions like playing music on nearby speakers, switching on a computer, and more. One particularly impressive example turned on the lights in a room with a snap. Another switched TV channels with the wave of a hand.
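The intuition behind the sample-rate boost can be sketched in a few lines of code. The snippet below is a hypothetical illustration, not the researchers' actual pipeline: at 4 kHz, sharp transients like a finger snap leave energy in frequency bands that a 100 Hz stream (with a 50 Hz Nyquist limit) could never capture, so even a simple nearest-centroid classifier over coarse spectral features can tell a snap from a slow wave. All signal and gesture names here are invented for the demo.

```python
import numpy as np

FS = 4000  # assumed 4 kHz accelerometer sample rate, vs. the standard ~100 Hz

def spectral_features(window, n_bands=16):
    """Summarize an accelerometer window as normalized FFT band energies.

    At 4 kHz the usable spectrum extends to 2 kHz, so the micro-vibrations
    of snaps and taps land in bands a 100 Hz stream cannot represent.
    """
    mags = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    bands = np.array_split(mags, n_bands)
    feats = np.array([b.mean() for b in bands])
    return feats / (feats.sum() + 1e-9)  # normalize to unit total energy

def nearest_centroid(feats, centroids):
    """Pick the gesture whose template features are closest in Euclidean distance."""
    dists = {name: np.linalg.norm(feats - c) for name, c in centroids.items()}
    return min(dists, key=dists.get)

# Synthetic stand-ins: a snap is a sharp, ringing transient; a wave is slow motion.
t = np.arange(FS) / FS
snap = np.sin(2 * np.pi * 900 * t) * np.exp(-30 * t)
wave = np.sin(2 * np.pi * 3 * t)

centroids = {"snap": spectral_features(snap), "wave": spectral_features(wave)}
probe = snap + 0.05 * np.random.default_rng(0).normal(size=FS)
print(nearest_centroid(spectral_features(probe), centroids))
```

Even with added noise, the probe's high-frequency band energy keeps it far closer to the snap template than to the wave — a separation that simply is not visible at 100 Hz, where both gestures would alias into the same low-frequency bands.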


The researchers demonstrated more than just gesture tracking. The super-sensitive accelerometer was able to identify grasped objects — basically, whatever the wearer happened to be holding. The possibilities of the tech are practically boundless, the team said; a smartwatch application could provide guitar-tuning guidance based on vibrations from the hand turning the pegs, for instance. A recipe application could show a progress bar for how long the wearer should beat eggs. A paintball gun app could display the amount of remaining ammo.

Arguably even more impressive were the tech’s acoustic applications. Miniature structures programmed with unique sequences of vibrations — so-called “acoustic tags” — could deliver data. A driver’s license fitted with an acoustic tag could carry biometric data read by the smartwatch’s accelerometer when the two make contact, the researchers said.
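As a rough sketch of how a vibration sequence could carry data, consider simple on/off keying: the tag vibrates at a fixed frequency during "1" bit slots and stays silent during "0" slots, and the watch recovers the bits by measuring vibration energy in each slot. The carrier frequency, bit duration, and encoding below are illustrative assumptions, not details from the CMU work.

```python
import numpy as np

FS = 4000          # assumed 4 kHz accelerometer sample rate
CARRIER_HZ = 500   # hypothetical tag vibration frequency
BIT_SAMPLES = 400  # 0.1 s per bit slot

def encode_tag(bits):
    """Simulate an acoustic tag as on/off-keyed vibration bursts."""
    t = np.arange(BIT_SAMPLES) / FS
    burst = np.sin(2 * np.pi * CARRIER_HZ * t)
    return np.concatenate([burst * b for b in bits])

def decode_tag(signal, n_bits):
    """Recover bits by comparing per-slot vibration energy to a threshold."""
    chunks = signal.reshape(n_bits, BIT_SAMPLES)
    energies = (chunks ** 2).mean(axis=1)
    threshold = energies.max() / 2  # assumes at least one "1" bit is present
    return [int(e > threshold) for e in energies]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = encode_tag(bits) + 0.1 * np.random.default_rng(1).normal(size=len(bits) * BIT_SAMPLES)
print(decode_tag(noisy, len(bits)))  # → [1, 0, 1, 1, 0, 0, 1, 0]
```

A real tag would likely add a preamble for synchronization and error checking, but the core idea is the same: mechanical vibration becomes a data channel only because the accelerometer samples fast enough to resolve the carrier.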


Unfortunately, the tech on display remains relegated to the lab, for now. No smartwatch manufacturer has implemented this sort of high-fidelity tracking. But Carnegie Mellon’s research is not fundamentally dissimilar to Google’s Project Soli, a developer kit expected to be released in the coming months. Soli, too, tracks gestures with high precision — a concept smartwatch demonstrated at Google’s I/O developer conference in 2015 let wearers scroll through messages with a combination of flicks, taps, and wrist motions. Unlike the Carnegie Mellon research, though, Soli performs gesture tracking with a tiny radar — a component that smartwatches on the market generally lack.

Getting manufacturers on board with high-precision gesture tracking is likely to be the near-term sticking point, but Google and others have made do in the interim. Version 1.4 of Android Wear, Google’s smartwatch operating system, introduced wrist motions for selecting and closing applications. At its Tizen Developer Conference in San Francisco two years ago, Samsung demonstrated smartwatch motion gestures that could control music playback, raise and lower a speaker’s volume, and steer a video game character’s movement.