TapType: AI-assisted hand motion tracking using only accelerometers

The team at the Sensing, Interaction & Perception Lab at ETH Zurich, Switzerland has come up with TapType, an interesting text input method that relies purely on a pair of wrist-worn devices, which detect acceleration values when the wearer types on any old surface. By feeding the acceleration values from a pair of sensors on each wrist into a Bayesian neural network classifier, which in turn feeds a traditional probabilistic language model (predictive text, to you and me), the resulting text can be entered at up to 19 WPM with an average error rate of 0.6%. Expert TapTypers report speeds of up to 25 WPM, which could be quite usable.
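The details of the decoding stage are in the paper, but the basic idea is easy to sketch. In the toy Python below everything is assumed rather than taken from the paper: the classifier outputs a probability for which finger tapped, each finger is ambiguous between a group of keys, and a character bigram model acts as the prior. Multiplying the two and taking the best candidate is the Bayesian step in miniature:

```python
# Minimal sketch of the decoding idea: combine per-tap classifier
# probabilities with a character language model via Bayes' rule.
# All numbers and the finger-to-key mapping are made up for illustration;
# TapType's actual classifier and language model are described in the paper.

# Hypothetical mapping: each finger is ambiguous between several keys,
# like the key groups of one hand on a QWERTY layout.
FINGER_KEYS = {
    "index":  ["f", "g", "r", "t", "v", "b"],
    "middle": ["d", "e", "c"],
    "ring":   ["s", "w", "x"],
    "pinky":  ["a", "q", "z"],
}

def decode_tap(finger_probs, prev_char, bigram):
    """Return the most probable character for one tap.

    finger_probs: classifier output, P(finger | acceleration window)
    bigram:       language model, P(char | previous char)
    """
    best_char, best_score = None, 0.0
    for finger, p_finger in finger_probs.items():
        for char in FINGER_KEYS[finger]:
            # Keys within a finger group are treated as equally likely
            # given the finger, a simplifying assumption.
            p_key_given_finger = 1.0 / len(FINGER_KEYS[finger])
            score = p_finger * p_key_given_finger * bigram(prev_char, char)
            if score > best_score:
                best_char, best_score = char, score
    return best_char

# Toy bigram model: after "t", "e" is likely ("te" as in "the", "ten", ...).
toy_bigram = lambda prev, c: {("t", "e"): 0.3}.get((prev, c), 0.01)

# The classifier is fairly sure the middle finger tapped:
print(decode_tap({"middle": 0.7, "index": 0.3}, prev_char="t",
                 bigram=toy_bigram))  # -> "e"
```

A full predictive-text stage would rescore whole words rather than single characters, but the Bayes-rule combination of sensor likelihood and language prior is the same.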

Details are a bit sparse (it’s a research project, after all) but the actual hardware seems simple enough, based on the Dialog DA14695, a nice Cortex-M33-based Bluetooth Low Energy SoC. This is an interesting device in its own right, containing a “sensor node controller” block capable of servicing sensors connected to its interfaces independently of the main CPU. The sensor used is the Bosch BMA456 3-axis accelerometer, which stands out for its low current consumption of just 150 μA.
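Not much code is needed to get data out of such a part. As a rough illustration (not TapType’s firmware, which runs this job on the DA14695’s sensor node controller), here is a Python sketch that wakes a BMA456 and reads one sample over I2C from a Linux host; the register addresses follow the BMA456 datasheet as we read it, and they, along with the assumed ±4 g default range, should be treated as assumptions to verify:

```python
# Sketch: wake a BMA456 and read one X/Y/Z sample over I2C on a Linux
# host (e.g. a Raspberry Pi). TapType itself does this on the DA14695's
# sensor node controller; this only shows the shape of the job.
# Register addresses and the +/-4 g range are assumptions from the
# BMA456 datasheet; verify against your part.
import struct
import time

from smbus2 import SMBus

I2C_ADDR = 0x18        # BMA456 default I2C address (SDO pulled low)
REG_CHIP_ID = 0x00     # should read back 0x16
REG_ACC_CONF = 0x40    # output data rate / bandwidth
REG_DATA = 0x12        # six bytes: X, Y, Z as little-endian int16
REG_PWR_CTRL = 0x7D    # bit 2 enables the accelerometer
RANGE_G = 4.0          # assumed full-scale range in g

with SMBus(1) as bus:
    assert bus.read_byte_data(I2C_ADDR, REG_CHIP_ID) == 0x16, "not a BMA456?"
    bus.write_byte_data(I2C_ADDR, REG_PWR_CTRL, 0x04)   # accelerometer on
    bus.write_byte_data(I2C_ADDR, REG_ACC_CONF, 0xA8)   # 100 Hz, normal mode
    time.sleep(0.05)                                    # let it settle

    raw = bus.read_i2c_block_data(I2C_ADDR, REG_DATA, 6)
    x, y, z = struct.unpack("<hhh", bytes(raw))
    scale = RANGE_G / 32768.0                           # counts -> g
    print(f"x={x * scale:+.3f} g  y={y * scale:+.3f} g  z={z * scale:+.3f} g")
```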

Users can “type” on any suitable surface.

The wristband units themselves appear to be a combination of a main PCB hosting the BLE chip and support circuitry, connected to a flexible PCB with a pair of accelerometers on each end. The assembly is then slid into a flexible wristband, likely 3D printed in TPU, but that’s just a guess, as the path from the first development platform to the wearable prototype isn’t clear.

What is clear is that the wristband itself is just a dumb data-streaming device, with all the clever processing performed on the connected device. Training of the system (and subsequent selection of the most accurate classifier architecture) was done by recording volunteers typing on an A3-sized printed keyboard image, tracking finger movements with a motion capture camera while recording the acceleration data streams from both wrists. There are a few more details in the published paper for those interested in digging a little deeper into this research.
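That labelling pipeline is straightforward to picture in code. In the sketch below the sample rate, window length, and array layouts are all assumptions (the article doesn’t give them): each tap instant and finger label comes from the motion capture camera, and the matching slice of wrist acceleration becomes one training example:

```python
# Sketch of how mocap-labelled training examples could be assembled:
# for each tap the camera saw, cut out the acceleration window around
# it and pair it with the finger that tapped. Sample rate, window
# length, and array layouts are assumptions, not values from the paper.
import numpy as np

SAMPLE_RATE_HZ = 200                 # assumed accelerometer sample rate
WINDOW = int(0.1 * SAMPLE_RATE_HZ)   # 100 ms of samples around each tap

def make_examples(accel, tap_times, tap_fingers):
    """Pair acceleration windows with mocap-derived finger labels.

    accel:       (n_samples, 6) array, 3 axes per wrist
    tap_times:   tap instants in seconds, from the motion capture camera
    tap_fingers: which finger tapped, one label per tap
    """
    windows, labels = [], []
    for t, finger in zip(tap_times, tap_fingers):
        centre = int(t * SAMPLE_RATE_HZ)
        lo, hi = centre - WINDOW // 2, centre + WINDOW // 2
        if lo < 0 or hi > len(accel):
            continue                 # tap too close to the recording edge
        windows.append(accel[lo:hi])
        labels.append(finger)
    return np.stack(windows), np.array(labels)

# Toy usage: 10 s of fake data and three "taps" seen by the camera.
accel = np.random.randn(10 * SAMPLE_RATE_HZ, 6)
X, y = make_examples(accel, tap_times=[1.2, 4.7, 8.3],
                     tap_fingers=["index", "ring", "middle"])
print(X.shape, y)   # (3, 20, 6) ['index' 'ring' 'middle']
```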

Eagle-eyed readers may recall something similar from last year, from the same team, which combined bone-conduction sensing with VR-type hand tracking to generate input events in a VR environment.
