This Macro Keyboard Incorporates Machine Learning to Interpret Gestures

Jakob Krantz's DIY macro keyboard incorporates a number of unique features, including machine learning for gesture detection.

The beauty — and real power — of computing is automation. Computers make one-off tasks, such as performing a calculation, easy, but they truly shine at repetitive jobs. For example, you can use a well-designed spreadsheet template over and over again to save yourself dozens of hours of work every year. A macro, which is a series of automated actions like mouse clicks and copying text, can save you time and effort in a similar way. Macro keyboards are dedicated hardware devices that launch those macros. Jakob Krantz's DIY macro keyboard incorporates a number of unique features, including machine learning for gesture detection.

We have featured several other macro keyboards over the years, and they almost always contain a handful of configurable buttons for entering shortcuts or launching macros. Krantz's device has six buttons of that type, but it also has features we don't see often. Those include a capacitive touch bar for controlling a connected computer's volume, an OLED screen capable of showing information like current cryptocurrency prices or the device's status, and the real star of the show: a touchpad for drawing gestures.

Users can train a machine learning model to recognize whatever gestures they like. You could, for instance, train it to launch Microsoft Excel, open a spreadsheet, enter a selected value, and print that spreadsheet whenever you draw an "X" on the touchpad.
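To illustrate how a recognized gesture might kick off a macro, here is a minimal Python sketch of a dispatch table. All names here (the gesture labels and macro functions) are hypothetical stand-ins, not taken from Krantz's actual firmware, which triggers AutoHotkey scripts on the host computer.

```python
# Illustrative sketch: binding recognized gesture labels to macro actions.
# In the real device, each action would fire a host-side script rather
# than return a string.

def launch_excel_macro():
    # Placeholder for: launch Excel, open a spreadsheet, enter a value, print.
    return "launched Excel macro"

def mute_volume_macro():
    # Placeholder for a volume-mute shortcut.
    return "muted volume"

# Dispatch table: gesture label -> macro callable
GESTURE_ACTIONS = {
    "x_gesture": launch_excel_macro,
    "m_gesture": mute_volume_macro,
}

def handle_gesture(label):
    """Run the macro bound to a recognized gesture, or do nothing."""
    action = GESTURE_ACTIONS.get(label)
    return action() if action else None
```

A dispatch table like this keeps gesture recognition decoupled from the macros themselves, so retraining the model or rebinding an action doesn't require touching the other side.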

This macro keyboard's brain is an ESP32 development board. Its built-in Bluetooth radio lets the device connect to a computer as a standard Bluetooth HID keyboard. The display is a small SSD1306-based OLED screen with a resolution of 128x32. Both the volume control capacitive touch sensor and the gesture control touchpad sensor come from Bela's Trill series. The key switches are Cherry MX-compatible models with RGB LED backlighting, and they accept a range of keycaps.

Reading button presses and the one-dimensional volume slider is straightforward for the ESP32, but gesture recognition is a bit different. A TensorFlow Lite machine learning model monitors the coordinates coming from the touchpad. The resolution of the touchpad is scaled down to 28x28 to make patterns easier to recognize. Once the model is trained, it will register a gesture and launch the appropriate AutoHotkey script — just like what happens when the user presses a button.
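The downscaling step described above could be sketched roughly as follows. This is an illustrative Python version of the idea, not Krantz's actual code: raw touchpad samples are mapped onto a 28x28 occupancy grid, the kind of fixed-size input a small TensorFlow Lite model expects. The touchpad resolution used here is an assumption for the example.

```python
# Illustrative sketch: rasterize raw (x, y) touchpad samples into a
# 28x28 binary grid suitable as fixed-size model input.

GRID = 28

def rasterize(points, pad_width, pad_height):
    """Map raw touchpad samples onto a GRID x GRID occupancy grid."""
    grid = [[0] * GRID for _ in range(GRID)]
    for x, y in points:
        # Scale each raw coordinate into the range [0, GRID - 1].
        gx = min(int(x * GRID / pad_width), GRID - 1)
        gy = min(int(y * GRID / pad_height), GRID - 1)
        grid[gy][gx] = 1
    return grid

# Example: a stroke from one corner of a (hypothetical) 1024x1024
# touchpad to the other lights up the opposite corners of the grid.
stroke = rasterize([(0, 0), (1023, 1023)], 1024, 1024)
```

Collapsing the touchpad's native resolution this way throws away fine detail, but that is the point: coarse 28x28 patterns are far easier for a small on-device model to learn and match.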

The gesture and volume control make this project a lot more interesting than your typical macro keyboard, because they offer more flexibility. If you want to build your own, Krantz published all of his files, including those for 3D printing an enclosure, on GitHub.

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist.