I'm Sensing a Pattern
A pair of open source biosensing tools make the exploration of muscle movement data accessible for prosthetic control algorithm development.
Biosensing and machine learning are a match made in heaven. Much of the signaling in the human body is electrical, which means these signals can be measured with relatively simple sensing equipment. The meaning behind them, however, is often exceedingly difficult to interpret; untangling the interplay between numerous electrical impulses firing in parallel is beyond unaided human analysis. That is where machine learning comes in: these algorithms excel at recognizing complex patterns.
The modest set of tools required makes biosensing a surprisingly approachable field, even for the casual hobbyist. Both the sensing equipment and the hardware and software toolkits needed to run machine learning algorithms are well within reach of nearly anyone who is interested. Now another accessible sensing option has been developed by electronics enthusiast and Hackaday.io user TURFPTAx, who recently created the Open Muscle Sensor Bracelet (OM-12) and the Open Muscle Labeler (LASK5). These open source tools were created to support the exploration of muscle sensing and machine learning in prosthetic technologies.
The OM-12 Sensor Bracelet is a wristband equipped with a set of three cells, each containing an ESP32-C3 microcontroller, a Hall effect sensor, and a mechanical keyboard switch. These specialized cells detect even small changes in muscle pressure to reveal forearm muscle contractions in real time. Three more cells are equipped with power management and battery circuits. While the Sensor Bracelet is intended for experimental use only, it provides an effective platform for testing prosthetic control algorithms. MicroPython support further eases this development process.
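To make the sensing approach concrete, here is a minimal sketch of how readings from a Hall effect sensor cell might be turned into a contraction signal. The baseline, full-scale, and threshold values are purely illustrative assumptions, not the actual OM-12 calibration or firmware.

```python
# Hypothetical sketch: convert raw 12-bit ADC counts from a Hall effect
# sensor into a normalized deflection value, then flag a contraction when
# the deflection stays above a threshold. All constants are illustrative.

def normalize(raw, baseline=2048, full_scale=4095):
    """Map a raw ADC count to a 0.0-1.0 deflection relative to baseline."""
    return max(0.0, (raw - baseline) / (full_scale - baseline))

def detect_contraction(samples, threshold=0.15):
    """Return True if the mean normalized deflection exceeds the threshold."""
    values = [normalize(s) for s in samples]
    return sum(values) / len(values) > threshold

# Simulated windows of readings: resting vs. flexed forearm
resting = [2050, 2060, 2045, 2055]
flexed = [2600, 2700, 2650, 2620]
print(detect_contraction(resting))  # False: deflection near baseline
print(detect_contraction(flexed))   # True: sustained deflection
```

On the device itself, the same logic would run under MicroPython with the raw samples coming from the microcontroller's ADC rather than hard-coded lists.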
The LASK5 Labeler is a small device that works alongside the Sensor Bracelet by labeling the data that it produces. This annotation is necessary to prepare the data for analysis by a machine learning algorithm. The Labeler is also powered by an ESP32-S3 microcontroller and includes Hall effect sensors. A display and buttons form a simple user interface that further streamlines the labeling process. In practice, a user would leverage the Labeler to identify what they are doing (e.g., a specific muscle movement) as they collect biosensor data, an essential data point when training supervised learning algorithms.
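The labeling step described above can be sketched as pairing each window of bracelet readings with the gesture selected on the labeler, yielding the (features, label) rows a supervised learner needs. The field names and gesture label below are hypothetical, not the actual OM-12/LASK5 data format.

```python
# Hypothetical sketch of building a labeled dataset: each window of
# multi-cell sensor readings is tagged with the gesture the user selected
# on the labeler while the data was recorded.

def label_window(sensor_window, gesture):
    """Attach a gesture label to one window of multi-cell readings."""
    return {"features": list(sensor_window), "label": gesture}

# Simulated readings from three sensor cells over one short window
window = [(2100, 2400, 2050), (2150, 2500, 2060), (2200, 2550, 2070)]
dataset = [label_window(window, "index_finger_flex")]
print(dataset[0]["label"])  # index_finger_flex
```

Rows like these are exactly what a supervised classifier trains on: the features describe what the sensors saw, and the label records what the muscles were actually doing.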
TURFPTAx believes that these tools will be useful to others who are designing prosthetic control systems or other human-machine interfaces. If that happens to be you (or might be you in the future), this toolkit is worth checking out. Design files, schematics, and the bill of materials are all freely available on GitHub.