This Wearable Patch Could Give ALS Sufferers the Ability to Communicate More Effectively

MIT researchers developed a stretchable, skin-like device that can be attached to a patient’s face and can measure small movements.

Researchers estimate that each device would cost around $10. (📷: David Sadat)

Amyotrophic lateral sclerosis (ALS) is a particularly tragic disease that affects the nervous system and causes a loss of muscle control. Symptoms are often minor in the early stages, but ALS is progressive and they can get dramatically worse over time. There is no way to reverse the damage that has already been done, and current treatments can only slow the disease’s progression. In later stages, ALS often makes it difficult for people to communicate, as controlling the muscles needed to speak or move the limbs can become impossible. That’s why a team of researchers from MIT has developed a wearable patch equipped with sensors to help ALS sufferers communicate more effectively.

For many years, Stephen Hawking was probably the most recognizable face of ALS. Hawking lost his ability to speak in 1985 and began using a computer to synthesize speech. Originally, he controlled that computer with a hand clicker, but as the disease progressed he could no longer control his thumbs well enough to operate even that and switched to a special device with a “cheek switch.” Unfortunately, many people with ALS don’t have enough facial control to take advantage of similar devices. This patch, when applied to a patient’s cheek, is able to detect even the smallest facial movements. Those slight muscle twitches are recognized and translated into a limited but usable form of communication.

The patch is made from a thin silicone film that is flexible and stretchable. It’s comfortable to wear for extended periods of time and can even be covered in makeup, making it practically invisible. The patch is embedded with four piezoelectric sensors that detect even slight deformations of the skin caused by facial movement. The team trained a machine learning model to infer the user’s facial expression from those sensor readings. At this time, it can differentiate between three distinct expressions: a smile, an open mouth, and pursed lips. Users could make combinations of facial expressions to communicate more complex ideas. The patch was tested on two people with ALS, and it recognized those expressions with 75 percent accuracy. Most importantly, this patch is affordable. It would only cost about $10, making it much more accessible than other technologies designed for similar purposes.
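The researchers haven’t published their exact pipeline here, but the general idea of mapping a few piezoelectric channels to an expression label can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration: it fakes windowed four-channel voltage traces, pulls simple per-channel features, and trains an off-the-shelf SVM classifier. The waveforms, feature choices, and model are my own assumptions for illustration, not the MIT team’s actual method.

```python
# Hypothetical sketch: classifying three facial expressions from four
# piezoelectric sensor channels. The sensor data is synthetic; a real
# system would record voltage traces from the patch instead.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
EXPRESSIONS = ["smile", "open_mouth", "pursed_lips"]
N_SENSORS, WINDOW = 4, 100  # four channels, 100 samples per gesture window

def make_window(label_idx):
    """Fake a 4-channel voltage window whose shape depends on the expression."""
    t = np.linspace(0, 1, WINDOW)
    base = np.sin(2 * np.pi * (label_idx + 1) * t)            # class-specific waveform
    signal = np.stack([base * (s + 1) * 0.5 for s in range(N_SENSORS)])
    return signal + rng.normal(scale=0.3, size=signal.shape)  # add sensor noise

def features(window):
    """Simple per-channel features: peak amplitude and mean signal energy."""
    return np.concatenate([window.max(axis=1), (window ** 2).mean(axis=1)])

# Build a labeled dataset of feature vectors, one row per gesture window.
X = np.array([features(make_window(i % 3)) for i in range(300)])
y = np.array([i % 3 for i in range(300)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On real recordings, the hard parts are segmenting gesture windows from a continuous signal and coping with how the patch sits slightly differently on the skin each day, which is where a learned model earns its keep over fixed thresholds.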

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist. Check out my YouTube channel: Serial Hobbyism