Can Edge Devices See, Hear and Think?

TinyML is a cyborg fantasy come true for edge devices. Innovations from Google, Arm, Edge Impulse, and others are game changers for edge computing.

Every second, hundreds of new things connect to the Internet, and millions more come online every month. The exact figure is hard to pin down, but there are billions of edge devices in the world, and the number keeps growing. These single-purpose electronics are generally underwhelming yet essential to our lives, working unnoticed in our cars, walls, phones, medical devices, and agricultural machines. Most of them are virtually deaf and blind, relying on external sensors for data and unable to perform the pattern recognition that would be easy for a computer that can think, see, and hear. The future of these billions of edge devices is about to change drastically, as companies such as Google and Edge Impulse give basic hardware eyes, ears, and brains using Tiny Machine Learning (TinyML).

Intelligence on the edge

Computer vision and audio processing began some 70 years ago, with early neural networks detecting objects and sorting them by shape and sound. Some of the first computer vision applications interpreted written text using optical character recognition (OCR). Energy efficiency has long been a constraint on deploying intelligent sensing in battery-powered systems, and for years companies have searched for the most energy-efficient technologies for detecting objects in images and sound. To solve this problem, companies like Arm and Ambiq Micro are changing how power is managed, eliminating the “always-on” power leakage of traditional microcontrollers. This approach delivers energy profiles measured in nanoamps and is complemented by TinyML models that detect wake words, recognize changes in sensor events, and more.
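To make that duty-cycle pattern concrete, here is a minimal C++ sketch of how a nanoamp-class device might spend almost all of its time asleep and wake only to run a tiny wake-word model. Every function name here (enter_deep_sleep_until_interrupt, read_audio_window, run_wake_word_model) is a hypothetical placeholder standing in for a vendor SDK, not any real Arm or Ambiq API:

```cpp
#include <array>
#include <cstdint>
#include <cstdio>

// Hypothetical constants: a one-second window of 16 kHz mono audio.
constexpr int kSampleRate = 16000;
constexpr int kWindowSize = kSampleRate;

// Placeholder for a vendor deep-sleep call, where the MCU draws only
// nanoamps until a timer or sensor interrupt wakes it.
void enter_deep_sleep_until_interrupt() { /* vendor SDK call goes here */ }

// Placeholder for filling a buffer from the on-chip microphone.
void read_audio_window(std::array<std::int16_t, kWindowSize>& buffer) {
    buffer.fill(0);  // real firmware would DMA samples from the mic
}

// Placeholder for a TinyML wake-word model; returns a 0..1 confidence.
float run_wake_word_model(const std::array<std::int16_t, kWindowSize>& buffer) {
    (void)buffer;
    return 0.0f;  // real firmware would run a quantized neural network here
}

int main() {
    std::array<std::int16_t, kWindowSize> window;
    while (true) {
        // 1. Sleep at nanoamp power levels until an interrupt fires.
        enter_deep_sleep_until_interrupt();
        // 2. Capture one window of audio and run the tiny model.
        read_audio_window(window);
        float confidence = run_wake_word_model(window);
        // 3. Only power up the expensive parts of the system on a hit.
        if (confidence > 0.8f) {
            std::printf("Wake word detected (%.2f); waking main system\n",
                        confidence);
        }
    }
}
```

The point of the pattern is that the neural network never keeps the device awake: inference happens in short bursts, and the default state is deep sleep.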

Take Edge Impulse, for example, a Silicon Valley startup building solutions that retrofit existing edge devices with computer vision, audio processing, and machine learning models. Embedded developers can collect image and sound data from tiny components, use it to interpret the world, and deploy models back to the device, enabling it to think and act in ways that are less binary and more neural. Edge Impulse can capture up to 90% of the sensor data that edge devices currently discard unnoticed and organize it into an operational model that is deployed back to the device to create a new set of rules and actions. This means a whole new class of devices, measured in the billions, can now accurately predict what they are seeing and hearing, supporting basic human-like cognition: “Has a fragile piece of machinery been dropped in a way that could have damaged it?” “Is this piece of hospital equipment moving on the right path and being used where and how it should be?” “Can we improve the positioning accuracy of assets by combining GPS and accelerometer data with position history using ML?” Giving billions of simple devices the ability to answer yes-or-no questions like these will radically change the way we interact with our hardware and our world.
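As a rough illustration of the “has this been dropped?” question, here is a hedged C++ sketch. The feature it uses, a stretch of near-zero g (free fall) followed by an impact spike, is a hand-written heuristic chosen for clarity, not Edge Impulse’s actual method; a deployed TinyML classifier would learn this signature from labeled accelerometer data instead:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// One 3-axis accelerometer reading, in g.
struct Sample { float x, y, z; };

// Magnitude of the acceleration vector; roughly 1.0 g at rest.
float magnitude(const Sample& s) {
    return std::sqrt(s.x * s.x + s.y * s.y + s.z * s.z);
}

// Illustrative heuristic, not a trained model: a drop looks like
// free fall (near-zero g) followed by a sharp impact spike.
bool looks_like_a_drop(const std::vector<Sample>& window) {
    bool saw_free_fall = false;
    for (const Sample& s : window) {
        float g = magnitude(s);
        if (g < 0.3f) saw_free_fall = true;          // free fall
        if (saw_free_fall && g > 3.0f) return true;  // then impact
    }
    return false;
}

int main() {
    // Fabricated window for demonstration: rest, free fall, impact.
    std::vector<Sample> window = {
        {0.0f, 0.0f, 1.0f},  // at rest
        {0.0f, 0.0f, 0.1f},  // falling
        {0.5f, 0.2f, 4.5f},  // impact
    };
    std::printf("Dropped? %s\n", looks_like_a_drop(window) ? "yes" : "no");
}
```

A learned model replaces the two magic thresholds with a classifier trained on real drops and real normal handling, which is what makes the yes-or-no answer accurate enough to act on.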

Now more than ever, intelligent devices can keep us healthy

In healthcare, computer vision has been used for years. Did you know that 90% of all medical data is image-based? From routine diagnostics, blood monitoring, ultrasound, and three-dimensional radiological images to the tiny cameras used in arthroscopic surgery, computer vision is paramount in modern healthcare. So imagine the new opportunities opened up by low-powered, tiny devices that can be used as injectables: devices that live inside the human body and provide insight in ways that not even the human visual system can deliver, like the tracking bug in The Matrix.

Consider this scenario: right now, there is a serious need for inexpensive, easily deployable solutions for the early detection of COVID-19 and flu symptoms, solutions that could have an immediate impact on millions of lives around the world. In a recent Hackster.io showcase, developers used TinyML on a simple Arm Cortex-M4 CPU to detect unusual coughing as a first line of defense for COVID-19 containment, sampling acoustic changes in a patient’s cough. Today, you can use the same technology plus a low-cost OpenMV camera to assess visual signals in addition to sound, adding more detail and accuracy to the models.
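For a sense of what “sampling acoustic changes” in a cough might involve, here is a hedged C++ sketch. Short-time energy and zero-crossing rate are classic audio descriptors used here for illustration only; the Hackster.io project ran a trained neural network on the Cortex-M4 rather than hand-written rules like these:

```cpp
#include <cstdio>
#include <vector>

// Short-time energy of an audio frame: loud, percussive events
// such as coughs produce sudden energy spikes.
float frame_energy(const std::vector<float>& frame) {
    float sum = 0.0f;
    for (float s : frame) sum += s * s;
    return sum / frame.size();
}

// Zero-crossing rate: coughs are noisy and broadband, so they cross
// zero more often than tonal sounds like speech vowels.
float zero_crossing_rate(const std::vector<float>& frame) {
    int crossings = 0;
    for (std::size_t i = 1; i < frame.size(); ++i)
        if ((frame[i - 1] < 0.0f) != (frame[i] < 0.0f)) ++crossings;
    return static_cast<float>(crossings) / frame.size();
}

// Illustrative stand-in for a trained classifier: flags frames that
// are both loud and broadband. A real deployment would feed learned
// features into a neural network running on the microcontroller.
bool looks_like_a_cough(const std::vector<float>& frame) {
    return frame_energy(frame) > 0.1f && zero_crossing_rate(frame) > 0.2f;
}

int main() {
    // Fabricated frame: alternating loud samples, crudely cough-like.
    std::vector<float> frame(256);
    for (std::size_t i = 0; i < frame.size(); ++i)
        frame[i] = (i % 2 == 0) ? 0.8f : -0.8f;
    std::printf("Cough-like frame? %s\n",
                looks_like_a_cough(frame) ? "yes" : "no");
}
```

The appeal of features this cheap is that they fit comfortably in the compute and power budget of a Cortex-M4-class part, leaving headroom for the model itself.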

TinyML is the wild cyborg fantasy for edge devices

Until now, low-power, single-purpose edge devices didn’t get much attention because we never expected them to do more than a few basic things. But with TinyML and innovations from Edge Impulse, Google, Arm, and others, edge computing with machine vision and hearing can be adapted to countless industry scenarios. With billions of microcontrollers in the world today, and that number growing fast every year, tiny devices will come to act as an extension of our thoughts, feelings, and emotions, a natural part of everyday life. Buckle up!

Adam Benzion
Adam Benzion is a serial entrepreneur, writer, and investor. He co-founded Hackster.io, where he served as CEO, and is now CXO at Edge Impulse.