It's Alive!

Using biological principles like cell plasticity and gene regulatory networks, researchers have developed energy-efficient tinyML models.

Nick Bild
Machine Learning & AI

TinyML is a rapidly growing field that focuses on implementing machine learning algorithms on extremely small, low-power devices. As the demand for smart and connected devices grows, there is an increasing need for machine learning capabilities to be embedded directly into these devices, ranging from wearables and IoT devices to medical implants and environmental sensors. TinyML aims to make this possible by optimizing algorithms and models to run efficiently on devices with limited computational resources, memory, and energy.
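As a concrete illustration of that optimization step, one common technique for shrinking a model to fit this class of hardware is post-training quantization. The sketch below uses TensorFlow Lite's converter on a small Keras model; the model itself is a hypothetical stand-in, not one from the research discussed here:

```python
# A minimal sketch of post-training quantization with TensorFlow Lite,
# one common way to shrink a model for tinyML-class hardware.
import tensorflow as tf

# A tiny, hypothetical model standing in for a real trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert to a TensorFlow Lite flatbuffer with default optimizations,
# which quantizes weights to reduce model size and inference cost.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Write the compact model out for deployment to a microcontroller.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```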

Important applications of tinyML span numerous domains. In healthcare, for example, tinyML enables the development of wearable devices capable of monitoring vital signs, detecting anomalies, and even predicting health-related events such as seizures or cardiac arrhythmias in real time. In agriculture, tinyML-powered sensors can monitor soil conditions, crop health, and environmental factors to optimize farming practices and maximize yields. Additionally, tinyML finds applications in industrial IoT for predictive maintenance, in smart homes for energy management and security, and in wildlife conservation for tracking and monitoring endangered species.

By running machine learning algorithms directly on-device, tinyML reduces the need for data transmission to centralized servers, thereby minimizing energy consumption associated with wireless communication. Furthermore, optimized algorithms and hardware implementations ensure that computations are performed efficiently, leading to longer battery life and reduced environmental impact. This energy efficiency is particularly crucial for battery-powered and resource-constrained devices, where prolonged operation is essential.
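The pattern is easy to picture in code. In this hedged sketch, read_sensor, tiny_model, and transmit are hypothetical stand-ins for device-specific functions: the device runs inference locally and only keys the radio when a reading crosses a threshold, rather than streaming raw sensor data to a server:

```python
# An illustrative sketch of on-device inference: only rare, interesting
# events leave the device, which keeps the power-hungry radio mostly idle.
import random

THRESHOLD = 0.8  # assumed anomaly score cutoff (hypothetical)

def read_sensor():
    """Stand-in for reading a real sensor over I2C/SPI."""
    return [random.random() for _ in range(4)]

def tiny_model(sample):
    """Stand-in for running an embedded ML model on the sample."""
    return sum(sample) / len(sample)

def transmit(event):
    """Stand-in for the wireless radio; transmitting costs energy."""
    print("radio ->", event)

for _ in range(10):
    sample = read_sensor()
    score = tiny_model(sample)
    if score > THRESHOLD:  # transmit only when inference flags an event
        transmit({"anomaly_score": round(score, 3)})
```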

Despite the rapid progress and significant achievements in tinyML, the artificial neural networks used in these systems are still far less efficient and capable than biological systems. For this reason, a team led by researchers at the University of Nebraska-Lincoln and the University of Cambridge has been exploring how certain biological systems can be leveraged to perform computations. They previously investigated gene regulatory networks and found that, when supplied with certain chemical inputs, these networks can produce specific outputs, such as proteins.

It was shown that these systems could be used to perform some types of computations, and that these computations could aid in decision-making processes. But that previous work ignored a crucial component of the organisms being studied: cell plasticity. Cell plasticity allows cells to alter themselves in response to environmental cues. The team found that this capability could be leveraged to build artificial neural network-like systems.

This process still relies on gene regulatory networks, so no, the artificial neural network is not built into a natural neural network. Rather, by supplying inputs, in the form of chemical signals, genes can be stimulated to increase (or suppress) their transcription. This leads to a cascade of events, in which transcription factors (which act like the weights of a silicon neural network) modulate the action of yet other genes. The system acts like a large, many-layered neural network and produces outputs in the form of proteins and nucleic acids.
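To make the analogy concrete, here is a toy numerical sketch. All weights and signal levels are invented for illustration only: chemical inputs drive a first set of genes, transcription-factor strengths play the role of the weights, and the cascade behaves like a small feedforward network producing a final readout:

```python
# A hedged, illustrative mapping of a gene regulatory cascade onto a
# feedforward network. Every number here is hypothetical.
import numpy as np

def layer(expression, tf_weights):
    """One stage of the cascade: transcription factors (the 'weights')
    modulate downstream gene expression with a sigmoidal response."""
    return 1 / (1 + np.exp(-(tf_weights @ expression)))

chemical_inputs = np.array([0.9, 0.1, 0.4])  # input chemical signal levels

# Hypothetical transcription-factor strengths between gene "layers."
W1 = np.array([[ 1.2, -0.7, 0.3],
               [-0.5,  0.8, 0.9]])
W2 = np.array([[ 0.6, -1.1]])

hidden = layer(chemical_inputs, W1)  # first cascade of regulated genes
protein_output = layer(hidden, W2)   # final protein/nucleic-acid readout
print(protein_output)
```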

This is an interesting observation, but perhaps not especially relevant to tinyML in and of itself. But it was also demonstrated that a “trained” network capable of performing a specific, useful task can be found by searching through subsets of genes present in a regulatory network, through a process something like a traditional Neural Architecture Search. This search was facilitated by observing the chemical and temporal plasticity of cells, which alters expression pathways in a way that highlights relevant subnetworks.
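That kind of search might be pictured roughly like the hedged sketch below, which randomly samples gene subsets from a simulated regulatory network and keeps the subnetwork whose output best fits a regression target. The dynamics, interaction matrix, and scoring are all hypothetical simplifications, not the researchers' actual pipeline:

```python
# An illustrative random search over gene subnetworks, loosely analogous
# to a neural architecture search. All dynamics here are toy models.
import numpy as np

rng = np.random.default_rng(0)

N_GENES = 12  # genes in the full (hypothetical) regulatory network
W = rng.normal(0, 1, (N_GENES, N_GENES))  # toy interaction strengths

def steady_state(genes, stimulus, steps=50):
    """Iterate toy regulatory dynamics restricted to a gene subset."""
    x = np.zeros(N_GENES)
    x[genes[0]] = stimulus                   # chemical input on one gene
    for _ in range(steps):
        drive = W[np.ix_(genes, genes)] @ x[genes]
        x[genes] = 1 / (1 + np.exp(-drive))  # sigmoidal expression response
        x[genes[0]] = stimulus               # hold the input clamped
    return x[genes[-1]]                      # read out the last gene's level

# Target: a simple regression problem on scalar inputs (hypothetical).
inputs = np.linspace(-2, 2, 20)
targets = 1 / (1 + np.exp(-inputs))

best_genes, best_err = None, np.inf
for _ in range(200):  # random subnetwork search
    genes = rng.choice(N_GENES, size=5, replace=False)
    preds = np.array([steady_state(genes, s) for s in inputs])
    err = np.mean((preds - targets) ** 2)
    if err < best_err:
        best_genes, best_err = genes, err

print(f"best subnetwork: {sorted(best_genes)}  MSE: {best_err:.4f}")
```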

These techniques were used to demonstrate the feasibility of creating regression models. It was also shown that these biologically-based networks were more energy-efficient than both traditional von Neumann and even neuromorphic computing architectures. There is still much more work to be done before our artificial neural networks approach their natural counterparts, but the future possibilities are intriguing.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.