Arm has announced another push into the on-device edge machine learning and artificial intelligence markets with a new pair of core intellectual property (IP) offerings — including the company's first "microNPU" neural networking co-processor aimed at microcontrollers and low-power microprocessors, the Ethos-U55.
"Enabling AI everywhere requires device makers and developers to deliver machine learning locally on billions, and ultimately trillions of devices," explains Dipti Vachani, senior vice president and general manager for the Automotive and IoT Line of Business at Arm. "With these additions to our AI platform, no device is left behind as on-device ML on the tiniest devices will be the new normal, unleashing the potential of AI securely across a vast range of life-changing applications."
Key to this is the Ethos-U55, first in the company's new range of "microNPU" neural networking accelerator co-processors for microcontrollers. Combined with the newly-announced Cortex-M55, Arm claims the Ethos-U55 can deliver a 480x increase in machine learning performance over the company's existing Cortex-M range.
The Cortex-M55, meanwhile, is no slouch: The company's latest low-power microprocessor core is the first to be based on the Armv8.1-M architecture, incorporating the company's Helium vector processing instructions for a claimed 15x increase in machine learning performance — without the aid of the Ethos-U55 — and a fivefold boost for digital signal processing (DSP) over existing Cortex-M core IP.
Arm has confirmed that both the Cortex-M55 and Ethos-U55 are fully compatible at launch with the company's existing Cortex-M software toolchain, while integration and optimisation for machine learning frameworks are ongoing, with TensorFlow Lite Micro the first to benefit.