A team of researchers has published a paper demonstrating a parallel ultra-low power (PULP) processor and convolutional neural network (CNN) that can give an off-the-shelf Crazyflie 2.1 nano-drone "top-notch autonomous navigation capability," despite its tiny size and weight.
"Artificial intelligence-powered pocket-sized air robots have the potential to revolutionize the Internet of Things ecosystem, acting as autonomous, unobtrusive, and ubiquitous smart sensors," the team claims in the paper's abstract. "With a few cm2 form factor, nano-sized unmanned aerial vehicles (UAVs) are the natural befit for indoor human-drone interaction missions, as the pose estimation task we address in this work."
"However, this scenario is challenged by the nano-UAVs' limited payload and computational power that severely relegates the onboard brain to the sub-100 mW microcontroller unit-class. Our work stands at the intersection of the novel parallel ultra-low-power (PULP) architectural paradigm and our general development methodology for deep neural network (DNN) visual pipelines, i.e., covering from perception to control."
The team's work concentrates on an existing commercial parallel ultra-low power (PULP) processor: GreenWaves Technologies' GAP8. Designed for edge-AI workloads, the chip includes nine processor cores based on the free and open-source RI5CY implementation of the RV32IMC RISC-V instruction set architecture, plus an extension, XpulpV2, which adds hardware loops, post-increment addressing for load and store operations, single instruction multiple data (SIMD) vector arithmetic, and dot-product instructions for eight- and 16-bit data.
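To see why those dot-product instructions matter for a CNN, consider what a single one does: it multiplies several packed narrow operands and accumulates the sum in one step, which is exactly the inner loop of a convolution. The sketch below models that arithmetic in plain Python; the function names are our own illustration, not GAP8 SDK or ISA identifiers.

```python
# Illustration only: models the arithmetic of an XpulpV2-style 8-bit SIMD
# dot-product step, in which a single instruction multiplies four packed
# signed bytes against four others and accumulates the sum into a 32-bit
# register. Names here are hypothetical, not real ISA mnemonics.

def sdot8(acc: int, a: list, b: list) -> int:
    """Accumulate the dot product of two 4-lane int8 vectors into acc."""
    assert len(a) == len(b) == 4
    return acc + sum(x * y for x, y in zip(a, b))

def dot_int8(a: list, b: list) -> int:
    """Dot product of int8 vectors, four lanes per 'instruction'."""
    acc = 0
    for i in range(0, len(a), 4):
        acc = sdot8(acc, a[i:i + 4], b[i:i + 4])
    return acc
```

On a scalar RV32IMC core, each group of four lanes would cost four multiplies plus several adds; collapsing them into one multiply-accumulate instruction is a large share of the speedup such edge-AI cores claim on quantized networks.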
On this chip, the team runs a custom convolutional neural network (CNN) called PULP-Frontnet. Inspired by the Proximity network, PULP-Frontnet is tailored for low-power use and works with the QVGA grayscale camera built into the target Crazyflie 2.1 nano-drone. Despite the constraint of keeping the whole system within the GAP8's resources, performance proved impressive: the network runs at up to 135 frames per second with a power draw of just 86 mW and a memory footprint of just 184 kB, giving the drone "top-notch autonomous navigation capability," even to the point of being able to estimate the pose of a human wandering around the flight zone.
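Those headline numbers imply a tight per-frame budget. A quick back-of-envelope check, using only the figures quoted above, shows what the chip has to achieve per inference:

```python
# Back-of-envelope check of the quoted figures: 135 fps at 86 mW.
fps = 135         # peak frames per second (quoted above)
power_w = 0.086   # power draw in watts (86 mW, quoted above)

latency_ms = 1000.0 / fps          # time budget per frame
energy_mj = power_w / fps * 1000   # energy per inference, in millijoules

print(f"per-frame latency: {latency_ms:.2f} ms")  # ~7.41 ms
print(f"energy per frame:  {energy_mj:.2f} mJ")   # ~0.64 mJ
```

In other words, each pose estimate must complete in under 7.5 ms and costs well under a millijoule, which is what makes continuous onboard inference plausible on a battery measured in single-digit watt-hours.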
"[PULP-Frontnet was] controlling the robot to stay at a constant distance in front of them," the team concludes. "Solving this HDI [human-drone interaction] problem on an autonomous nano-drone is a challenging and valuable task in the IoT domain. These robotic helpers can be envisioned as the next-generation ubiquitous IoT devices, ideal for indoor operations near humans."
The team's work is now available under open-access terms on arXiv.org.