Researchers Turn to the Human Brain for Solutions to Moore's Law's Slowdown in Modern Computing

With Moore's Law stalling, Suhas Kumar and Jack Kendall argue that brain-like neuromorphic computing will become the dominant paradigm.

Kumar and Kendall claim that neuromorphic primitives are necessary for the future of computing (📷: Kumar et al)

Researchers from Rain Neuromorphics and Hewlett Packard Labs have released a paper describing what they call "the building blocks of a brain-inspired computer," which they position as a novel approach to overcoming the limits now constraining Moore's Law.

Coined as an observation by Intel co-founder Gordon Moore, Moore's Law describes the trend, turned must-follow rule of the semiconductor industry, for the number of transistors on a leading-edge part to double roughly every 18 months. For decades Moore's Law held true, but as semiconductor process nodes shrink into the single-digit nanometres the physics of ever-smaller transistors has made that doubling increasingly difficult to sustain, putting the future of high-performance computing at risk unless solutions can be found.
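For a sense of how quickly that doubling compounds, the back-of-the-envelope Python sketch below tracks a hypothetical billion-transistor part over a decade at the 18-month cadence; the starting figure is purely illustrative and not drawn from any real device.

```python
# Illustration of the doubling described above: a hypothetical
# one-billion-transistor part, doubling every 18 months, compounds to
# roughly a 100x increase over a decade.
transistors = 1_000_000_000   # illustrative starting point, not a real part
months_per_doubling = 18

for year in range(0, 10, 3):
    doublings = (year * 12) / months_per_doubling
    print(f"year {year:2d}: ~{transistors * 2 ** doublings:.2e} transistors")
```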

"The future of computing will not be about cramming more components on a chip," claims Hewlett Packard Labs' Suhas Kumar of the approach described in a paper written with Rain Neuromorphic's Jack Kendall, "but in rethinking processor architecture from the ground up to emulate how a brain efficiently processes information."

"Solutions have started to emerge which replicate the natural processing system of a brain," adds co-author Kendall, "but both the research and market spaces are wide open."

"Computers have undergone tremendous improvements in performance over the last 60 years, but those improvements have significantly slowed down over the last decade, owing to fundamental limits in the underlying computing primitives," the pair's paper explains in its abstract. "However, the generation of data and demand for computing are increasing exponentially with time. Thus, there is a critical need to invent new computing primitives, both hardware and algorithms, to keep up with the computing demands.

"The brain is a natural computer that outperforms our best computers in solving certain problems, such as instantly identifying faces or understanding natural language. This realisation has led to a flurry of research into neuromorphic or brain-inspired computing that has shown promise for enhanced computing capabilities."

The pair aren't the only ones who think neuromorphic brain-like primitives could be the future: BrainChip showcased its own Akida neuromorphic processor at the Linley Fall Conference late last year, promising considerable efficiency gains for spiking neural network (SNN) workloads over traditional processor types, while Intel has been working on its own Loihi neuromorphic processor.
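For readers unfamiliar with the spiking neural network workloads that chips like Akida and Loihi target, the short Python sketch below simulates a single leaky integrate-and-fire neuron, the basic building block of an SNN; the parameter values are illustrative assumptions rather than figures taken from the paper or either chip.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron, the basic unit of a
# spiking neural network (SNN). All parameters are illustrative
# assumptions, not values from the paper or any particular processor.
def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Return the membrane-potential trace and spike times for an input current trace."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward rest while
        # accumulating the injected current.
        v += (dt / tau) * (v_rest - v) + dt * i_in
        if v >= v_threshold:          # crossing the threshold emits a spike
            spikes.append(step * dt)
            v = v_reset               # reset the potential after spiking
        trace.append(v)
    return np.array(trace), spikes

# A constant drive over 100 ms produces a regular spike train.
trace, spikes = simulate_lif(np.full(100, 60.0))
print(f"{len(spikes)} spikes in 100 ms")
```

Unlike the dense multiply-accumulate operations of conventional deep learning, computation here happens only when a neuron spikes, which is the sparsity neuromorphic hardware exploits for its claimed efficiency gains.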

Kumar and Kendall's paper, however, doesn't describe any single complete solution; instead, it's designed to act as "both a guidebook for newcomers to the field to determine which new directions to pursue" and "inspiration for those looking for new solutions to the fundamental limits of ageing computing paradigms," while also predicting that neuromorphic computing will become dominant by the mid-2020s.

The paper is now available under open access in the journal Applied Physics Reviews.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.