Researchers Propose an All-Optical 100GHz Computer for Energy-Efficient Artificial Intelligence

The looming end of Moore's Law is prompting a new approach to high-performance computing, with potential for generative AI breakthroughs.

Gareth Halfacree
2 months ago • Machine Learning & AI / HW101

Researchers at the California Institute of Technology, NTT Research, and the University of Central Florida have offered a glimpse of one potential future for high-performance computing and energy-efficient generative artificial intelligence (gen AI): all-optical systems with clock speeds in excess of 100GHz.

"A computer's clock rate ultimately determines the minimum time between sequential operations or instructions. Despite exponential advances in electronic computer performance owing to Moore's Law and increasingly parallel system architectures, computer clock rates have remained stagnant at ∼5GHz for almost two decades," the researchers write in the abstract to their paper. "This poses an intractable problem for applications requiring real-time processing or control of ultra-fast information systems. Here we break this barrier by proposing and experimentally demonstrating computing based on an end-to-end and all-optical recurrent neural network harnessing the ultra-fast nature of linear and non-linear optical operations while avoiding electronic operations."

Moore's Law is named for Intel co-founder Gordon Moore, who observed, even in the company's early days, that the number of transistors on leading-edge microprocessors tended to double roughly every 18 months. That observation became a target for the semiconductor industry, but as transistor counts grew, component sizes had to shrink to accommodate them; now we're hitting hard physical limits that could derail Moore's Law and leave computer performance on a plateau.

There are any number of research projects underway to address this potential brick wall, but the team's work takes an interesting approach: ditching electricity altogether in favor of light-based optical computing. "The all-optical computer realizes linear operations, non-linear functions, and memory entirely in the optical domain with >100GHz clock rates," the team claims of its proposal's ultimate goal.
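Those three ingredients map loosely onto the standard recurrent neural network update: a linear transform of the input and the stored state, a non-linear activation, and a state vector that carries memory between clock cycles. As a rough software analogy only (not the team's actual optical implementation), here is a minimal sketch in plain Python, with all sizes and weights chosen arbitrarily for illustration:

```python
import math
import random

random.seed(0)

# Arbitrary illustrative sizes: 4 input channels, 8 hidden (memory) channels.
N_IN, N_HID = 4, 8
W_in = [[random.gauss(0, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
W_rec = [[random.gauss(0, 0.3) for _ in range(N_HID)] for _ in range(N_HID)]

def rnn_step(x, h):
    """One 'clock cycle': linear operations, a non-linear function, memory.

    In the optical proposal those roles would be played by linear optical
    transforms, a non-linear optical element, and optical feedback/delay;
    here they are ordinary floating-point maths.
    """
    return [math.tanh(sum(w * v for w, v in zip(W_in[i], x)) +
                      sum(w * v for w, v in zip(W_rec[i], h)))
            for i in range(N_HID)]

h = [0.0] * N_HID                   # the memory starts empty
for _ in range(5):                  # five sequential input samples
    x = [random.gauss(0, 1) for _ in range(N_IN)]
    h = rnn_step(x, h)              # state carries over between steps

print(len(h))  # 8
```

In an electronic computer, each pass through `rnn_step` is bounded by the ~5GHz clock; the optical proposal's claim is that the equivalent physical operations can cycle at more than 100GHz.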

"We experimentally demonstrate a prototypical task of noisy waveform classification as well as perform ultra-fast in-situ analysis of the soliton states from integrated optical microresonators. We further illustrate the application of the architecture for generative artificial intelligence based on quantum fluctuations to generate images even in the absence of input optical signals. Our results highlight the potential of all-optical computing beyond what can be achieved with digital electronics by utilizing ultrafast linear, non-linear, and memory functions and quantum fluctuations."
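The generative mode described in the quote, producing images with no input optical signal at all, has a loose classical analogy: feed a fixed non-linear network nothing but random noise and read off its output. The sketch below is purely illustrative; Gaussian pseudo-random numbers stand in for the quantum fluctuations the paper harnesses, and the hypothetical weight matrix `W` stands in for a trained optical network:

```python
import math
import random

random.seed(1)

# Hypothetical sizes: a 36-value "image" generated from 8 noise channels.
N_NOISE, N_PIX = 8, 36
W = [[random.gauss(0, 0.4) for _ in range(N_NOISE)] for _ in range(N_PIX)]

def generate():
    """Produce an output with no input signal: the only drive is noise.

    In the optical system the randomness would come from quantum (vacuum)
    fluctuations rather than a software pseudo-random generator.
    """
    z = [random.gauss(0, 1) for _ in range(N_NOISE)]        # noise seed
    return [math.tanh(sum(w * v for w, v in zip(row, z)))   # output values
            for row in W]

img = generate()
print(len(img))  # 36
```

Each call draws a fresh noise seed, so repeated calls yield different outputs, which is the essence of noise-driven generation.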

A preprint detailing the team's work is available on Cornell's arXiv server; the researchers admit, however, that the project is very much at the proof-of-concept stage, with no roadmap yet provided towards commercialization.
