
Review looks at roadmap for neuromorphic computing at scale

January 24, 2025


Leading European researchers have been part of a key project looking at how neuromorphic computing can scale up to address the energy consumption of AI.

Prof Steve Furber of the University of Manchester, UK, Hector Gonzalez of SpiNNcloud Systems in Dresden, Germany, Cliff Young of Google DeepMind in London and Melika Payvand of the Institute of Neuroinformatics at the University of Zürich and ETH Zürich are among the team of 24 that looked at the roadmap for neuromorphic computing, along with Christian Mayr of the Technische Universität Dresden.

“With this research field at a critical juncture, it is crucial to chart the course for the development of future large-scale neuromorphic systems,” says the review. “We do not anticipate that there will be a one-size-fits-all solution for neuromorphic systems at scale but rather a range of neuromorphic hardware solutions with different characteristics based on application needs.”

  • World’s largest neuromorphic supercomputer in Germany aims at 10bn neurons
  • US sets up science and technology advisory council

The review also includes researchers from Intel Labs, which developed the Loihi neuromorphic processor used by Mercedes for automotive applications and in a large neuromorphic system.

“Neuromorphic computing is at a pivotal moment, reminiscent of the AlexNet-like moment for deep learning,” said Dhireesha Kudithipudi of the University of Texas at San Antonio. “We are now at a point where there is a tremendous opportunity to build new architectures and open frameworks that can be deployed in commercial applications. I strongly believe that fostering tight collaboration between industry and academia is the key to shaping the future of this field. This collaboration is reflected in our team of co-authors.”

The paper addresses the key points needed to achieve true scalability, including heterogeneous integration, event-based computation and communication, and efficient software to handle real-world complexity.
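To make the event-based idea concrete, here is a minimal sketch of event-driven computation: a leaky integrate-and-fire (LIF) neuron is updated only when a spike event arrives, so idle neurons consume no compute. All names and parameter values are illustrative assumptions, not taken from the review.

```python
import heapq
import math

TAU = 20.0       # membrane time constant (ms) -- illustrative value
THRESHOLD = 1.0  # firing threshold -- illustrative value

class LIFNeuron:
    def __init__(self):
        self.v = 0.0            # membrane potential
        self.last_update = 0.0  # time of the last event seen

    def receive(self, t, weight):
        # Apply the decay for the whole interval since the last event,
        # then integrate the incoming spike; nothing runs in between.
        self.v *= math.exp(-(t - self.last_update) / TAU)
        self.last_update = t
        self.v += weight
        if self.v >= THRESHOLD:
            self.v = 0.0        # reset after firing
            return True         # emit an output spike event
        return False

# Event queue of (time_ms, target_neuron, synaptic_weight).
events = [(1.0, 0, 0.6), (2.0, 0, 0.6), (30.0, 0, 0.6)]
heapq.heapify(events)
neurons = [LIFNeuron()]

while events:
    t, idx, w = heapq.heappop(events)
    if neurons[idx].receive(t, w):
        print(f"neuron {idx} spiked at t={t} ms")
```

In contrast to a clocked simulation, which would advance every neuron at every timestep, work here is triggered only by the events themselves.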

“At SpiNNcloud Systems, we see brain-inspired computing at a large scale not just as a technological breakthrough, but as our mission to fundamentally transform artificial intelligence. Our hardware-software co-design approach represents more than an incremental improvement as is industry-standard today – it is a paradigm shift that redefines how we think about artificial intelligence. We strongly believe there is not a better time to pursue brain-inspired computing, and we are happy to be among the leaders of this revolution to achieve the next ‘AlexNet’ moment,” said Hector Gonzalez, co-CEO of SpiNNcloud Systems, which is building a neuromorphic supercomputer in Germany.

Applications for neuromorphic computing include scientific computing, artificial intelligence, augmented and virtual reality, wearables, smart farming, smart cities and more. Neuromorphic chips have the potential to outpace traditional computers in energy and space efficiency, as well as performance. This could present substantial advantages across various domains, including AI, health care and robotics. As the electricity consumption of AI is projected to double by 2026, neuromorphic computing emerges as a promising solution.

“Neuromorphic computing is particularly relevant today, when we are witnessing the untenable scaling of power- and resource-hungry AI systems,” said Gert Cauwenberghs, a Distinguished Professor in the UC San Diego Shu Chien-Gene Lay Department of Bioengineering and one of the paper’s coauthors.

The THOR neuromorphic commons network in the US provides access to open neuromorphic computing hardware and tools in support of interdisciplinary and collaborative research.

To achieve scale in neuromorphic computing, the authors propose several key features that must be optimized, including sparsity. If successfully emulated, sparsity could enable neuromorphic systems that are significantly more energy-efficient and compact; it has been a key focus of Furber’s work with Mayr on the European SpiNNaker neuromorphic computing projects.
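A back-of-envelope sketch (our example, not the paper’s) shows why activity sparsity pays off: synaptic work is done only for neurons that actually spiked, so cost scales with the spike count rather than with the full network size. The sizes and 2% activity level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000
weights = rng.normal(size=(n, n)).astype(np.float32)

spikes = rng.random(n) < 0.02            # ~2% of neurons fire this step
spiking = np.flatnonzero(spikes)

# Dense update: every one of the n*n synapses is touched.
dense_input = weights @ spikes.astype(np.float32)

# Event-driven update: only the columns of spiking neurons are touched,
# roughly 2% of the dense work at this activity level.
sparse_input = weights[:, spiking].sum(axis=1)

assert np.allclose(dense_input, sparse_input, atol=1e-3)
print(f"{spiking.size} spikes -> {spiking.size / n:.1%} of the dense synaptic work")
```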

“The expandable scalability and superior efficiency derive from massive parallelism and hierarchical structure in neural representation, combining dense local synaptic connectivity within neurosynaptic cores modeled after the brain’s gray matter with sparse global connectivity in neural communication across cores modeling the brain’s white matter, facilitated through high-bandwidth reconfigurable interconnects on-chip and hierarchically structured interconnects across chips,” said Cauwenberghs.
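As a hypothetical data-structure sketch of the gray/white-matter split Cauwenberghs describes: dense all-to-all synapses inside each neurosynaptic core, and a small sparse routing table for the few spikes that cross cores. The core counts, sizes and link tables below are invented for illustration; no real chip’s layout is implied.

```python
import numpy as np

rng = np.random.default_rng(1)
CORES, NEURONS_PER_CORE = 4, 256

# "Gray matter": one dense weight matrix per core (all-to-all locally).
local_weights = [
    rng.normal(scale=0.1, size=(NEURONS_PER_CORE, NEURONS_PER_CORE))
    for _ in range(CORES)
]

# "White matter": sparse global links, (src_core, src_neuron) -> targets.
global_links = {
    (0, 5): [(2, 17)],
    (1, 100): [(3, 42), (0, 9)],
}

def route_spike(core, neuron):
    """Deliver one spike: dense fan-out locally, sparse fan-out globally."""
    local_currents = local_weights[core][neuron]           # hits all local neurons
    remote_targets = global_links.get((core, neuron), [])  # usually empty
    return local_currents, remote_targets

currents, remotes = route_spike(0, 5)
print(f"local fan-out: {currents.size} synapses, remote targets: {remotes}")
```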

The review is published in Nature (paywall). The co-authors include:

  • Dhireesha Kudithipudi and Tej Pandit, University of Texas, San Antonio
  • Catherine Schuman, University of Tennessee, Knoxville
  • Craig M. Vineyard, James B. Aimone and Suma George Cardwell, Sandia National Laboratories
  • Cory Merkel, Rochester Institute of Technology
  • Rajkumar Kubendran, University of Pittsburgh
  • Garrick Orchard and Ryad Benosman, Intel Labs
  • Christian Mayr, Technische Universität Dresden
  • Joe Hays, U.S. Naval Research Laboratory
  • Cliff Young, Google DeepMind
  • Chiara Bartolozzi, Italian Institute of Technology
  • Amitava Majumdar and Gert Cauwenberghs, University of California San Diego
  • Sonia Buckley, National Institute of Standards and Technology
  • Shruti Kulkarni, Oak Ridge National Laboratory
  • Chetan Singh Thakur, Indian Institute of Science, Bengaluru  
  • Anand Subramoney, Royal Holloway, University of London, Egham
  • Steve Furber, The University of Manchester
