Publication date: 15th April 2025
New hardware technologies are necessary to keep pace with the growing complexity and energy demands of artificial intelligence algorithms. Brain-inspired computing is very promising, since the brain is capable of continuously learning from new data while consuming only a small fraction of the energy used by existing artificial intelligence systems. One reason may be that current artificial intelligence systems have limited biomimicry, as exemplified by deep neural networks with simplistic neuronal and synaptic dynamics. Their training is usually executed on microprocessors, graphics processing units, or tensor processing units built on existing digital hardware technologies with poor energy efficiency for these tasks. By comparison, emerging analog hardware technologies, such as memristors, show promise for dense, energy-efficient systems given their ultra-scalable footprint and lower energy-per-bit consumption. I will present our memristor synaptic devices, their integration with CMOS neuronal circuitry, and our modular mixed-signal prototyping platform, which was collaboratively designed for benchmarking memristive neural networks of up to 20,000 memristor devices. However, issues related to device non-idealities prevent practical adoption at scale. In contrast, work in neuroscience indicates that the brain can perform astounding computation with unreliable synapses and in the presence of input noise. The hippocampus is a core region of the brain with a major role in learning and memory, and it features a diversity of neuronal types with varied connectivity patterns. I will describe our efforts to draw inspiration from the hippocampus to design efficient and robust neuromorphic computing. I will summarize our interdisciplinary efforts across the innovation stack, from new types of materials and synaptic devices to new types of prototyping systems and hippocampus-inspired algorithms.
This work was supported in part by the Department of Energy Office of Science, under grant numbers DE-FOA-0003176, DE-SC00023000 (GWU) and DE-SC0022998 (GMU), by AFOSR YIP under grant FA9550-23-1-0173, by NSF CAREER under grant 2239951, by NIST under grant 70NANB22H018, by Western Digital under grant ECNS21932N and by the GW Cross-Disciplinary Research Fund.