Publication date: 15th December 2025
The rapid expansion of machine learning capabilities is driven by the exponentially increasing complexity of deep neural network (DNN) models, which demand hardware that is both energy‑ and chip‑area‑efficient to handle computationally intensive inference and training tasks. Electrochemical random‑access memories (ECRAMs) have emerged as a promising solution, specifically designed to enable efficient analog in‑memory computing for these data‑intensive workloads. In this talk, I will present a CMOS‑compatible ECRAM prototype fabricated with inorganic metal oxides. The device operates by shuttling protons within a symmetric gate stack composed of a zirconium oxide protonic electrolyte sandwiched between a hydrogenated tungsten oxide channel and gate. This architecture yields nearly perfectly symmetric programming characteristics with exceptionally low cycle‑to‑cycle variability (<1%) under voltage pulse operation. By optimizing the zirconium oxide stoichiometry, the prototype achieves fast operation with latency down to 100 nanoseconds, endurance exceeding 10⁸ cycles, robust retention of analog memristive states, and ultra‑low energy consumption (<1 femtojoule per weight update). These ECRAMs can be monolithically integrated on top of silicon electronics to form pseudo‑crossbar arrays. The test chips function as in‑memory computing processing elements to accelerate both inference and training of deep neural networks.
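The sketch below is a minimal, purely illustrative Python model of the in‑memory computing idea the abstract describes: each weight is stored as a pair of analog conductances programmed by symmetric voltage pulses with roughly 1% cycle‑to‑cycle variability, and a matrix‑vector multiply is performed directly on the conductance array. All parameter values and helper names are assumptions for illustration, not the device model or measurements presented in the talk.

```python
# Illustrative sketch only: an idealized ECRAM-like crossbar where each weight
# is a conductance programmed by symmetric voltage pulses with small
# cycle-to-cycle noise, and a matrix-vector multiply is carried out "in memory".
# All parameter values below are assumptions, not measured device data.

import numpy as np

rng = np.random.default_rng(0)

G_MIN, G_MAX = 0.0, 1.0   # normalized conductance range (assumed)
DELTA_G = 0.01            # conductance change per pulse (assumed)
C2C_SIGMA = 0.01          # ~1% cycle-to-cycle variability (from the abstract)


def apply_pulses(g, n_pulses):
    """Apply n_pulses symmetric updates to a conductance array g.

    Positive n_pulses potentiates, negative depresses; each pulse changes the
    conductance by the same step magnitude plus small noise, mimicking the
    near-symmetric programming characteristic described in the abstract.
    """
    for _ in range(abs(int(n_pulses))):
        step = np.sign(n_pulses) * DELTA_G * (1.0 + C2C_SIGMA * rng.standard_normal(g.shape))
        g = np.clip(g + step, G_MIN, G_MAX)
    return g


def crossbar_mvm(g_plus, g_minus, v_in):
    """In-memory matrix-vector multiply on a differential conductance pair.

    Each signed weight is encoded as w = g_plus - g_minus; applying the input
    vector as row voltages yields output currents I = (G+ - G-) @ v.
    """
    return (g_plus - g_minus) @ v_in


# Program a small 4x3 array toward random target weights, then read it out.
g_plus = np.full((4, 3), 0.5)
g_minus = np.full((4, 3), 0.5)
target = rng.uniform(-0.3, 0.3, size=(4, 3))

# Number of pulses per cell, routed to the + or - conductance of the pair.
pulses = np.round(target / DELTA_G).astype(int)
for i in range(4):
    for j in range(3):
        if pulses[i, j] >= 0:
            g_plus[i:i+1, j:j+1] = apply_pulses(g_plus[i:i+1, j:j+1], pulses[i, j])
        else:
            g_minus[i:i+1, j:j+1] = apply_pulses(g_minus[i:i+1, j:j+1], -pulses[i, j])

v = rng.uniform(-1.0, 1.0, size=3)
print("ideal    :", target @ v)
print("crossbar :", crossbar_mvm(g_plus, g_minus, v))
```

Because the assumed update step is symmetric and the noise is small, the read‑out product stays close to the ideal result; strongly asymmetric or noisy devices would not, which is why the symmetry and low variability highlighted in the abstract matter for training accelerators.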
This work is supported by the National Science Foundation (NSF) through grant FuSe-2329096.
