DeepH-Zero: Neural-network Density Functional Theory
Yang Li a
a Department of Physics, Tsinghua University, Beijing, 100084, China
Proceedings of MATSUS Fall 2025 Conference (MATSUSFall25)
D.13 Theory and Modelling for Next-Generation Energy Materials - #TMEM
València, Spain, 2025 October 20th - 24th
Organizer: Shuxia Tao
Invited Speaker, Yang Li, presentation 038
Publication date: 17th July 2025

Deep-learning ab initio calculation is an emerging interdisciplinary field that aims to greatly enhance the capability of ab initio methods by using state-of-the-art neural-network approaches. Among these developments, deep-learning density functional theory (DFT) stands out as a particularly transformative direction, showing great promise to significantly accelerate material discovery and potentially revolutionize materials research. However, current research in this field relies primarily on data-driven supervised learning, which isolates the development of neural networks from that of DFT and hinders further advancement of both. In this work, we present a theoretical framework of neural-network DFT, which unifies the optimization of neural networks with the variational computation of DFT, enabling physics-informed unsupervised learning. Moreover, we develop a differentiable DFT code that incorporates a deep-learning DFT Hamiltonian, and introduce automatic-differentiation and backpropagation algorithms into DFT, demonstrating the capability of neural-network DFT. The physics-informed neural-network architecture not only surpasses conventional approaches in accuracy and efficiency, but also offers a new paradigm for developing deep-learning DFT methods.
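
To make the idea concrete, below is a minimal, hypothetical sketch in JAX (not the actual DeepH-Zero code) of the workflow the abstract describes: a small neural network parameterizes an effective single-particle Hamiltonian for a toy system, a variational band-energy functional is evaluated from its occupied eigenvalues, and the network weights are updated by backpropagating through that functional rather than by fitting labeled DFT data. All sizes, function names, and the regularization term are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

N_ORB = 8   # size of the toy single-particle Hamiltonian (number of orbitals)
N_OCC = 4   # number of occupied states

def init_params(key, hidden=16):
    """Random weights for a tiny two-layer network that maps structural
    features to the independent entries of a symmetric Hamiltonian."""
    k1, k2 = jax.random.split(key)
    n_out = N_ORB * (N_ORB + 1) // 2          # number of upper-triangular entries
    return {
        "w1": 0.1 * jax.random.normal(k1, (N_ORB, hidden)),
        "w2": 0.1 * jax.random.normal(k2, (hidden, n_out)),
    }

def predict_hamiltonian(params, features):
    """Neural-network Hamiltonian H(theta): structural features -> symmetric matrix."""
    h = jnp.tanh(features @ params["w1"])      # hidden representation
    upper = (h @ params["w2"]).mean(axis=0)    # pool to one set of matrix entries
    H = jnp.zeros((N_ORB, N_ORB)).at[jnp.triu_indices(N_ORB)].set(upper)
    return H + H.T - jnp.diag(jnp.diag(H))     # symmetrize

def total_energy(params, features):
    """Variational band-energy functional: sum of occupied eigenvalues plus a
    crude regularizer that keeps this toy, unconstrained problem bounded."""
    H = predict_hamiltonian(params, features)
    eigvals = jnp.linalg.eigh(H)[0]            # differentiable eigensolver
    return jnp.sum(eigvals[:N_OCC]) + 0.05 * jnp.sum(H ** 2)

# Unsupervised, physics-informed training: minimize the energy functional directly;
# gradients flow through the eigensolver via automatic differentiation.
key = jax.random.PRNGKey(0)
features = jax.random.normal(key, (N_ORB, N_ORB))   # stand-in structural descriptor
params = init_params(key)
energy_and_grad = jax.jit(jax.value_and_grad(total_energy))

lr = 1e-2
for step in range(201):
    E, grads = energy_and_grad(params, features)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    if step % 50 == 0:
        print(f"step {step:3d}  E = {float(E):.6f}")
```

In the full method, the energy functional and the Hamiltonian representation come from DFT itself; the regularizer here merely keeps this stripped-down example well posed.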
