Publication date: 17th July 2025
Deep-learning ab initio calculation is an emerging interdisciplinary field that aims to greatly enhance the capability of ab initio methods by using state-of-the-art neural-network approaches. Among these developments, deep-learning density functional theory (DFT) stands out as a particularly transformative direction, showing great promise to significantly accelerate materials discovery and potentially revolutionize materials research. However, current research in this field relies primarily on data-driven supervised learning, which isolates the development of neural networks from that of DFT and hinders their further advancement. In this work, we present a theoretical framework of neural-network DFT, which unifies the optimization of neural networks with the variational computation of DFT, enabling physics-informed unsupervised learning. Moreover, we develop a differentiable DFT code incorporating the deep-learning DFT Hamiltonian, and we introduce automatic differentiation and backpropagation algorithms into DFT, demonstrating the capability of neural-network DFT. The physics-informed neural-network architecture not only surpasses conventional approaches in accuracy and efficiency, but also offers a new paradigm for developing deep-learning DFT methods.
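To make the paradigm concrete, the sketch below illustrates the general pattern described above: a neural network predicts a DFT-like Hamiltonian, an energy functional evaluated from that Hamiltonian serves as an unsupervised, physics-informed loss, and automatic differentiation backpropagates through the entire physics computation to optimize the network. This is a minimal illustration under strong simplifying assumptions, not the code developed in this work: a toy band energy stands in for the full variational DFT total-energy functional, and all function and parameter names (`predict_hamiltonian`, `toy_total_energy`, the network sizes) are hypothetical.

```python
# Minimal, illustrative sketch (not the authors' code) of neural-network DFT
# via variational energy minimization with automatic differentiation.
import jax
import jax.numpy as jnp

N_BASIS = 8   # size of the toy Hamiltonian matrix (hypothetical)
N_OCC = 4     # number of occupied states in the toy system (hypothetical)

def init_params(key, n_features=16):
    """Random weights for a tiny two-layer network (hypothetical architecture)."""
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (n_features, 64)) * 0.1,
        "w2": jax.random.normal(k2, (64, N_BASIS * N_BASIS)) * 0.1,
    }

def predict_hamiltonian(params, features):
    """Neural network mapping structural features to a symmetric Hamiltonian."""
    h = jnp.tanh(features @ params["w1"]) @ params["w2"]
    h = h.reshape(N_BASIS, N_BASIS)
    return 0.5 * (h + h.T)  # enforce Hermiticity

def toy_total_energy(hamiltonian):
    """Stand-in for the variational total-energy functional: here simply the
    sum of the lowest N_OCC eigenvalues (band energy). The real framework
    evaluates the full DFT total energy, which is bounded from below."""
    eigvals = jnp.linalg.eigvalsh(hamiltonian)
    return jnp.sum(eigvals[:N_OCC])

def loss(params, features):
    """Physics-informed, unsupervised loss: the total energy itself,
    requiring no reference Hamiltonians or labeled training data."""
    return toy_total_energy(predict_hamiltonian(params, features))

# Backpropagation through the energy functional and the network in one call.
energy_and_grad = jax.jit(jax.value_and_grad(loss))

params = init_params(jax.random.PRNGKey(0))
features = jax.random.normal(jax.random.PRNGKey(1), (16,))

learning_rate = 1e-2
for step in range(100):
    energy, grads = energy_and_grad(params, features)
    params = jax.tree_util.tree_map(lambda p, g: p - learning_rate * g,
                                    params, grads)
```

In this pattern, the network parameters are updated directly by gradients of the physical energy rather than by a supervised mismatch against precomputed DFT labels, which is the sense in which the optimization of the neural network and the variational computation of DFT are unified.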