DeepH-pack#
A general-purpose neural network package for deep-learning electronic structure calculations
DeepH-pack is the culmination of multi-generational research efforts by the DeepH team at Tsinghua University. This latest iteration unites all preceding DeepH methodologies in a single package, rewritten in JAX, and has reached maturity through rigorous long-term testing of every neural module.
At its core, DeepH-pack is a JAX-based implementation with static computational graphs and advanced algorithms, delivering strong performance in runtime, precision, and memory efficiency. Looking forward, the development roadmap targets seamless integration of multiple framework backends, evolving the package into an extensible computational ecosystem for quantum materials modeling while preserving its signature accuracy in Hamiltonian construction.
The platform aims to provide an expanded and more comprehensive toolkit for materials computation and predictive modeling, and community feedback is warmly welcomed. We have open-sourced the core data interfaces and standardization modules from DeepH calculations as the DeepH-dock project, which provides seamless interoperability with mainstream DFT software packages and integrates community-contributed enhancements at the architecture level.
Features#
Leverages JAX static computational graphs and novel neural algorithms to deliver high runtime efficiency, numerical precision, and memory efficiency for large-scale quantum materials simulations.
Built on years of cumulative research, with every neural module rigorously tested to ensure maturity and production-ready stability for scientific applications.
Designed as a foundation for quantum materials modeling, with a roadmap for multi-framework backend integration and cross-platform compatibility while maintaining its signature accuracy in Hamiltonian construction.
Interfaces with diverse ab initio materials computation software, removing technical barriers and establishing standardized workflows that accelerate research in computational materials science.
Installation#
Before installing DeepH-pack, ensure that uv, a fast and versatile Python package manager, is installed and configured, and that a Python 3.13 environment is available. If you plan to run DeepH in a GPU-accelerated environment, you must also install CUDA 12.8 or 12.9 beforehand.
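As a minimal sketch of this setup (assuming uv is already on your PATH; the environment directory name .venv is an arbitrary choice):

uv venv .venv --python 3.13   # create a Python 3.13 virtual environment; uv fetches the interpreter if needed
source .venv/bin/activate     # activate it (Linux/macOS shells)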
uv pip install "./deepx-1.0.6+light-py3-none-any.whl[gpu]" --extra-index-url https://download.pytorch.org/whl/cpu
(The quotes keep shells such as zsh from interpreting the [gpu] extra as a glob pattern.)
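To check that the GPU backend is usable after installation, one quick sanity test (a sketch, assuming the JAX core described above is what got installed) is to ask JAX which devices it can see:

python -c "import jax; print(jax.devices())"

On a correctly configured CUDA 12.8/12.9 machine this should list CUDA devices; if only a CPU device is printed, JAX is falling back to CPU execution.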
For detailed, step-by-step instructions, please refer to Installation & Setup.
Basic usage#
For command-line usage:
deeph-train train.toml
deeph-infer infer.toml
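Both commands take a single TOML configuration file as their only argument. A typical end-to-end session is sketched below (the comments reflect the general workflow; the precise role of each file is governed by the parameters documented in Configuration Options):

deeph-train train.toml   # fit a model with the dataset and hyperparameters declared in train.toml
deeph-infer infer.toml   # apply the trained model to the target structures declared in infer.toml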
For comprehensive information beyond basic usage, refer to the following key sections of the documentation:
Core Workflows: Details the essential computational steps of DeepH.
Configuration Options: Explains all available parameters in the user input (TOML) files.
Universal Material Model: Describes the usage of generalized pre-trained models.
Examples: Contains various practical training and inference examples.
Citation#
If you use DeepH-pack in your work, please cite the following publications.
The original framework paper, which introduced the foundational methodology:

@article{li2022deep,
  title={Deep-learning density functional theory Hamiltonian for efficient ab initio electronic-structure calculation},
  author={Li, He and Wang, Zun and Zou, Nianlong and Ye, Meng and Xu, Runzhang and Gong, Xiaoxun and Duan, Wenhui and Xu, Yong},
  journal={Nat. Comput. Sci.},
  volume={2},
  number={6},
  pages={367--377},
  year={2022},
  publisher={Nature Publishing Group US New York}
}

The complete package paper, featuring the latest implementation, methodology, and workflow:

@article{li2026deeph,
  title={DeepH-pack: A general-purpose neural network package for deep-learning electronic structure calculations},
  author={Li, Yang and Wang, Yanzhen and Zhao, Boheng and Gong, Xiaoxun and Wang, Yuxiang and Tang, Zechen and Wang, Zixu and Yuan, Zilong and Li, Jialin and Sun, Minghui and Chen, Zezhou and Tao, Honggeng and Wu, Baochun and Yu, Yuhang and Li, He and da Jornada, Felipe H. and Duan, Wenhui and Xu, Yong},
  journal={arXiv preprint arXiv:2601.02938},
  year={2026}
}