GPUTB: Efficient Machine Learning Tight-Binding Method for Large-Scale Electronic Properties Calculations (2509.06525v1)
Abstract: The high computational cost of ab-initio methods limits their application in predicting electronic properties at the device scale. An efficient method is therefore needed to map atomic structure to electronic structure quickly. Here, we develop GPUTB, a GPU-accelerated machine-learning tight-binding (TB) framework. GPUTB employs atomic environment descriptors, enabling the model parameters to incorporate environmental dependence. This allows the model to transfer easily to different basis sets, exchange-correlation functionals, and allotropes. Combining GPUTB with the linear-scaling quantum transport method, we have calculated the electronic density of states of pristine graphene systems containing up to 100 million atoms. Trained on finite-temperature structures, the model extends readily to finite-temperature systems with millions of atoms. Furthermore, GPUTB successfully describes h-BN/graphene heterojunction systems, demonstrating its ability to handle complex materials with high precision. To verify the framework's accuracy, we reproduce the relationship between carrier concentration and room-temperature mobility in graphene. GPUTB thus strikes a balance between computational accuracy and efficiency, providing a powerful computational tool for investigating the electronic properties of systems with millions of atoms.
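The abstract's pairing of a learned TB Hamiltonian with a linear-scaling quantum transport method can be illustrated with a stochastic Chebyshev (kernel polynomial) estimate of the density of states, whose cost grows linearly with the number of atoms because it only needs sparse matrix-vector products. The sketch below is not the GPUTB implementation: the nearest-neighbour graphene flake, the hopping value t = -2.7 eV, the random-phase trace estimator, and the Jackson kernel are all illustrative assumptions chosen to make the example self-contained.

```python
# Minimal sketch (not the authors' GPUTB code): a linear-scaling, stochastic
# Chebyshev (kernel polynomial) estimate of the tight-binding density of states.
# The graphene flake, hopping value, and all numerical parameters are assumptions.
import numpy as np
import scipy.sparse as sp


def graphene_flake_hamiltonian(nx=60, ny=60, t=-2.7):
    """Nearest-neighbour TB Hamiltonian of a rhombic graphene patch with open
    boundaries (2 atoms per cell); t is the hopping in eV (assumed value)."""
    idx = lambda i, j, s: 2 * (i * ny + j) + s   # (cell, sublattice) -> site index
    rows, cols = [], []
    for i in range(nx):
        for j in range(ny):
            a = idx(i, j, 0)                     # A site of cell (i, j)
            # its three B neighbours live in cells (i, j), (i-1, j), (i, j-1)
            for di, dj in [(0, 0), (-1, 0), (0, -1)]:
                ii, jj = i + di, j + dj
                if 0 <= ii < nx and 0 <= jj < ny:
                    b = idx(ii, jj, 1)
                    rows += [a, b]
                    cols += [b, a]
    data = t * np.ones(len(rows))
    n = 2 * nx * ny
    return sp.csr_matrix((data, (rows, cols)), shape=(n, n))


def kpm_dos(H, half_width, n_moments=400, n_random=8, n_energies=500, seed=0):
    """Stochastic kernel-polynomial DOS estimate.  Cost is
    O(n_moments * n_random * nnz(H)), i.e. linear in the number of atoms.
    `half_width` must bound the spectral radius of H (3|t| for this model)."""
    n = H.shape[0]
    a = 1.01 * half_width                        # safety margin for rescaling
    Ht = H / a                                   # spectrum mapped into (-1, 1)
    rng = np.random.default_rng(seed)
    mu = np.zeros(n_moments)
    for _ in range(n_random):
        # random-phase vector: E[v v^dagger] = I / n gives a stochastic trace
        v0 = np.exp(2j * np.pi * rng.random(n)) / np.sqrt(n)
        v_prev, v = v0, Ht @ v0
        mu[0] += np.vdot(v0, v0).real
        mu[1] += np.vdot(v0, v).real
        for m in range(2, n_moments):
            v_prev, v = v, 2.0 * (Ht @ v) - v_prev   # Chebyshev recurrence
            mu[m] += np.vdot(v0, v).real
    mu /= n_random
    # Jackson kernel damping suppresses Gibbs oscillations
    m = np.arange(n_moments)
    g = ((n_moments - m + 1) * np.cos(np.pi * m / (n_moments + 1))
         + np.sin(np.pi * m / (n_moments + 1)) / np.tan(np.pi / (n_moments + 1))
         ) / (n_moments + 1)
    x = np.linspace(-0.99, 0.99, n_energies)
    T = np.cos(np.outer(m, np.arccos(x)))        # Chebyshev polynomials T_m(x)
    rho = g[0] * mu[0] * T[0] + 2.0 * (g[1:, None] * mu[1:, None] * T[1:]).sum(axis=0)
    rho /= np.pi * np.sqrt(1.0 - x ** 2)
    return a * x, rho / a                        # energies (eV), DOS per site per eV


if __name__ == "__main__":
    H = graphene_flake_hamiltonian(nx=60, ny=60, t=-2.7)   # 7,200-atom flake
    energies, dos = kpm_dos(H, half_width=3 * 2.7)
    print(energies[np.argmax(dos)])              # DOS peak near a van Hove point, ~|t|
```

The key design point is that only sparse matrix-vector products appear in the recurrence, so the same loop maps directly onto GPU sparse kernels; this is presumably what allows the reported 100-million-atom DOS calculations to remain tractable, although the actual GPUTB data structures and kernels are not shown here.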