Unveiling the critical factors in crystal structure graph representation: a comparative analysis using streamlined MLPSets frameworks (2509.05712v1)
Abstract: Graph Neural Networks have rapidly advanced in materials science and chemistry, with their performance critically dependent on comprehensive representations of crystal or molecular structures across five dimensions: elemental information, geometric topology, electronic interactions, symmetry, and long-range interactions. Existing models still exhibit limitations in representing electronic interactions, symmetry, and long-range information. This study compares physics-based site feature calculators with data-driven graph representation strategies. We find that the latter achieve superior performance in representation completeness, convergence speed, and extrapolation capability by incorporating electronic-structure generation models, such as variational autoencoders (VAEs) that compress Kohn-Sham wave functions, and by leveraging multi-task learning. Notably, the CHGNet-V1/V2 strategies, when integrated into the DenseGNN model, significantly outperform state-of-the-art models across 35 datasets from Matbench and JARVIS-DFT, yielding predictions with accuracy close to that of DFT calculations. Furthermore, applying a pre-training and fine-tuning strategy substantially reduces the prediction error for band gaps of complex disordered materials, demonstrating the superiority and potential of data-driven graph representations in accelerating materials discovery.
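The abstract's central technical claim is that data-driven site features, for example latent codes from a VAE trained to compress Kohn-Sham electronic structure, can replace physics-based site feature calculators as node inputs to a crystal graph model. As a rough illustration of that idea only (not the paper's actual architecture), the sketch below assumes per-site electronic-structure descriptors are available as fixed-length vectors and shows how a small VAE could compress them into latent node features; the class name `SiteFeatureVAE`, the dimensions, and the loss weighting are hypothetical choices, not taken from the paper.

```python
# Minimal sketch (assumptions, not the paper's model): compress a per-site
# electronic-structure descriptor (e.g. a projected DOS or Kohn-Sham
# coefficient vector) into a compact latent vector that could serve as a
# node feature for a crystal graph network such as DenseGNN.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SiteFeatureVAE(nn.Module):
    def __init__(self, in_dim=256, hidden=128, latent=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.SiLU())
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        self.decoder = nn.Sequential(
            nn.Linear(latent, hidden), nn.SiLU(), nn.Linear(hidden, in_dim)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: sample a latent code per site.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar

    def loss(self, x, beta=1e-3):
        recon, mu, logvar = self(x)
        rec = F.mse_loss(recon, x)                                   # reconstruction term
        kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL regularizer
        return rec + beta * kld


if __name__ == "__main__":
    vae = SiteFeatureVAE()
    sites = torch.randn(64, 256)        # 64 sites with 256-dim descriptors (dummy data)
    print(vae.loss(sites).item())       # after training, the encoder mean mu would be
                                        # used as the data-driven site (node) feature
```

In this framing, the encoder output replaces hand-built physics-based site descriptors as the node embedding, which is the comparison the study draws.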