Accelerating superconductor discovery through tempered deep learning of the electron-phonon spectral function (2401.16611v1)
Abstract: Integrating deep learning into the search for new electron-phonon superconductors represents a burgeoning field of research, where the primary challenge lies in the computational intensity of calculating the electron-phonon spectral function, $\alpha^2F(\omega)$, the essential ingredient of Migdal-Eliashberg theory of superconductivity. To overcome this challenge, we adopt a two-step approach. First, we compute $\alpha^2F(\omega)$ for 818 dynamically stable materials. We then train a deep-learning model to predict $\alpha^2F(\omega)$, using an unconventional training strategy to temper the model's overfitting and enhance predictions. Specifically, we train a Bootstrapped Ensemble of Tempered Equivariant graph neural NETworks (BETE-NET), obtaining an MAE of 0.21, 45 K, and 43 K for the Eliashberg moments derived from $\alpha^2F(\omega)$: $\lambda$, $\omega_{\log}$, and $\omega_{2}$, respectively, yielding an MAE of 2.5 K for the critical temperature, $T_c$. Further, we incorporate domain knowledge of the site-projected phonon density of states to introduce an inductive bias into the model's node attributes and further improve predictions. This methodological innovation decreases the MAEs to 0.18, 29 K, and 28 K, respectively, yielding an MAE of 2.1 K for $T_c$. We illustrate the practical application of our model in high-throughput screening for high-$T_c$ materials. The model demonstrates an average precision nearly five times higher than random screening, highlighting the potential of ML in accelerating superconductor discovery. BETE-NET accelerates the search for high-$T_c$ superconductors while setting a precedent for applying ML in materials discovery, particularly when data is limited.
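The Eliashberg moments named in the abstract are standard frequency integrals of $\alpha^2F(\omega)$, and $T_c$ then follows from $\lambda$ and $\omega_{\log}$ via the simplified Allen-Dynes (McMillan-type) formula. The sketch below illustrates these textbook relations in plain NumPy; the frequency grid, the toy Gaussian spectral function, and the choice $\mu^* = 0.1$ are illustrative assumptions, not the paper's data or code.

```python
import numpy as np

def _integrate(f, omega):
    # Trapezoidal rule on an arbitrary (possibly non-uniform) grid.
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(omega)))

def eliashberg_moments(omega, a2f):
    """lambda, omega_log, omega_2 from a sampled alpha^2 F(omega); requires omega > 0."""
    lam = 2.0 * _integrate(a2f / omega, omega)
    omega_log = np.exp((2.0 / lam) * _integrate(a2f * np.log(omega) / omega, omega))
    omega_2 = np.sqrt((2.0 / lam) * _integrate(a2f * omega, omega))
    return lam, omega_log, omega_2

def allen_dynes_tc(lam, omega_log, mu_star=0.1):
    """Simplified Allen-Dynes estimate of Tc, in the same units as omega_log."""
    return (omega_log / 1.2) * np.exp(
        -1.04 * (1.0 + lam) / (lam - mu_star * (1.0 + 0.62 * lam))
    )

# Toy Einstein-like spectrum: narrow Gaussian peak at omega0 = 50 (arbitrary units),
# scaled so that lambda ~ 1 (the formula needs lambda > mu*(1 + 0.62*lambda)).
omega = np.linspace(1.0, 100.0, 2000)
a2f = 10.0 * np.exp(-0.5 * ((omega - 50.0) / 1.0) ** 2)
lam, w_log, w2 = eliashberg_moments(omega, a2f)
tc = allen_dynes_tc(lam, w_log)
# For a narrow peak, omega_log and omega_2 both collapse onto the peak frequency.
```

For a single sharp phonon peak the two frequency moments coincide, which is why the paper can report MAEs for $\omega_{\log}$ and $\omega_{2}$ on the same kelvin scale.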