Density of States Prediction of Crystalline Materials via Prompt-guided Multi-Modal Transformer (2311.12856v2)

Published 24 Oct 2023 in cond-mat.mtrl-sci, cs.AI, and cs.LG

Abstract: The density of states (DOS) is a spectral property of crystalline materials that provides fundamental insights into their various characteristics. While previous works mainly focus on obtaining high-quality representations of crystalline materials for DOS prediction, we focus on predicting the DOS from the obtained representations by reflecting the nature of DOS: DOS determines the general distribution of states as a function of energy. That is, DOS is not solely determined by the crystalline material but also by the energy levels, which has been neglected in previous works. In this paper, we propose DOSTransformer, which integrates heterogeneous information obtained from the crystalline materials and the energies via a multi-modal transformer, thereby modeling the complex relationships between the atoms in the crystalline materials and the various energy levels for DOS prediction. Moreover, we utilize prompts to guide the model to learn the crystal-system-specific interactions between crystalline materials and energies. Extensive experiments on two types of DOS, i.e., phonon DOS and electron DOS, under various real-world scenarios demonstrate the superiority of DOSTransformer. The source code is available at https://github.com/HeewoongNoh/DOSTransformer.
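The core idea of the abstract — energy-level embeddings attending over a crystal's atom embeddings, with a learnable prompt token prepended to encode the crystal system — can be sketched with a single cross-attention step. The following is a minimal, dependency-free illustration, not the authors' implementation; all names, dimensions, and toy values are assumptions made for the example.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def cross_attention(queries, keys, values):
    """Each query (an energy embedding) attends over the keys
    (atom embeddings plus a prompt token); returns one attended
    context vector per query."""
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, values))
                    for j in range(len(values[0]))])
    return out

# Hypothetical toy inputs: 3 atom embeddings (dim 2), one learnable
# prompt token for a crystal system (e.g. "cubic"), 4 energy levels.
atom_emb = [[0.2, 0.1], [0.0, 0.3], [0.5, -0.1]]
prompt = [[1.0, 0.0]]            # crystal-system prompt, prepended
tokens = prompt + atom_emb       # material tokens seen by attention
energy_emb = [[0.1 * e, 0.05 * e] for e in range(4)]

ctx = cross_attention(energy_emb, tokens, tokens)
# A trained linear head would map each context vector to a scalar DOS
# value; summing components here is just a stand-in for that head.
dos = [sum(c) for c in ctx]
```

The key point the sketch captures is that the prediction at each energy level is a distinct query, so the model output varies with energy even for a fixed material, rather than regressing the whole DOS curve from a single material embedding.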
