
Improving neural network predictions of material properties with limited data using transfer learning (2006.16420v1)

Published 29 Jun 2020 in physics.comp-ph, cs.LG, and physics.chem-ph

Abstract: We develop new transfer learning algorithms to accelerate prediction of material properties from ab initio simulations based on density functional theory (DFT). Transfer learning has been successfully utilized for data-efficient modeling in applications other than materials science, and it allows transferable representations learned from large datasets to be repurposed for learning new tasks even with small datasets. In the context of materials science, this opens the possibility to develop generalizable neural network models that can be repurposed on other materials, without the need of generating a large (computationally expensive) training set of materials properties. The proposed transfer learning algorithms are demonstrated on predicting the Gibbs free energy of light transition metal oxides.
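The core idea described in the abstract — reusing a representation learned on a large source dataset and retraining only a small part of the model on a data-scarce target task — can be sketched in a few lines. The following toy example is illustrative only (it is not the authors' algorithm, and the data and layer sizes are invented): a frozen hidden layer stands in for a pretrained network, and only a linear output head is fit on the small target dataset.

```python
import numpy as np

# Toy sketch of transfer learning: freeze a "pretrained" representation,
# fit only a small output head on the data-scarce target task.
# All weights and data below are illustrative, not from the paper.
rng = np.random.default_rng(0)

def pretrained_features(x, W, b):
    """Frozen hidden layer standing in for a network pretrained on a
    large source dataset (W and b are not retrained)."""
    return np.tanh(x @ W + b)

# "Pretrained" weights (in practice, learned on the large source dataset).
W = rng.normal(size=(1, 16))
b = rng.normal(size=16)

# Small target dataset (e.g., a handful of expensive DFT-computed values).
x_train = np.linspace(-1.0, 1.0, 10).reshape(-1, 1)
y_train = np.sin(2.0 * x_train).ravel()

# Transfer step: keep the representation fixed and fit only a linear
# head on the few target samples (ridge-regularized least squares).
H = pretrained_features(x_train, W, b)
lam = 1e-3
head = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ y_train)

def predict(x):
    """Predict target property using frozen features + fitted head."""
    return pretrained_features(x, W, b) @ head

train_error = float(np.max(np.abs(predict(x_train) - y_train)))
print(f"max training error: {train_error:.3f}")
```

Because only the head is trained, the number of free parameters stays small, which is what makes learning from a handful of expensive ab initio data points feasible.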

Authors (2)
  1. Schuyler Krawczuk
  2. Daniele Venturi
Citations (2)
