Thermodynamic Consistent Neural Networks for Learning Material Interfacial Mechanics (2011.14172v1)

Published 28 Nov 2020 in cs.CE and cs.LG

Abstract: For multilayer materials in thin substrate systems, interfacial failure is one of the most significant challenges. The traction-separation relations (TSR) quantitatively describe the mechanical behavior of a material interface undergoing opening, which is critical to understanding and predicting interfacial failures under complex loadings. However, existing theoretical models lack the complexity and flexibility needed to learn real-world TSR from experimental observations. A neural network can fit the loading paths well but often fails to obey the laws of physics, due to a lack of experimental data and of understanding of the hidden physical mechanism. In this paper, we propose a thermodynamic consistent neural network (TCNN) approach to build a data-driven model of the TSR from sparse experimental data. The TCNN leverages recent advances in physics-informed neural networks (PINN), which encode prior physical information into the loss function and train the neural networks efficiently using automatic differentiation. We investigate three thermodynamic consistency principles, i.e., positive energy dissipation, steepest energy dissipation gradient, and energy-conservative loading path. All of them are mathematically formulated and embedded into a neural network model with a newly defined loss function. A real-world experiment demonstrates the superior performance of TCNN: it provides an accurate prediction of the whole TSR surface and significantly reduces predictions that violate the laws of physics.
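
The sketch below illustrates, in broad strokes, the kind of physics-informed training the abstract describes: a small network maps interface separations to a scalar interfacial energy, tractions are recovered by automatic differentiation, and a penalty term stands in for the positive-energy-dissipation principle. It is not the authors' implementation; the names (`TSRNet`, `tcnn_style_loss`), the weight `w_phys`, and the incremental-work proxy for dissipation are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# Minimal sketch (not the paper's code): a network mapping interface openings
# (normal and tangential separation) to a scalar interfacial energy.
class TSRNet(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),  # predicted interfacial energy
        )

    def forward(self, separation):
        return self.net(separation)


def tcnn_style_loss(model, separation, traction_obs, w_phys=1.0):
    """Data misfit plus an illustrative thermodynamic-consistency penalty.

    Rows of `separation` are assumed to be ordered along a single loading
    path; the physics term penalizes negative incremental work, a crude
    proxy for the positive-dissipation principle named in the abstract.
    """
    separation = separation.clone().requires_grad_(True)
    energy = model(separation)
    # Tractions as gradients of the learned energy w.r.t. the separations
    # (automatic differentiation, as in physics-informed training).
    traction_pred = torch.autograd.grad(
        energy.sum(), separation, create_graph=True)[0]
    data_loss = ((traction_pred - traction_obs) ** 2).mean()
    # Incremental work along the loading path; negative increments are penalized.
    d_sep = separation[1:] - separation[:-1]
    work_inc = (traction_pred[:-1] * d_sep).sum(dim=1)
    phys_loss = torch.relu(-work_inc).mean()
    return data_loss + w_phys * phys_loss
```

In this hedged reading, the other two principles (steepest energy dissipation gradient and energy-conservative loading path) would be added as further penalty terms in the same composite loss, each weighted against the data misfit.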

Citations (7)
