
Deep energy method in topology optimization applications (2207.03072v1)

Published 7 Jul 2022 in cs.CE

Abstract: This paper explores the possibilities of applying physics-informed neural networks (PINNs) to topology optimization (TO) by introducing a fully self-supervised TO framework based on PINNs. The framework solves the forward elasticity problem with the deep energy method (DEM). Instead of training a separate neural network to update the density distribution, we leverage the fact that the compliance minimization problem is self-adjoint to express the element sensitivity directly in terms of the displacement field from the DEM model, so no additional neural network is needed for the inverse problem. The method of moving asymptotes is used as the optimizer for updating the density distribution. The implementation of Neumann, Dirichlet, and periodic boundary conditions is described in the context of the DEM model. Three numerical examples demonstrate the framework's capabilities: (1) compliance minimization in 2D under different geometries and loadings, (2) compliance minimization in 3D, and (3) maximization of the homogenized shear modulus to design 2D metamaterial unit cells. The results show that the optimized designs from the DEM-based framework are closely comparable to those generated by the finite element method, and they shed light on a new way of integrating PINN-based simulation methods into classical computational mechanics problems.
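The optimization loop the abstract describes, a forward solve followed by a self-adjoint sensitivity evaluation and a density update, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the DEM neural-network forward solve is replaced by a closed-form 1D bar of SIMP-penalized springs in series, and the method of moving asymptotes is replaced by a simpler optimality-criteria (OC) update; all function names are illustrative.

```python
# Illustrative compliance-minimization loop (NOT the paper's DEM code).
# Forward problem: 1D bar of springs in series under end load F, with
# SIMP stiffness k_e = rho_e^p * k0. In the paper, this solve is done
# by a deep-energy-method network minimizing potential energy.

def forward_solve(rho, k0=1.0, F=1.0, p=3):
    """Closed-form displacement jump across each spring in series."""
    return [F / (k0 * r**p) for r in rho]

def compliance_and_sensitivity(rho, k0=1.0, F=1.0, p=3):
    """Compliance c = F * u_end and its self-adjoint sensitivity.

    Because compliance minimization is self-adjoint, the element
    sensitivity dc/drho_e = -p * rho_e^(p-1) * k0 * u_e^2 needs only
    the displacement field from the forward solve (no adjoint solve).
    """
    u = forward_solve(rho, k0, F, p)
    c = sum(F * ue for ue in u)
    dc = [-p * r**(p - 1) * k0 * ue**2 for r, ue in zip(rho, u)]
    return c, dc

def oc_update(rho, dc, volfrac, move=0.2):
    """Optimality-criteria update (stand-in for MMA): bisect on the
    Lagrange multiplier of the volume constraint."""
    lo, hi = 1e-9, 1e9
    n = len(rho)
    new = rho
    while (hi - lo) / (hi + lo) > 1e-6:
        lam = 0.5 * (lo + hi)
        # candidate rho_e = rho_e * sqrt(-dc_e / lam), move-limited
        new = [min(1.0, max(1e-3,
                            min(r + move, max(r - move,
                                              r * (-d / lam) ** 0.5))))
               for r, d in zip(rho, dc)]
        if sum(new) / n > volfrac:
            lo = lam  # volume too high -> raise multiplier
        else:
            hi = lam
    return new

def topopt_1d(n=8, volfrac=0.5, iters=60):
    """Forward solve -> sensitivity -> density update, iterated."""
    rho = [volfrac] * n
    for _ in range(iters):
        _, dc = compliance_and_sensitivity(rho)
        rho = oc_update(rho, dc, volfrac)
    c, _ = compliance_and_sensitivity(rho)
    return rho, c
```

In this symmetric 1D toy problem the optimum is simply the uniform density at the volume fraction, so the loop mainly illustrates the structure the paper describes: the density update consumes only the displacement field of the forward solve.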

Authors (6)
  1. Junyan He (15 papers)
  2. Shashank Kushwaha (9 papers)
  3. Charul Chadha (1 paper)
  4. Seid Koric (25 papers)
  5. Diab Abueidda (20 papers)
  6. Iwona Jasiuk (13 papers)
Citations (27)
