
Compositional Generative Inverse Design (2401.13171v2)

Published 24 Jan 2024 in cs.LG, cs.AI, and cs.CE

Abstract: Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem that arises across fields ranging from mechanical engineering to aerospace engineering. Inverse design is typically formulated as an optimization problem, with recent works leveraging optimization across learned dynamics models. However, as models are optimized they tend to fall into adversarial modes, preventing effective sampling. We illustrate that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples and significantly improve design performance. We further illustrate how such a design system is compositional, enabling us to combine multiple different diffusion models representing subcomponents of our desired system to design systems with every specified component. In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes that are more complex than those in the training data. Our method generalizes to more objects on the N-body dataset and discovers formation flying to minimize drag in the multi-airfoil design task. Project website and code can be found at https://github.com/AI4Science-WestlakeU/cindm.
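The core idea in the abstract can be illustrated with a minimal, dependency-light sketch: each learned diffusion model is treated as an energy function over designs, composition sums the energies of the subcomponent models (equivalently, multiplies their densities), and the design objective is added as an extra guidance term during annealed Langevin-style sampling. The energy functions, the `lam` weight, and the annealing schedule below are all hypothetical stand-ins for illustration, not the authors' actual models or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic "energies" standing in for two learned diffusion models of
# design subcomponents. E_i(z) scores the physical plausibility of a design z.
def energy_a(z):          # hypothetical subcomponent model A
    return 0.5 * np.sum((z - 1.0) ** 2)

def energy_b(z):          # hypothetical subcomponent model B
    return 0.5 * np.sum((z + 1.0) ** 2)

def objective(z):         # hypothetical design objective (e.g. a drag proxy)
    return 0.5 * np.sum(z ** 2)

def grad(f, z, eps=1e-5):
    """Central finite-difference gradient, to keep the sketch self-contained."""
    g = np.zeros_like(z)
    for i in range(z.size):
        d = np.zeros_like(z)
        d[i] = eps
        g[i] = (f(z + d) - f(z - d)) / (2 * eps)
    return g

def composed_design(z, steps=500, lr=0.05, lam=0.1):
    """Langevin-style descent on the composed energy
    E_a(z) + E_b(z) + lam * objective(z), with noise annealed to zero
    so the final iterate settles near a mode of the composed model."""
    for t in range(steps):
        g = grad(energy_a, z) + grad(energy_b, z) + lam * grad(objective, z)
        noise = np.sqrt(lr) * (1.0 - t / steps) * rng.standard_normal(z.shape)
        z = z - lr * g + noise
    return z

z0 = rng.standard_normal(4)
z_star = composed_design(z0)
# With E_a pulling toward +1, E_b toward -1, and the objective toward 0,
# the composed optimum sits near the origin.
```

The point of the sketch is the compositional step: gradients from independently trained component models are simply summed at test time, so new combinations of components (more bodies, more airfoils) can be designed without retraining, and sampling from the energy rather than optimizing a forward surrogate avoids adversarial modes.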
