
Polytopic Autoencoders with Smooth Clustering for Reduced-order Modelling of Flows (2401.10620v1)

Published 19 Jan 2024 in cs.LG, cs.CV, and math.DS

Abstract: With the advancement of neural networks, research publications on applying autoencoders to reduced-order models have grown notably in both number and variety. We propose a polytopic autoencoder architecture that comprises a lightweight nonlinear encoder, a convex-combination decoder, and a smooth clustering network. Supported by several proofs, the architecture guarantees that all reconstructed states lie within a polytope, and it comes with a metric, referred to as the polytope error, that indicates the quality of the constructed polytopes. Additionally, it provides a minimal number of convex coordinates for polytopic linear parameter-varying (LPV) systems while achieving acceptable reconstruction errors compared to proper orthogonal decomposition (POD). To validate the proposed model, we conduct simulations of two flow scenarios governed by the incompressible Navier-Stokes equations. Numerical results confirm the guaranteed properties of the model, show low reconstruction errors compared to POD, and demonstrate the error improvement gained by the clustering network.
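The key structural idea in the abstract — an encoder that outputs convex coordinates and a decoder that forms a convex combination of learned vertices, so every reconstruction lies inside a polytope by construction — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the linear map standing in for the lightweight nonlinear encoder, the matrix names `W` and `V`, and the toy dimensions are all hypothetical.

```python
import numpy as np

def softmax(z):
    """Map an arbitrary vector to convex coordinates (nonnegative, summing to 1)."""
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)
n, r = 8, 3  # toy state dimension and number of polytope vertices (hypothetical)

W = rng.standard_normal((r, n))  # toy linear stand-in for the nonlinear encoder
V = rng.standard_normal((n, r))  # decoder: each column is a learnable polytope vertex

x = rng.standard_normal(n)       # a sample (snapshot) state
c = softmax(W @ x)               # convex coordinates: c_i >= 0 and sum(c) == 1
x_hat = V @ c                    # reconstruction = convex combination of the vertices

# By construction, x_hat lies in the polytope spanned by the columns of V.
assert np.all(c >= 0) and np.isclose(c.sum(), 1.0)
```

In a trained model, `V` and the encoder weights would be optimized jointly so that the snapshot data is well covered by the polytope; here the guarantee being demonstrated is only the structural one, that the softmax output is always a valid set of convex coordinates.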
