
Greedy construction of quadratic manifolds for nonlinear dimensionality reduction and nonlinear model reduction (2403.06732v2)

Published 11 Mar 2024 in math.NA and cs.NA

Abstract: Dimensionality reduction on quadratic manifolds augments linear approximations with quadratic correction terms. Previous works rely on linear approximations given by projections onto the first few leading principal components of the training data; however, linear approximations in subspaces spanned by the leading principal components alone can miss information that is necessary for the quadratic correction terms to be efficient. In this work, we propose a greedy method that constructs subspaces from leading as well as later principal components so that the corresponding linear approximations can be corrected most efficiently with quadratic terms. Properties of the greedily constructed manifolds allow applying linear algebra reformulations so that the greedy method scales to data points with millions of dimensions. Numerical experiments demonstrate that orders of magnitude higher accuracy is achieved with the greedily constructed quadratic manifolds compared to manifolds that are based on the leading principal components alone.


Summary

  • The paper introduces a novel greedy algorithm for constructing quadratic manifolds to enhance nonlinear dimensionality reduction beyond linear PCA.
  • The greedy method strategically selects subspaces, including later principal components, to significantly improve accuracy by integrating effective quadratic correction terms.
  • Numerical experiments demonstrate that this approach achieves orders of magnitude better accuracy and superior efficiency compared to traditional PCA and alternating minimization techniques.

Greedy Construction of Quadratic Manifolds for Nonlinear Dimensionality Reduction

The paper, authored by Paul Schwerdtner and Benjamin Peherstorfer, presents a novel approach to dimensionality reduction through the construction of quadratic manifolds, overcoming limitations of linear approximations based solely on principal component analysis (PCA). The authors put forward a greedy algorithm that leverages both leading and later principal components to augment linear approximations with quadratic correction terms more effectively, showing considerable improvements in accuracy over existing methods.

The central idea involves constructing quadratic manifolds using a greedy selection process that efficiently integrates nonlinear correction terms into the linear model reduction framework. This addresses shortcomings associated with traditional approaches that rely heavily on PCA. Specifically, those approaches often miss pertinent information, especially in datasets characterized by nonlinear correlations, which undermines the efficacy of quadratic correction terms.
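To make the setup concrete, a quadratic manifold approximates each high-dimensional snapshot x as Vz + W(z ⊗ z), where z = Vᵀx are the reduced coordinates in a subspace with orthonormal basis V and W maps Kronecker products of the reduced coordinates back to the full space. The following is a minimal NumPy sketch of this decoding step (not the paper's implementation); the random data, dimensions, and the plain least-squares fit of W are illustrative assumptions:

```python
import numpy as np

# Illustrative sketch: approximate snapshots X on a quadratic manifold
# x ≈ V z + W (z ⊗ z), with reduced coordinates z = Vᵀ x.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))       # 50-dim states, 200 snapshots (synthetic)

r = 5
U, s, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                             # here: leading principal components
Z = V.T @ X                              # reduced coordinates, shape (r, n)

# Kronecker features z ⊗ z of each snapshot's reduced coordinates
K = np.einsum("ik,jk->ijk", Z, Z).reshape(r * r, -1)

# Fit the quadratic correction W by least squares on the linear residual
R = X - V @ Z
W = R @ np.linalg.pinv(K)

X_hat = V @ Z + W @ K                    # quadratic-manifold reconstruction
lin_err = np.linalg.norm(R)
quad_err = np.linalg.norm(X - X_hat)
print(quad_err <= lin_err)               # the fitted correction cannot increase the residual
```

Because W is the least-squares minimizer of the residual (and W = 0 is always admissible), the quadratic reconstruction error is never worse than the purely linear one; the paper's point is that *which* columns go into V determines how much the correction can help.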

Methodology

The proposed greedy algorithm selects subspaces that are not confined to the leading principal components, with the aim of making the quadratic correction to the linear approximation as effective as possible. In each iteration, the algorithm selects a basis vector from among the first m principal components, choosing the one that most reduces the approximation error once the quadratic correction is taken into account.
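The iteration described above can be sketched as follows. This is a naive illustration of the greedy selection loop, not the paper's scalable implementation (which uses linear algebra reformulations to avoid refitting from scratch); the candidate pool size m, target dimension r, synthetic data, and the brute-force error evaluation are all assumptions made for clarity:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 120))           # synthetic snapshot matrix
U, s, _ = np.linalg.svd(X, full_matrices=False)

m, r = 20, 4                                 # candidate pool of principal components, target dimension

def quad_error(cols):
    """Residual norm after the best quadratic correction for subspace U[:, cols]."""
    V = U[:, cols]
    Z = V.T @ X
    K = np.einsum("ik,jk->ijk", Z, Z).reshape(len(cols) ** 2, -1)
    R = X - V @ Z
    W = R @ np.linalg.pinv(K)                # least-squares fit of the correction
    return np.linalg.norm(R - W @ K)

selected = []
for _ in range(r):
    # Greedy step: add the candidate component that minimizes the corrected error
    candidates = [j for j in range(m) if j not in selected]
    best = min(candidates, key=lambda j: quad_error(selected + [j]))
    selected.append(best)

print(sorted(selected))                      # may include later (non-leading) components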

This approach allows the quadratic manifolds to capture more complex data structures than linear subspaces derived strictly from PCA. In the experimental section, numerical tests demonstrated significant reductions in error. This approach achieved orders of magnitude better accuracy, particularly in high-dimensional datasets such as those representing transport phenomena.

Numerical Experiments

The paper includes extensive numerical experiments on datasets from physics applications, including nonlinear advection-diffusion processes, Hamiltonian pulse signals, and turbulent flow data, providing evidence of the algorithm's efficacy. For instance, the greedy method attained accuracy improvements of three to five orders of magnitude over methods based on the leading principal components alone.

The experiments also highlighted the scalable nature of the method, as it performed efficiently on datasets with millions of dimensions. The authors showcased the algorithm's superior convergence properties and computational efficiency, underlining the methodological advantage over traditional alternating minimization techniques, which the authors note become computationally intractable for high-dimensional data.

Implications and Future Directions

The implications of this work span both theoretical and practical domains. Theoretically, it challenges the convention of strictly linear dimensionality reduction in scenarios where nonlinear patterns dominate. Practically, this method offers a pathway to integrate sophisticated nonlinear corrections in reduced-order models, pertinent in fields like fluid dynamics, structural mechanics, and broader scientific computing.

The greedy construction of quadratic manifolds provides a refined toolkit for scientists and engineers striving for more accurate, computationally feasible models in complex systems. Looking forward, this work opens avenues for extensions, potentially integrating higher-order polynomials or hybridizing with machine learning models to further enhance the modeling of nonlinear dynamics.

In summary, the contribution enriches the dimensionality reduction toolkit with a method that negotiates the critical balance between computational efficiency and modeling accuracy, particularly in the face of intricate, high-dimensional datasets. For researchers and practitioners engaged in areas with inherent nonlinearities, this paper presents an insightful reference point and a forward-thinking methodology.
