Autoencoders for a manifold learning problem with a Jacobian rank constraint (2306.14194v1)
Abstract: We formulate manifold learning as the problem of finding an operator that maps any point to a close neighbor that lies on a ``hidden'' $k$-dimensional manifold. We call this operator the correcting function. Under this formulation, autoencoders can be viewed as a tool to approximate the correcting function. Given an autoencoder whose Jacobian has rank $k$, we deduce from the classical Constant Rank Theorem that its range has the structure of a $k$-dimensional manifold. The $k$-dimensionality of the range can be enforced by the architecture of the autoencoder (by fixing the dimension of the code space), or alternatively, by an additional constraint that the rank of the autoencoder mapping is not greater than $k$. This constraint enters the objective function as a new term, namely the squared Ky-Fan $k$-antinorm of the Jacobian. We argue that this constraint effectively reduces the dimension of the autoencoder's range beyond the reduction already imposed by the architecture. We also add a new curvature term to the objective. Finally, we experimentally compare our approach with the CAE+H method on synthetic and real-world datasets.
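The rank penalty described in the abstract can be illustrated numerically. The sketch below (an illustrative toy, not the paper's implementation; `numerical_jacobian` and the rank-2 linear "autoencoder" are hypothetical) estimates the Jacobian of a map at a point and penalizes its singular values beyond the $k$ largest, one natural reading of a squared Ky-Fan $k$-antinorm term: the penalty vanishes exactly when the Jacobian has rank at most $k$.

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference estimate of the Jacobian of f at x (hypothetical helper)."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros(x.size)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

def ky_fan_k_antinorm_sq(J, k):
    """Penalty on the singular values of J beyond the k largest ones;
    zero iff rank(J) <= k (up to numerical error)."""
    s = np.linalg.svd(J, compute_uv=False)  # singular values, descending
    return float(np.sum(s[k:] ** 2))

# Toy "autoencoder": a linear map of rank 2 in R^3, so its range
# is a 2-dimensional subspace (the simplest 2-manifold).
W = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
f = lambda x: W @ x

J = numerical_jacobian(f, np.ones(3))
print(ky_fan_k_antinorm_sq(J, k=2))  # near 0: rank(J) <= 2
print(ky_fan_k_antinorm_sq(J, k=1))  # positive: rank(J) > 1
```

In a training loop such a term would be added to the reconstruction loss, so that gradient descent drives the residual singular values, and hence the effective rank of the Jacobian, toward $k$.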
- L. Cayton. Algorithms for manifold learning. Univ. of California at San Diego Tech. Rep, 2005.
- Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016. http://www.deeplearningbook.org.
- P. Simard, B. Victorri, Y. LeCun, and J. Denker. Tangent prop - a formalism for specifying selected invariances in an adaptive network. In J. E. Moody, S. J. Hanson, and R. P. Lippmann, editors, Advances in Neural Information Processing Systems 4, pages 895–903. Morgan-Kaufmann, 1992.
- S. T. Roweis and L. K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500):2323–2326, 2000.
- J. B. Tenenbaum, V. de Silva, and J. C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500):2319–2323, 2000.
- D. L. Donoho and C. Grimes. Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proceedings of the National Academy of Sciences, 100(10):5591–5596, 2003.
- P. Vincent, H. Larochelle, Y. Bengio, and P.-A. Manzagol. Extracting and composing robust features with denoising autoencoders. In Proceedings of the 25th International Conference on Machine Learning, ICML ’08, pages 1096–1103, New York, NY, USA, 2008. Association for Computing Machinery.
- C. Walder and B. Schölkopf. Diffeomorphic dimensionality reduction. In D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, editors, Advances in Neural Information Processing Systems 21, pages 1713–1720. Curran Associates, Inc., 2009.
- M. Belkin and P. Niyogi. Laplacian eigenmaps and spectral techniques for embedding and clustering. In Proceedings of the 14th International Conference on Neural Information Processing Systems: Natural and Synthetic, NIPS’01, pages 585–591, Cambridge, MA, USA, 2001. MIT Press.
- Multimodal face-pose estimation with multitask manifold deep learning. IEEE Transactions on Industrial Informatics, 15(7):3952–3961, 2019.
- Multimodal deep autoencoder for human pose recovery. IEEE Transactions on Image Processing, 24(12):5659–5670, 2015.
- Hierarchical deep click feature prediction for fine-grained image recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(2):563–578, 2022.
- Manifold learning with structured subspace for multi-label feature selection. Pattern Recognition, 120:108169, 2021.
- A novel label enhancement algorithm based on manifold learning. Pattern Recognition, 135:109189, 2023.
- Mark A. Kramer. Nonlinear principal component analysis using autoassociative neural networks. AIChE Journal, 37(2):233–243, 1991.
- G. E. Hinton and R. R. Salakhutdinov. Reducing the dimensionality of data with neural networks. Science, 313(5786):504–507, 2006.
- S. Rifai, Y. N. Dauphin, P. Vincent, Y. Bengio, and X. Muller. The manifold tangent classifier. In J. Shawe-Taylor, R. S. Zemel, P. L. Bartlett, F. Pereira, and K. Q. Weinberger, editors, Advances in Neural Information Processing Systems 24, pages 2294–2302. Curran Associates, Inc., 2011.
- S. Rifai, G. Mesnil, P. Vincent, X. Muller, Y. Bengio, Y. Dauphin, and X. Glorot. Higher order contractive auto-encoder. In Dimitrios Gunopulos, Thomas Hofmann, Donato Malerba, and Michalis Vazirgiannis, editors, Machine Learning and Knowledge Discovery in Databases, pages 645–660, Berlin, Heidelberg, 2011. Springer Berlin Heidelberg.
- J.M. Lee. Introduction to Smooth Manifolds. Graduate Texts in Mathematics. Springer New York, 2013.
- Stewart Dickson. Klein bottle graphic. https://library.wolfram.com/infocenter/MathSource/4560/.
- G. H. Golub, K. Sølna, and P. Van Dooren. Computing the SVD of a general matrix product/quotient. SIAM Journal on Matrix Analysis and Applications, 22(1):1–19, 2000.
- P. Baldi, P. Sadowski, and D. Whiteson. Searching for exotic particles in high-energy physics with deep learning. Nature Communications, 5:4308, 2014.
- Jock A. Blackard. Comparison of Neural Networks and Discriminant Analysis in Predicting Forest Cover Types. PhD thesis, USA, 1998. AAI9921979.
- Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278–2324, 1998.
- Alex Krizhevsky. Learning multiple layers of features from tiny images. Technical report, University of Toronto, 2009.
- Y. Netzer, T. Wang, A. Coates, A. Bissacco, B. Wu, and A. Y. Ng. Reading digits in natural images with unsupervised feature learning. In NIPS Workshop on Deep Learning and Unsupervised Feature Learning 2011, 2011.
- A. Coates, A. Ng, and H. Lee. An analysis of single-layer networks in unsupervised feature learning. In Geoffrey Gordon, David Dunson, and Miroslav Dudík, editors, Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, volume 15 of Proceedings of Machine Learning Research, pages 215–223, Fort Lauderdale, FL, USA, 11–13 Apr 2011. PMLR.
- D. P. Kingma and M. Welling. Auto-encoding variational Bayes, 2014.
- A. Gu, F. Sala, B. Gunel, and C. Ré. Learning mixed-curvature representations in product spaces. In International Conference on Learning Representations, 2019.
- O. Skopek, O.-E. Ganea, and G. Bécigneul. Mixed-curvature variational autoencoders. In International Conference on Learning Representations, 2020.
- Geometry-aware hamiltonian variational auto-encoder, 2020.
- M. Moor, M. Horn, B. Rieck, and K. Borgwardt. Topological autoencoders. In Hal Daumé III and Aarti Singh, editors, Proceedings of the 37th International Conference on Machine Learning, volume 119 of Proceedings of Machine Learning Research, pages 7045–7054. PMLR, 13–18 Jul 2020.
- Witness autoencoder: Shaping the latent space with witness complexes. In NeurIPS 2020 Workshop on Topological Data Analysis and Beyond, 2020.
- Y. Hu, D. Zhang, J. Ye, X. Li, and X. He. Fast and accurate matrix completion via truncated nuclear norm regularization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(9):2117–2130, 2013.
- Partial sum minimization of singular values in robust PCA: Algorithm and applications. IEEE Transactions on Pattern Analysis and Machine Intelligence, 38(4):744–758, 2016.
- A truncated nuclear norm regularization method based on weighted residual error for matrix completion. IEEE Transactions on Image Processing, 25(1):316–330, 2016.
- Online robust principal component analysis via truncated nuclear norm regularization. Neurocomputing, 175:216–222, 2016.