Nonparametric Topological Layers in Neural Networks (2111.14829v1)

Published 27 Nov 2021 in cs.LG and stat.ML

Abstract: Various topological techniques and tools have been applied to neural networks to study network complexity, explainability, and performance. One fundamental assumption of this line of research is the existence of a global (Euclidean) coordinate system upon which the topological layer is constructed. Despite promising results, such a "topologization" method has yet to be widely adopted, because parametrizing a topological layer takes considerable time and, more importantly, lacks a theoretical foundation, without which the neural network achieves only suboptimal performance. This paper proposes a learnable topological layer for neural networks that does not require a Euclidean space; instead, the construction requires nothing more than a general metric space equipped with an inner product, i.e., a Hilbert space. As a result, the parametrization of the proposed topological layer is free of user-specified hyperparameters, which eliminates the costly parametrization stage and the corresponding risk of a suboptimal network.
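
The abstract describes the construction only at a high level. As a rough illustration of the core idea, that a topological layer can be built from inner products alone rather than Euclidean coordinates, here is a minimal PyTorch sketch: it learns an embedding into a Hilbert space, derives pairwise distances purely from the Gram matrix, and extracts 0-dimensional persistence (component death times) via a minimum spanning tree. The class name HilbertTopologicalLayer, the single-linkage/MST persistence computation, and the parameters feature_dim and k are illustrative assumptions, not the paper's actual construction.

```python
import torch
import torch.nn as nn


class HilbertTopologicalLayer(nn.Module):
    """Hypothetical sketch of a coordinate-free topological layer.

    Only inner products are used: pairwise distances come from the Gram
    matrix of a learnable Hilbert-space embedding, and the layer outputs
    0-dimensional persistence (component death times) computed via a
    minimum spanning tree (single linkage). This is an illustrative
    stand-in, not the paper's exact construction.
    """

    def __init__(self, in_dim: int, feature_dim: int, k: int = 8):
        super().__init__()
        self.embed = nn.Linear(in_dim, feature_dim)  # learnable embedding
        self.k = k  # number of persistence features to keep (assumption)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_points, in_dim) -> embedding z in a Hilbert space
        z = self.embed(x)
        # ||z_i - z_j||^2 = <z_i, z_i> + <z_j, z_j> - 2 <z_i, z_j>,
        # so distances need nothing beyond the inner product.
        gram = z @ z.t()
        sq = gram.diagonal()
        dist = (sq[:, None] + sq[None, :] - 2.0 * gram).clamp_min(1e-12).sqrt()
        deaths = self._mst_edge_lengths(dist)  # one death time per merge
        k = min(self.k, deaths.numel())
        feats = torch.topk(deaths, k).values if k > 0 else deaths
        if feats.numel() < self.k:  # zero-pad to a fixed-size feature vector
            feats = torch.cat([feats, feats.new_zeros(self.k - feats.numel())])
        return feats

    @staticmethod
    def _mst_edge_lengths(dist: torch.Tensor) -> torch.Tensor:
        # Kruskal's algorithm with union-find; gradients flow through the
        # selected entries of `dist` (the usual persistence subgradient).
        n = dist.shape[0]
        iu, ju = torch.triu_indices(n, n, offset=1)
        order = torch.argsort(dist[iu, ju])
        parent = list(range(n))

        def find(a: int) -> int:
            while parent[a] != a:
                parent[a] = parent[parent[a]]  # path halving
                a = parent[a]
            return a

        picked = []
        for e in order.tolist():
            ra, rb = find(int(iu[e])), find(int(ju[e]))
            if ra != rb:
                parent[ra] = rb
                picked.append(dist[iu[e], ju[e]])
                if len(picked) == n - 1:
                    break
        return torch.stack(picked) if picked else dist.new_zeros(0)
```

Since the death times are (sub)differentiable in the embedding, a layer like this could in principle sit on top of an encoder and train end to end, e.g., `HilbertTopologicalLayer(32, 16, k=8)(torch.randn(100, 32))` yields a fixed-size feature vector of shape (8,); whether this matches the paper's hyperparameter-free parametrization is an assumption.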
