Betti numbers of attention graphs is all you really need (2207.01903v1)

Published 5 Jul 2022 in cs.CL

Abstract: We apply methods of topological analysis to attention graphs computed from the attention heads of the BERT model (arXiv:1810.04805v2). Our research shows that a classifier built upon basic persistent topological features (namely, Betti numbers) of the trained neural network can achieve classification results on par with conventional classification methods. We demonstrate the relevance of such a topological text representation on three text classification benchmarks. To the best of our knowledge, this is the first attempt to analyze the topology of an attention-based neural network widely used for Natural Language Processing.
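
The abstract outlines a three-step pipeline: extract per-head attention graphs from BERT, compute Betti numbers of those graphs, and train a classifier on the resulting features. The sketch below is one plausible reading of that pipeline, not the authors' code: the model name, the threshold grid used as a crude stand-in for a persistent filtration, and the use of networkx for the Betti-number computation are all illustrative assumptions.

```python
# Minimal sketch: attention graphs from BERT heads -> Betti numbers
# over a grid of edge thresholds -> one feature vector per text.
import numpy as np
import networkx as nx
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

def attention_matrices(text):
    """One (seq_len, seq_len) attention matrix per head, across all layers."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**inputs)
    # out.attentions: tuple over layers, each of shape (1, num_heads, seq, seq)
    return [head.numpy() for layer in out.attentions for head in layer[0]]

def betti_numbers(adj, threshold):
    """Betti numbers of the graph whose edges are attention weights above
    `threshold`. For a graph (a 1-dimensional complex): b0 is the number of
    connected components and b1 = #edges - #vertices + b0 (the cycle rank)."""
    a = (adj > threshold).astype(int)
    np.fill_diagonal(a, 0)  # drop self-attention loops
    g = nx.from_numpy_array(a)
    b0 = nx.number_connected_components(g)
    b1 = g.number_of_edges() - g.number_of_nodes() + b0
    return b0, b1

def topological_features(text, thresholds=(0.01, 0.05, 0.1, 0.25)):
    """Concatenate (b0, b1) over every attention head and every threshold,
    approximating the persistence of features as edges are added."""
    feats = []
    for adj in attention_matrices(text):
        for t in thresholds:
            feats.extend(betti_numbers(adj, t))
    return np.array(feats)
```

The resulting fixed-length vectors can be fed to any ordinary classifier (e.g. logistic regression); the specific downstream classifier is an assumption here, chosen to match the abstract's comparison of topological features against a conventional classification method.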

Authors (3)
  1. Laida Kushnareva (12 papers)
  2. Dmitri Piontkovski (23 papers)
  3. Irina Piontkovskaya (24 papers)
Citations (2)
