Local Homology of Word Embeddings (1810.10136v1)

Published 24 Oct 2018 in math.AT and cs.CL

Abstract: Topological data analysis (TDA) has been widely used to make progress on a number of problems. However, the application of TDA in NLP seems to be in its infancy. In this paper we try to bridge the gap by arguing why TDA tools are a natural choice for analysing word embedding data. We describe a parallelisable unsupervised learning algorithm based on the local homology of datapoints and show some experimental results on word embedding data. We see that the local homology of datapoints in word embedding data contains information that can potentially be used to solve the word sense disambiguation problem.
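
The abstract does not spell out the authors' exact local homology construction, but the general idea can be illustrated with a simplified sketch: for each word vector, take the point cloud of its k nearest neighbours in embedding space and compute its persistent homology, using the resulting diagrams as a local structural signature. The snippet below is such a sketch, not the paper's algorithm; the array `embeddings`, the helper `local_persistence`, and the persistence threshold are illustrative assumptions, and it relies on the `ripser` and `scikit-learn` packages.

```python
# Minimal sketch (not the authors' exact construction): approximate the local
# homology of one word vector via persistent homology of its k-NN neighbourhood.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from ripser import ripser

def local_persistence(embeddings, word_index, k=50, maxdim=1):
    """Persistence diagrams of the k-nearest-neighbour cloud around one embedding."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(embeddings)
    _, idx = nn.kneighbors(embeddings[word_index:word_index + 1])
    local_cloud = embeddings[idx[0][1:]]  # drop the query point itself
    return ripser(local_cloud, maxdim=maxdim)["dgms"]

# Toy usage with random vectors standing in for real word embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(1000, 50))
dgms = local_persistence(emb, word_index=0)

# Count long-lived 1-dimensional features as a crude local signature;
# the 0.5 lifetime threshold is an arbitrary illustrative choice.
h1 = dgms[1]
persistent_loops = int(np.sum((h1[:, 1] - h1[:, 0]) > 0.5))
print(persistent_loops)
```

In this spirit, words whose neighbourhoods produce different local signatures could be flagged as candidates for multiple senses, which is roughly the direction the abstract suggests for word sense disambiguation.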

Citations (5)
