Word Embeddings Are Capable of Capturing Rhythmic Similarity of Words (2204.04833v2)
Published 11 Apr 2022 in cs.CL and cs.LG
Abstract: Word embedding systems such as Word2Vec and GloVe are well known in deep learning approaches to NLP, largely due to their ability to capture semantic relationships between words. In this work we investigate their usefulness in capturing the rhythmic similarity of words instead. The results show that the vectors these embeddings assign to rhyming words are more similar to each other than those assigned to non-rhyming words. They also reveal that GloVe performs relatively better than Word2Vec in this regard. We further propose a first-of-its-kind metric for quantifying the rhythmic similarity of a pair of words.
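The sketch below is not the authors' code or their proposed metric; it is a minimal illustration, assuming the gensim library and its pre-trained "glove-wiki-gigaword-100" model, of the kind of comparison the abstract describes: checking whether embedding vectors of rhyming word pairs are, on average, closer to each other than those of non-rhyming pairs. The word pairs are illustrative examples, not taken from the paper.

```python
# Hypothetical sketch: compare average cosine similarity of rhyming vs.
# non-rhyming word pairs using pre-trained GloVe vectors via gensim.
import gensim.downloader as api

glove = api.load("glove-wiki-gigaword-100")  # pre-trained GloVe embeddings

# Illustrative pairs (not from the paper).
rhyming_pairs = [("light", "night"), ("cat", "hat"), ("moon", "spoon")]
control_pairs = [("light", "table"), ("cat", "river"), ("moon", "glass")]

def mean_similarity(pairs):
    """Average cosine similarity over a list of word pairs."""
    return sum(glove.similarity(a, b) for a, b in pairs) / len(pairs)

print("rhyming pairs:", mean_similarity(rhyming_pairs))
print("control pairs:", mean_similarity(control_pairs))
```

A higher average similarity for the rhyming pairs would be consistent with the paper's finding; the authors' actual metric for rhythmic similarity is defined in the paper itself and is not reproduced here.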