A Distribution-Based Threshold for Determining Sentence Similarity (2311.16675v1)
Abstract: We present a solution to a semantic textual similarity (STS) problem in which two sentences must be matched when their only distinguishing factor is highly specific information (such as names, addresses, or identification codes), and from which a definition of when they are and are not similar must be derived. The solution uses a neural network with a siamese architecture to build the distributions of the distances between similar and dissimilar sentence pairs. From these distributions we derive a discriminating value, which we call the "threshold": a well-defined quantity that separates the vector distances of similar pairs from those of dissimilar pairs in new predictions and later analyses. In addition, we develop a way to score predictions by combining features of the distributions with properties of the distance function. Finally, we show that the results generalize to a wider range of domains by applying the system to a well-known and widely used benchmark dataset for STS problems.
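The abstract does not specify how the threshold is extracted from the two distance distributions, so the following is only an illustrative sketch under an added assumption: each distribution (distances of similar pairs, distances of dissimilar pairs) is modelled as a Gaussian, and the threshold is taken where the two densities intersect between the means. The function name `distance_threshold` and the Gaussian-intersection rule are hypothetical, not the paper's method.

```python
import numpy as np

def distance_threshold(sim_d, dis_d):
    """Estimate a decision threshold between two distance distributions.

    sim_d: array of embedding distances for similar sentence pairs
    dis_d: array of embedding distances for dissimilar sentence pairs
    Models each sample as a Gaussian and returns the point between the
    means where the two densities intersect.
    """
    m1, s1 = np.mean(sim_d), np.std(sim_d)
    m2, s2 = np.mean(dis_d), np.std(dis_d)
    # Solve N(x; m1, s1) = N(x; m2, s2), a quadratic in x:
    # x^2 (1/s1^2 - 1/s2^2) - 2x (m1/s1^2 - m2/s2^2)
    #   + (m1^2/s1^2 - m2^2/s2^2) + 2 ln(s1/s2) = 0
    a = 1.0 / s1**2 - 1.0 / s2**2
    b = 2.0 * (m2 / s2**2 - m1 / s1**2)
    c = m1**2 / s1**2 - m2**2 / s2**2 + 2.0 * np.log(s1 / s2)
    if np.isclose(a, 0.0):
        # Equal variances: the intersection is the midpoint of the means.
        return float((m1 + m2) / 2.0)
    # Keep the root that lies between the two means.
    lo, hi = sorted((m1, m2))
    for r in np.roots([a, b, c]):
        if np.isreal(r) and lo <= r.real <= hi:
            return float(r.real)
    return float((m1 + m2) / 2.0)
```

A new pair would then be labelled "similar" if the distance between its sentence embeddings falls below the returned threshold, and "dissimilar" otherwise.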