Predicting the Semantic Textual Similarity with Siamese CNN and LSTM (1810.10641v1)
Published 24 Oct 2018 in cs.CL
Abstract: Semantic Textual Similarity (STS) is the basis of many applications in NLP. Our system combines convolutional and recurrent neural networks to measure the semantic similarity of sentences. It uses a convolutional network to take into account the local context of words and an LSTM to capture the global context of sentences. This combination of networks helps to preserve the relevant information of sentences and improves the calculation of the similarity between sentences. Our model achieves good results and is competitive with the best state-of-the-art systems.
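The architecture described in the abstract can be sketched as a Siamese pipeline: a 1-D convolution over word embeddings captures local context, an LSTM over the convolved features captures global sentence context, and the two resulting sentence vectors are compared. The sketch below is a minimal NumPy illustration under assumed details not given in the abstract: the dimensions, the use of the last LSTM hidden state as the sentence vector, and cosine similarity as the comparison function are all illustrative choices, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB, CONV_OUT, HID, KERNEL = 8, 6, 5, 3  # illustrative dimensions

# Shared (Siamese) parameters: both sentences use the same weights.
W_conv = rng.standard_normal((CONV_OUT, EMB, KERNEL)) * 0.1
W_x = rng.standard_normal((4 * HID, CONV_OUT)) * 0.1  # input -> 4 LSTM gates
W_h = rng.standard_normal((4 * HID, HID)) * 0.1       # hidden -> 4 LSTM gates
b = np.zeros(4 * HID)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conv1d(x):
    # Local context: slide a width-KERNEL filter over the embedding sequence.
    # x: (T, EMB) -> (T - KERNEL + 1, CONV_OUT), with ReLU activation.
    T = x.shape[0]
    out = np.empty((T - KERNEL + 1, CONV_OUT))
    for t in range(T - KERNEL + 1):
        window = x[t:t + KERNEL].T  # (EMB, KERNEL)
        out[t] = np.maximum((W_conv * window).sum(axis=(1, 2)), 0.0)
    return out

def lstm_last_hidden(xs):
    # Global context: run an LSTM over the conv features, keep the last state.
    h, c = np.zeros(HID), np.zeros(HID)
    for x in xs:
        z = W_x @ x + W_h @ h + b
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h

def encode(sentence_emb):
    return lstm_last_hidden(conv1d(sentence_emb))

def similarity(s1, s2):
    # Cosine similarity between the two sentence vectors (an assumed choice).
    v1, v2 = encode(s1), encode(s2)
    return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9))

# Toy inputs: random "word embeddings" standing in for two sentences.
a = rng.standard_normal((7, EMB))
b2 = rng.standard_normal((5, EMB))
sim_self = similarity(a, a)   # identical sentences score maximal similarity
sim_pair = similarity(a, b2)
```

Because the weights are shared between the two branches, identical inputs always map to the same vector, so `sim_self` is 1.0 up to floating-point error; a trained version of such a model would learn the weights from labeled sentence pairs.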