Match-Tensor: a Deep Relevance Model for Search (1701.07795v1)

Published 26 Jan 2017 in cs.IR and cs.CL

Abstract: The application of Deep Neural Networks for ranking in search engines may obviate the need for the extensive feature engineering common to current learning-to-rank methods. However, we show that combining simple relevance matching features like BM25 with existing Deep Neural Net models often substantially improves the accuracy of these models, indicating that they do not capture essential local relevance matching signals. We describe a novel deep Recurrent Neural Net-based model that we call Match-Tensor. The architecture of the Match-Tensor model simultaneously accounts for both local relevance matching and global topicality signals, allowing for a rich interplay between them when computing the relevance of a document to a query. On a large held-out test set consisting of social media documents, we demonstrate not only that Match-Tensor outperforms BM25 and other classes of DNNs but also that it largely subsumes signals present in these models.
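
To make the abstract's idea concrete, below is a minimal sketch of a Match-Tensor-style relevance model in PyTorch. It is an illustration based only on the abstract's description (bi-directional recurrent encodings of query and document, a joint interaction structure that mixes local exact-match signals with global topical representations), not the paper's implementation; the class name MatchTensorSketch, all layer sizes, and the convolution-plus-pooling scoring head are assumptions made for the example.

```python
import torch
import torch.nn as nn

class MatchTensorSketch(nn.Module):
    """Hypothetical sketch of a Match-Tensor style ranker.

    Query and document tokens are encoded with bi-LSTMs; a 3-D match tensor
    (query position x document position x channels) is built from elementwise
    products of the encodings and augmented with an exact-match channel, so
    local matching and global topicality signals can interact before scoring.
    """

    def __init__(self, vocab_size=50_000, embed_dim=64, hidden_dim=32, channels=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.query_lstm = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.doc_lstm = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        # Project bi-LSTM states down to the match-tensor channel dimension.
        self.query_proj = nn.Linear(2 * hidden_dim, channels)
        self.doc_proj = nn.Linear(2 * hidden_dim, channels)
        # Convolve over the (query x document) grid; +1 channel for exact match.
        self.conv = nn.Conv2d(channels + 1, 8, kernel_size=3, padding=1)
        self.score = nn.Linear(8, 1)

    def forward(self, query_ids, doc_ids):
        q, _ = self.query_lstm(self.embed(query_ids))    # (B, Lq, 2H)
        d, _ = self.doc_lstm(self.embed(doc_ids))        # (B, Ld, 2H)
        q = self.query_proj(q)                           # (B, Lq, C)
        d = self.doc_proj(d)                             # (B, Ld, C)
        # Match tensor: elementwise product for every query/document token pair.
        tensor = q.unsqueeze(2) * d.unsqueeze(1)         # (B, Lq, Ld, C)
        # Exact-match channel marks identical token ids (local relevance signal).
        exact = (query_ids.unsqueeze(2) == doc_ids.unsqueeze(1)).float().unsqueeze(-1)
        tensor = torch.cat([tensor, exact], dim=-1)      # (B, Lq, Ld, C+1)
        feats = torch.relu(self.conv(tensor.permute(0, 3, 1, 2)))  # (B, 8, Lq, Ld)
        pooled = feats.amax(dim=(2, 3))                  # max-pool over the grid
        return self.score(pooled).squeeze(-1)            # relevance score per pair

# Usage with dummy token ids (shapes only; real inputs come from a tokenizer):
query = torch.randint(0, 50_000, (2, 5))    # batch of 2 queries, 5 tokens each
doc = torch.randint(0, 50_000, (2, 40))     # paired documents, 40 tokens each
scores = MatchTensorSketch()(query, doc)    # shape (2,)
```

The exact-match channel is what lets a model of this shape fold in the BM25-like local matching signal that the abstract argues plain DNN rankers miss, while the bi-LSTM channels carry the global topical context.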

Authors (4)
  1. Aaron Jaech (15 papers)
  2. Hetunandan Kamisetty (1 paper)
  3. Eric Ringger (2 papers)
  4. Charlie Clarke (1 paper)
Citations (22)
