
Improved Query Topic Models via Pseudo-Relevant Pólya Document Models (1602.01665v1)

Published 4 Feb 2016 in cs.IR

Abstract: Query expansion via pseudo-relevance feedback is a popular method of overcoming the problem of vocabulary mismatch and of increasing average retrieval effectiveness. In this paper, we develop a new method that estimates a query topic model from a set of pseudo-relevant documents using a new language modelling framework. We assume that documents are generated via a mixture of multivariate Pólya distributions, and we show that by identifying the topical terms in each document, we can appropriately select terms that are likely to belong to the query topic model. The results of experiments on several TREC collections show that the new approach compares favourably to current state-of-the-art expansion methods.
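The abstract describes the approach at a high level: score candidate expansion terms by how topical they are within the pseudo-relevant documents under a multivariate Pólya (Dirichlet compound multinomial) view of document generation, which discounts repeated within-document occurrences of a term. The sketch below is a minimal illustration in that spirit only; the function names, the damping scheme, and the log-odds style weighting are assumptions made for illustration, not the paper's actual estimator.

```python
# Illustrative pseudo-relevance feedback expansion sketch (hypothetical,
# not the paper's method). Terms are weighted per document with a
# Pólya/DCM-style damping of repeated occurrences, then contrasted with
# the collection background.
from collections import Counter
from math import log

def expansion_terms(pseudo_rel_docs, background, num_terms=20, alpha=0.5):
    """pseudo_rel_docs: list of token lists (top-k retrieved documents).
    background: Counter of term frequencies over the whole collection.
    Returns the highest-scoring candidate expansion terms."""
    total_bg = sum(background.values())
    vocab_size = len(background)
    scores = Counter()
    for doc in pseudo_rel_docs:
        tf = Counter(doc)
        doc_len = len(doc)
        for term, count in tf.items():
            # Pólya/DCM-style damping: each additional occurrence of the
            # same term contributes less evidence than under a multinomial.
            damped = sum(alpha / (alpha + i) for i in range(count))
            p_doc = damped / doc_len
            # Add-one smoothed background probability.
            p_bg = (background[term] + 1) / (total_bg + vocab_size)
            # Keep terms that are topical in the document relative to the
            # collection background.
            scores[term] += max(0.0, p_doc * log(p_doc / p_bg))
    return [t for t, _ in scores.most_common(num_terms)]
```

In use, the top-k documents retrieved for the original query would be passed as `pseudo_rel_docs`, and the returned terms appended to the query before a second retrieval pass, as is standard in pseudo-relevance feedback.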

Authors (1)
  1. Ronan Cummins (5 papers)
Citations (4)
