Bsmooth: Learning from user feedback to disambiguate query terms in interactive data retrieval (1610.04789v3)

Published 15 Oct 2016 in cs.DB

Abstract: There is great interest in supporting imprecise queries (e.g., keyword search or natural language queries) over databases today. To support such queries, the database system is typically required to disambiguate parts of the user-specified query against the database, using whatever resources are intrinsically available to it (the database schema, data value distributions, natural language models, etc.). Often, systems will also have a user-interaction log available, which can serve as an extrinsic resource to supplement the model built from their own intrinsic resources. This leads to the problem of how best to combine the system's prior ranking with insight derived from the user-interaction log. Statistical inference techniques such as maximum likelihood or Bayesian updates from a subjective prior turn out not to apply in a straightforward way, due to possible noise from user search behavior and to encoding biases endemic to the system's models. In this paper, we address this learning problem in interactive data retrieval, with a specific focus on type classification for user-specified query terms. We develop a novel Bayesian smoothing algorithm, Bsmooth, which is simple, fast, flexible and accurate. We analytically establish some desirable properties and show, through experiments against an independent benchmark, that the addition of such a learning layer performs much better than standard methods.
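To make the setting concrete, the sketch below shows a generic Dirichlet-style smoothed Bayesian update that blends a system's prior over candidate types for a query term with (possibly noisy) counts from a user-interaction log. This is not the paper's Bsmooth algorithm; the function name, the candidate types, and the smoothing weight `alpha` are illustrative assumptions.

```python
# Illustrative sketch only: a generic smoothed Bayesian update combining a
# system prior over candidate types with counts from an interaction log.
# The paper's actual Bsmooth algorithm is not reproduced here.

def smoothed_type_posterior(prior, log_counts, alpha=10.0):
    """Blend a prior P(type | term) with interaction-log counts.

    prior      -- dict mapping candidate type -> prior probability (sums to 1)
    log_counts -- dict mapping candidate type -> observed count in the log
    alpha      -- pseudo-count mass given to the prior; larger values trust
                  the prior more and damp noise in sparse or noisy logs
    """
    total = sum(log_counts.get(t, 0) for t in prior)
    return {
        t: (log_counts.get(t, 0) + alpha * prior[t]) / (total + alpha)
        for t in prior
    }

# Example: the term "Paris" could refer to a city name or a person name.
prior = {"city": 0.7, "person": 0.3}      # from the system's intrinsic model
log_counts = {"city": 4, "person": 1}     # selections observed in the log
print(smoothed_type_posterior(prior, log_counts))
```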

Citations (2)
