
Context-Driven Interactive Query Simulations Based on Generative Large Language Models (2312.09631v3)

Published 15 Dec 2023 in cs.IR

Abstract: Simulating user interactions enables a more user-oriented evaluation of information retrieval (IR) systems. While user simulations are cost-efficient and reproducible, many approaches lack fidelity with respect to real user behavior. Most notably, current user models neglect the user's context, which is the primary driver of perceived relevance and of the interactions with the search results. To this end, this work introduces the simulation of context-driven query reformulations. The proposed query generation methods build upon recent LLM approaches and consider the user's context throughout the simulation of a search session. Compared to simple context-free query generation approaches, these methods show better effectiveness and allow the simulation of more efficient IR sessions. Similarly, our evaluations consider more interaction context than current session-based measures and reveal complementary insights beyond the established evaluation protocols. We conclude with directions for future work and provide an entirely open experimental setup.
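
To make the idea of context-driven query reformulation in a simulated session concrete, below is a minimal sketch, not the authors' implementation. It assumes a generic session state and a placeholder LLM call; `llm_generate`, `next_query`, `simulate_session`, and the `retrieve` callable are illustrative names introduced here, not code from the paper or its experimental setup.

```python
# Minimal sketch of a context-driven query reformulation loop for a
# simulated search session. The LLM call is a placeholder; swap in any
# text-generation backend.
from dataclasses import dataclass, field


@dataclass
class SessionState:
    topic: str                                   # user's information need / context description
    seen_snippets: list[str] = field(default_factory=list)
    queries: list[str] = field(default_factory=list)


def llm_generate(prompt: str) -> str:
    """Placeholder for an LLM completion call (hypothetical)."""
    return "context-aware reformulated query"


def next_query(state: SessionState) -> str:
    """Build a prompt from the user's context and session history,
    then ask the LLM for the next (reformulated) query."""
    prompt = (
        f"Information need: {state.topic}\n"
        f"Previous queries: {state.queries}\n"
        f"Snippets examined so far: {state.seen_snippets[-3:]}\n"
        "Propose the next query that addresses the remaining information need:"
    )
    return llm_generate(prompt)


def simulate_session(topic: str, retrieve, max_turns: int = 5) -> SessionState:
    """Run a simulated session: query, retrieve, update context, reformulate."""
    state = SessionState(topic=topic)
    for _ in range(max_turns):
        query = next_query(state)
        state.queries.append(query)
        results = retrieve(query)                # any ranker returning result snippets
        state.seen_snippets.extend(results[:3])  # examined results feed back into the context
    return state
```

A session would then be driven with any retrieval backend, e.g. `simulate_session("effects of caffeine on sleep", retrieve=my_bm25_lookup)`, where the examined snippets accumulate in the state and condition each subsequent reformulation.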

Authors (5)
  1. Björn Engelmann (7 papers)
  2. Timo Breuer (23 papers)
  3. Jana Isabelle Friese (1 paper)
  4. Philipp Schaer (63 papers)
  5. Norbert Fuhr (15 papers)
Citations (4)
