
Looking around you: external information enhances representations for event sequences

Published 14 Feb 2025 in cs.LG (arXiv:2502.10205v2)

Abstract: Representation learning produces models of event sequences in many domains, such as store purchases, client transactions, and general human behaviour. However, such models usually process each sequence in isolation, ignoring context from other sequences that co-occur in time. This limitation is particularly problematic in domains with fast-evolving conditions, like finance and e-commerce, or when a sequence lacks recent events. We develop a method that aggregates information from multiple user representations to augment a specific user in a scenario of multiple co-occurring event sequences, achieving better quality than processing each sequence independently. Our study considers diverse aggregation approaches, ranging from simple pooling techniques to a trainable Kernel attention aggregation that can capture more complex information flow from other users. The proposed methods operate on top of an existing encoder and support its efficient fine-tuning. Across six diverse event sequence datasets (finance, e-commerce, education, etc.) and downstream tasks, Kernel attention improves ROC-AUC scores, both with and without fine-tuning, while mean pooling yields a smaller but still significant gain.
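
The abstract contrasts two aggregation strategies: mean pooling over co-occurring users' representations and a trainable kernel attention that weights other users by their similarity to the target user. Below is a minimal PyTorch sketch of both, assuming the existing encoder has already produced fixed-size embeddings; the class name, the RBF kernel choice, the residual combination, and all hyperparameters are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn


class KernelAttentionAggregator(nn.Module):
    """Sketch of kernel-attention aggregation: augment a target user's
    embedding with a kernel-weighted sum over co-occurring users."""

    def __init__(self, dim: int, bandwidth: float = 1.0):
        super().__init__()
        self.query = nn.Linear(dim, dim)  # projects the target user's embedding
        self.key = nn.Linear(dim, dim)    # projects co-occurring users' embeddings
        self.bandwidth = bandwidth        # assumed RBF kernel bandwidth

    def forward(self, target: torch.Tensor, others: torch.Tensor) -> torch.Tensor:
        # target: (dim,) embedding of the user being augmented
        # others: (n_users, dim) embeddings of sequences co-occurring in time
        q = self.query(target)                      # (dim,)
        k = self.key(others)                        # (n_users, dim)
        # RBF kernel similarity between the target query and each key
        sq_dist = ((k - q) ** 2).sum(dim=-1)        # (n_users,)
        weights = torch.softmax(-sq_dist / self.bandwidth, dim=0)
        context = weights @ others                  # (dim,) weighted sum
        return target + context                     # assumed residual combination


def mean_pooling(target: torch.Tensor, others: torch.Tensor) -> torch.Tensor:
    """The simpler baseline from the abstract: average the other users."""
    return target + others.mean(dim=0)


if __name__ == "__main__":
    dim, n_users = 64, 8
    target = torch.randn(dim)
    others = torch.randn(n_users, dim)
    agg = KernelAttentionAggregator(dim)
    print(agg(target, others).shape)       # torch.Size([64])
    print(mean_pooling(target, others).shape)
```

Both aggregators operate on encoder outputs only, which is consistent with the abstract's claim that the methods sit on top of an existing encoder and allow efficient fine-tuning.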
