
CT4Rec: Simple yet Effective Consistency Training for Sequential Recommendation (2112.06668v3)

Published 13 Dec 2021 in cs.IR

Abstract: Sequential recommendation methods are increasingly important in cutting-edge recommender systems. By leveraging historical records, these systems can capture user interests and make recommendations accordingly. Recent state-of-the-art sequential recommendation models incorporate contrastive learning techniques to obtain high-quality user representations. Though effective, models based on contrastive learning require careful selection of data augmentation methods and pretext tasks, efficient negative sampling strategies, and extensive hyper-parameter validation. In this paper, we propose an ultra-simple alternative for obtaining better user representations and improving sequential recommendation performance. Specifically, we present a simple yet effective \textbf{C}onsistency \textbf{T}raining method for sequential \textbf{Rec}ommendation (CT4Rec) in which only two extra training objectives are utilized, without any structural modification or data augmentation. Experiments on three benchmark datasets and one large, newly crawled industrial corpus demonstrate that our proposed method outperforms SOTA models by a large margin, with much less training time than those based on contrastive learning. Online evaluation on a real-world content recommendation system also yields a 2.717\% improvement in click-through rate and a 3.679\% increase in the average number of clicks per capita. Further exploration reveals that this simple method has great potential for CTR prediction. Our code is available at \url{https://github.com/ct4rec/CT4Rec.git}.
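The abstract does not spell out the two extra training objectives, so the following is only a minimal sketch of one common form of consistency training: forwarding the same input twice with dropout active and regularizing the divergence between the two predictive distributions, on top of the usual next-item loss. All names here (`consistency_training_step`, `model`, `item_seq`, `target`, `alpha`) are hypothetical and not taken from the CT4Rec codebase.

```python
import torch.nn.functional as F

def consistency_training_step(model, item_seq, target, alpha=1.0):
    """One training step with a dropout-based consistency objective (sketch).

    The same sequence is passed through the model twice; dropout makes the
    two forward passes stochastic, and a symmetric KL term pulls the two
    output distributions together, in addition to the recommendation loss.
    """
    # Two stochastic forward passes over the same input (dropout active).
    logits1 = model(item_seq)  # (batch, num_items)
    logits2 = model(item_seq)

    # Standard next-item prediction loss, averaged over both passes.
    ce = 0.5 * (F.cross_entropy(logits1, target)
                + F.cross_entropy(logits2, target))

    # Symmetric KL divergence between the two predictive distributions.
    log_p1 = F.log_softmax(logits1, dim=-1)
    log_p2 = F.log_softmax(logits2, dim=-1)
    kl = 0.5 * (F.kl_div(log_p1, log_p2, reduction="batchmean", log_target=True)
                + F.kl_div(log_p2, log_p1, reduction="batchmean", log_target=True))

    return ce + alpha * kl
```

Note that this matches the abstract's framing only in spirit: no architectural change and no data augmentation are needed, since the stochasticity comes entirely from dropout already present in the model.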

Authors (9)
  1. Chong Liu (104 papers)
  2. Xiaoyang Liu (21 papers)
  3. Rongqin Zheng (4 papers)
  4. Lixin Zhang (27 papers)
  5. Xiaobo Liang (6 papers)
  6. Juntao Li (89 papers)
  7. Lijun Wu (113 papers)
  8. Min Zhang (630 papers)
  9. Leyu Lin (43 papers)
Citations (8)