Distillation based Multi-task Learning: A Candidate Generation Model for Improving Reading Duration

Published 14 Feb 2021 in cs.IR (arXiv:2102.07142v1)

Abstract: In feeds recommendation, the first step is candidate generation. Most candidate generation models are based on CTR estimation and do not consider the user's satisfaction with the clicked item. Items of low quality but with attractive titles (i.e., clickbait) may be recommended to the user, which worsens the user experience. One solution to this problem is to model the click and the reading duration simultaneously under a multi-task learning (MTL) framework. This modeling poses two challenges. The first is how to deal with the zero duration of negative samples, which does not necessarily indicate dislike. The second is how to perform multi-task learning in a candidate generation model with a double-tower structure, which can only model a single task. In this paper, we propose a distillation-based multi-task learning (DMTL) approach to tackle these two challenges. We model duration by considering its dependency on click in the MTL, and then transfer the knowledge learned by the MTL teacher model to the student candidate generation model via distillation. Experiments conducted on a dataset gathered from the traffic logs of Tencent Kandian's recommender system show that the proposed approach significantly outperforms competitors in modeling duration, which demonstrates the effectiveness of the proposed candidate generation model.
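
As a rough illustration of the approach the abstract outlines (not the paper's actual architecture), the sketch below factors the teacher's prediction as p(click) · p(long read | click), so unclicked samples contribute only through the click term and their zero duration is never treated as an explicit dislike signal; the teacher's soft score is then distilled into a single-score two-tower student suitable for candidate generation. All module names, layer sizes, and the choice of PyTorch are illustrative assumptions.

```python
# Hypothetical sketch of the DMTL idea from the abstract: a multi-task teacher
# models duration as dependent on click, and a two-tower student is trained to
# match the teacher's combined score via distillation.
import torch
import torch.nn as nn

class TeacherMTL(nn.Module):
    """Multi-task teacher: predicts click and duration-given-click."""
    def __init__(self, dim: int = 64):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(dim * 2, 128), nn.ReLU())
        self.click_head = nn.Linear(128, 1)  # p(click | user, item)
        self.dur_head = nn.Linear(128, 1)    # p(long read | click, user, item)

    def forward(self, user, item):
        h = self.shared(torch.cat([user, item], dim=-1))
        p_click = torch.sigmoid(self.click_head(h))
        p_dur_given_click = torch.sigmoid(self.dur_head(h))
        # The dependency handles zero-duration negatives: duration is only
        # reachable through a click, so no-click does not mean "disliked".
        return p_click, p_click * p_dur_given_click

class StudentTwoTower(nn.Module):
    """Two-tower candidate generator: score = dot(user tower, item tower)."""
    def __init__(self, dim: int = 64):
        super().__init__()
        self.user_tower = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 32))
        self.item_tower = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 32))

    def forward(self, user, item):
        u = self.user_tower(user)
        v = self.item_tower(item)
        return torch.sigmoid((u * v).sum(dim=-1, keepdim=True))

def distill_step(teacher, student, user, item, opt):
    with torch.no_grad():
        _, teacher_score = teacher(user, item)  # soft target from the teacher
    student_score = student(user, item)
    # Distillation: the single-score two-tower student mimics the teacher's
    # click-and-duration score, absorbing the multi-task knowledge.
    loss = nn.functional.binary_cross_entropy(student_score, teacher_score)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

if __name__ == "__main__":
    teacher, student = TeacherMTL(), StudentTwoTower()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    user, item = torch.randn(8, 64), torch.randn(8, 64)
    print(distill_step(teacher, student, user, item, opt))
```

At serving time, only the student's two towers would be needed, so item embeddings can be precomputed and retrieved with an approximate nearest-neighbor index, which is the usual reason candidate generation is restricted to a two-tower form.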

Citations (3)
