
Correlative Preference Transfer with Hierarchical Hypergraph Network for Multi-Domain Recommendation (2211.11191v4)

Published 21 Nov 2022 in cs.IR

Abstract: Advanced recommender systems usually involve multiple domains (such as scenarios or categories) for various marketing strategies, and users interact with them to satisfy diverse demands. The goal of multi-domain recommendation (MDR) is to improve the recommendation performance of all domains simultaneously. Conventional graph neural network based methods usually deal with each domain separately, or train a shared model to serve all domains. The former fails to leverage users' cross-domain behaviors, making the behavior sparseness issue a great obstacle. The latter learns a shared user representation with respect to all domains, which neglects users' domain-specific preferences. In this paper, we propose $\mathsf{H3Trans}$, a hierarchical hypergraph network based correlative preference transfer framework for MDR, which represents multi-domain user-item interactions in a unified graph to facilitate preference transfer. $\mathsf{H3Trans}$ incorporates two hyperedge-based modules, namely dynamic item transfer (Hyper-I) and adaptive user aggregation (Hyper-U). Hyper-I extracts correlative information from multi-domain user-item feedback to eliminate the domain discrepancy of item representations. Hyper-U aggregates users' scattered preferences across multiple domains and further exploits high-order (not only pair-wise) connections to improve user representations. Experiments on both public and production datasets verify the superiority of $\mathsf{H3Trans}$ for MDR.
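The hyperedge-based modules described in the abstract can be read as message passing over hyperedges that tie together an entity's per-domain nodes: a hyperedge pools the domain-specific embeddings of the same user (Hyper-U view) or item (Hyper-I view) and sends the pooled message back to refine each domain node. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the class name `HyperEdgeAggregation`, the attention-weighted pooling, and the tensor layout are all assumptions made for clarity.

```python
# Hypothetical sketch of hyperedge-style aggregation across domains.
# Each user/item has one node per domain; a hyperedge connects those nodes,
# pools them into a shared message, and feeds it back to every domain node.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HyperEdgeAggregation(nn.Module):
    """Attention-weighted pooling over the nodes joined by one hyperedge."""

    def __init__(self, dim: int):
        super().__init__()
        self.att = nn.Linear(dim, 1)            # scores each member node
        self.update = nn.Linear(2 * dim, dim)   # fuses node + hyperedge message

    def forward(self, domain_nodes: torch.Tensor) -> torch.Tensor:
        # domain_nodes: [batch, num_domains, dim] -- per-domain embeddings of
        # the same user (Hyper-U) or the same item (Hyper-I).
        scores = self.att(domain_nodes)                   # [B, D, 1]
        weights = F.softmax(scores, dim=1)                # attention over domains
        edge_msg = (weights * domain_nodes).sum(dim=1)    # [B, dim] hyperedge message
        edge_msg = edge_msg.unsqueeze(1).expand_as(domain_nodes)
        fused = torch.cat([domain_nodes, edge_msg], dim=-1)
        return torch.tanh(self.update(fused))             # refined per-domain nodes


if __name__ == "__main__":
    user_nodes = torch.randn(4, 3, 64)   # 4 users, 3 domains, 64-dim embeddings
    hyper_u = HyperEdgeAggregation(dim=64)
    print(hyper_u(user_nodes).shape)     # torch.Size([4, 3, 64])
```

In this reading, cross-domain preference transfer happens because sparse-domain nodes receive the hyperedge message built from the richer domains, while each domain still keeps its own refined representation.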

Authors (6)
  1. Zixuan Xu (28 papers)
  2. Penghui Wei (11 papers)
  3. Shaoguo Liu (19 papers)
  4. Weimin Zhang (16 papers)
  5. Liang Wang (512 papers)
  6. Bo Zheng (205 papers)
Citations (9)
