GEMRank: Global Entity Embedding For Collaborative Filtering (1811.01686v1)

Published 5 Nov 2018 in cs.IR, cs.LG, and stat.ML

Abstract: Recently, word embedding algorithms have been applied to map the entities of recommender systems, such as users and items, to new feature spaces using textual element-context relations among them. Unlike in many other domains, this approach has not achieved the desired performance in collaborative filtering problems, probably due to the unavailability of appropriate textual data. In this paper we propose a new recommendation framework, called GEMRank, that can be applied when the user-item matrix is the sole available source of information. It uses the concept of profile co-occurrence to define relations among entities and applies a factorization method to embed the users and items. GEMRank then feeds the extracted representations to a neural network model to predict user-item like/dislike relations, on which the final recommendations are based. We evaluated GEMRank in an extensive set of experiments against state-of-the-art recommendation methods. The results show that GEMRank significantly outperforms the baseline algorithms on a variety of datasets with different degrees of density.
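
The abstract outlines a three-stage pipeline: build co-occurrence relations from user profiles, factorize them into embeddings, and train a neural network on those embeddings to score like/dislike. The sketch below illustrates that flow under stated assumptions; the toy data, the choice of truncated SVD as the factorization, and the scikit-learn MLP classifier are all illustrative stand-ins, not the authors' exact implementation.

```python
# Minimal sketch of a GEMRank-style pipeline, assuming toy binary data,
# truncated SVD as the factorization step, and an MLP as the predictor.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Toy binary user-item matrix: R[u, i] = 1 if user u liked item i.
n_users, n_items, dim = 200, 100, 16
R = (rng.random((n_users, n_items)) < 0.05).astype(float)

# Profile co-occurrence: items are related when they appear together in
# the same user profile; users are related when they share liked items.
item_cooc = R.T @ R          # (n_items, n_items)
user_cooc = R @ R.T          # (n_users, n_users)

# Embed entities by factorizing the co-occurrence matrices
# (truncated SVD here as a stand-in for the paper's factorization method).
item_emb = TruncatedSVD(n_components=dim, random_state=0).fit_transform(item_cooc)
user_emb = TruncatedSVD(n_components=dim, random_state=0).fit_transform(user_cooc)

# Train a small neural network on concatenated (user, item) embeddings to
# predict like/dislike: observed interactions are positives, and an equal
# number of randomly sampled unobserved pairs serve as negatives.
pos = np.argwhere(R == 1)
neg = np.argwhere(R == 0)
neg = neg[rng.choice(len(neg), size=len(pos), replace=False)]
pairs = np.vstack([pos, neg])
X = np.hstack([user_emb[pairs[:, 0]], item_emb[pairs[:, 1]]])
y = np.concatenate([np.ones(len(pos)), np.zeros(len(neg))])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                    random_state=0).fit(X, y)

# Rank items for one user by predicted like probability.
u = 0
scores = clf.predict_proba(
    np.hstack([np.tile(user_emb[u], (n_items, 1)), item_emb])
)[:, 1]
print("top-5 recommendations for user 0:", np.argsort(-scores)[:5])
```

The final ranking step simply sorts items by the network's predicted like probability for a given user, which matches the abstract's description of making recommendations from the like/dislike predictions.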
