Hierarchical Neural Language Models for Joint Representation of Streaming Documents and their Content (1606.08689v1)

Published 28 Jun 2016 in cs.CL and cs.IR

Abstract: We consider the problem of learning distributed representations for documents in data streams. The documents are represented as low-dimensional vectors and are jointly learned with distributed vector representations of word tokens using a hierarchical framework with two embedded neural language models. In particular, we exploit the context of documents in streams and use one of the language models to model the document sequences, and the other to model word sequences within them. The models learn continuous vector representations for both word tokens and documents such that semantically similar documents and words are close in a common vector space. We discuss extensions to our model, which can be applied to personalized recommendation and social relationship mining by adding further user layers to the hierarchy, thus learning user-specific vectors to represent individual preferences. We validated the learned representations on a public movie rating dataset from MovieLens, as well as on a large-scale Yahoo News dataset comprising three months of user activity logs collected on Yahoo servers. The results indicate that the proposed model can learn useful representations of both documents and word tokens, outperforming the current state-of-the-art by a large margin.
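The core idea is a two-level objective: a document-level model predicts a document from its neighbors in the stream, while a word-level model predicts a word from its surrounding words together with the containing document's vector, so words and documents end up in one shared space. The following is a minimal sketch of that structure, assuming a CBOW-style softmax objective at both levels; the class name, hyperparameters, and training details below are illustrative assumptions, not the authors' implementation (the paper's actual optimization, e.g. its output-layer approximation and window sizes, is not reproduced here).

```python
# Minimal sketch of a two-level (document / word) joint embedding model.
# Assumes a CBOW-style objective with a full softmax over a small vocabulary;
# all names and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalDocWordModel(nn.Module):
    def __init__(self, vocab_size, num_docs, dim=64):
        super().__init__()
        # Words and documents share a common embedding dimensionality,
        # so semantically related words and documents can be compared directly.
        self.word_emb = nn.Embedding(vocab_size, dim)
        self.doc_emb = nn.Embedding(num_docs, dim)
        self.word_out = nn.Linear(dim, vocab_size, bias=False)
        self.doc_out = nn.Linear(dim, num_docs, bias=False)

    def word_loss(self, doc_id, context_words, target_word):
        # Word-level model: predict a word from its surrounding words
        # plus the vector of the document that contains it.
        ctx = self.word_emb(context_words).mean(dim=1) + self.doc_emb(doc_id)
        return F.cross_entropy(self.word_out(ctx), target_word)

    def doc_loss(self, context_docs, target_doc):
        # Document-level model: predict a document from the documents
        # that surround it in the stream.
        ctx = self.doc_emb(context_docs).mean(dim=1)
        return F.cross_entropy(self.doc_out(ctx), target_doc)

# Toy usage: a batch of two training examples at each level.
model = HierarchicalDocWordModel(vocab_size=1000, num_docs=500, dim=64)
doc_id = torch.tensor([3, 7])
context_words = torch.tensor([[10, 11, 12, 13], [20, 21, 22, 23]])
target_word = torch.tensor([42, 99])
context_docs = torch.tensor([[1, 2, 4, 5], [5, 6, 8, 9]])
target_doc = torch.tensor([3, 7])

loss = model.word_loss(doc_id, context_words, target_word) + \
       model.doc_loss(context_docs, target_doc)
loss.backward()  # gradients update both word and document embeddings jointly
```

The extension mentioned in the abstract would add a third, user-level table above the document level in the same fashion, conditioning document prediction on a user vector to capture individual preferences.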

Authors (5)
  1. Nemanja Djuric (24 papers)
  2. Hao Wu (623 papers)
  3. Vladan Radosavljevic (14 papers)
  4. Mihajlo Grbovic (13 papers)
  5. Narayan Bhamidipati (8 papers)
Citations (76)