Diverse Embedding Neural Network Language Models
Published 22 Dec 2014 in cs.CL, cs.LG, and cs.NE (arXiv:1412.7063v5)
Abstract: We propose Diverse Embedding Neural Network (DENN), a novel architecture for LMs. A DENNLM projects the input word history vector onto multiple diverse low-dimensional sub-spaces instead of a single higher-dimensional sub-space as in conventional feed-forward neural network LMs. We encourage these sub-spaces to be diverse during network training through an augmented loss function. Our language modeling experiments on the Penn Treebank data set show the performance benefit of using a DENNLM.
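To make the idea concrete, here is a minimal PyTorch sketch of the architecture the abstract describes: the word-history vector is projected onto several low-dimensional sub-spaces rather than one larger hidden layer, and an augmented loss discourages the sub-spaces from becoming redundant. The class and function names, the tanh activations, the layer sizes, and the pairwise cosine-similarity penalty are all illustrative assumptions; the paper's exact diversity term and hyperparameters are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DENNLMSketch(nn.Module):
    """Sketch of a Diverse Embedding NN LM (details assumed, not from the paper)."""

    def __init__(self, vocab_size, context_size=3, embed_dim=100,
                 num_subspaces=4, subspace_dim=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        history_dim = context_size * embed_dim
        # One small projection per sub-space instead of a single wide hidden layer.
        self.projections = nn.ModuleList(
            [nn.Linear(history_dim, subspace_dim) for _ in range(num_subspaces)]
        )
        self.out = nn.Linear(num_subspaces * subspace_dim, vocab_size)

    def forward(self, history_ids):
        # history_ids: (batch, context_size) indices of the preceding words.
        h = self.embed(history_ids).flatten(start_dim=1)
        subspaces = [torch.tanh(p(h)) for p in self.projections]
        logits = self.out(torch.cat(subspaces, dim=1))
        return logits, subspaces


def diversity_penalty(subspaces):
    """Illustrative augmented-loss term: penalise pairwise cosine similarity
    between sub-space activations (an assumption, not the paper's formulation)."""
    penalty = 0.0
    for i in range(len(subspaces)):
        for j in range(i + 1, len(subspaces)):
            penalty = penalty + F.cosine_similarity(
                subspaces[i], subspaces[j], dim=1).abs().mean()
    return penalty


# Usage: standard next-word cross-entropy plus a weighted diversity term.
model = DENNLMSketch(vocab_size=10000)
history = torch.randint(0, 10000, (8, 3))
target = torch.randint(0, 10000, (8,))
logits, subs = model(history)
loss = F.cross_entropy(logits, target) + 0.1 * diversity_penalty(subs)
loss.backward()
```

The weighting of the diversity term (0.1 above) is a placeholder; in practice it would be tuned, e.g. by validation perplexity on Penn Treebank as in the paper's experiments.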