Neural or Statistical: An Empirical Study on Language Models for Chinese Input Recommendation on Mobile (1907.05340v1)

Published 9 Jul 2019 in cs.CL

Abstract: Chinese input recommendation plays an important role in reducing the human effort of typing Chinese words, especially in mobile applications. The fundamental problem is to predict the conditional probability of the next word given the sequence of previous words. Statistical language models, i.e., n-gram based models, have therefore been used extensively for this task in real applications. However, extremely varied typing behavior usually leads to a serious sparsity problem, and even n-gram models with smoothing fail. A reasonable approach to this problem is to use recently proposed neural models, such as the neural probabilistic language model, recurrent neural networks, and word2vec, which can leverage semantically similar words when estimating the probability. However, there is no conclusion on which of the two approaches works better in real applications. In this paper, we conduct an extensive empirical study of the differences between statistical and neural language models. The experimental results show that the two approaches have complementary advantages, and that a hybrid approach brings a significant improvement.
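
For context on the statistical baseline the abstract refers to, here is a minimal sketch of an n-gram next-word probability estimate with smoothing. It is not the paper's implementation: the bigram order, add-one (Laplace) smoothing, and the toy corpus are illustrative assumptions.

```python
from collections import Counter

def train_bigram(corpus):
    """Count unigrams and bigrams over a list of tokenized sentences."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def next_word_prob(unigrams, bigrams, prev, word):
    """P(word | prev) with add-one smoothing over the vocabulary,
    so unseen bigrams still receive a nonzero probability."""
    vocab_size = len(unigrams)
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

# Illustrative usage with a toy corpus (not from the paper):
corpus = [["we", "type", "chinese", "words"],
          ["we", "type", "neural", "models"]]
uni, bi = train_bigram(corpus)
print(next_word_prob(uni, bi, "we", "type"))
```

The sparsity problem discussed in the abstract shows up here directly: any bigram absent from the counts falls back to the smoothed floor, which is where neural models that generalize over semantically similar words can do better.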
