A New Hybrid-parameter Recurrent Neural Networks for Online Handwritten Chinese Character Recognition (1711.02809v2)

Published 8 Nov 2017 in cs.CV

Abstract: The recurrent neural network (RNN) is well suited to modeling temporal sequences. In this paper, we present a deep RNN with new features and apply it to online handwritten Chinese character recognition. Compared with existing RNN models, the proposed system involves three innovations. First, a new hidden layer function, which we call the Memory Pool Unit (MPU), is proposed to better learn temporal information; the proposed MPU has a simple architecture. Second, a new RNN architecture with hybrid parameters is presented to increase the expressive capacity of the RNN: the hybrid-parameter RNN changes its parameters as the iteration proceeds along the temporal dimension. Third, we adapt the network so that the outputs of all layers are stacked to form the network output; the stacked hidden layer states combine all hidden layer states to further increase expressive capacity. Experiments are carried out on the IAHCC-UCAS2016 and CASIA-OLHWDB1.1 datasets. The results show that the hybrid-parameter RNN obtains better recognition performance with higher efficiency (fewer parameters and faster speed), and that the proposed Memory Pool Unit is a simple hidden layer function that achieves competitive recognition results.
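
The abstract describes two architectural ideas without giving their equations: recurrent parameters that vary along the temporal dimension ("hybrid parameters") and stacking the hidden states of all layers as the network output. The sketch below is a minimal, illustrative interpretation of those two ideas only; it is not the authors' implementation. The MPU update is not specified in the abstract, so a plain tanh recurrence stands in for it, and the per-group weight scheme, layer sizes, and function names are assumptions.

```python
# Illustrative sketch (not the paper's exact formulation):
# (1) each recurrent layer splits its time steps into groups, and each group
#     uses its own weights ("hybrid parameters" along the temporal dimension);
# (2) the hidden states of every layer are stacked to form the network output.
# A plain tanh update stands in for the Memory Pool Unit, whose equations are
# not given in the abstract.
import numpy as np

def make_layer_params(rng, num_groups, input_size, hidden_size):
    # One (W, U, b) set per temporal group instead of a single shared set.
    return [
        (rng.standard_normal((hidden_size, input_size)) * 0.1,
         rng.standard_normal((hidden_size, hidden_size)) * 0.1,
         np.zeros(hidden_size))
        for _ in range(num_groups)
    ]

def run_layer(x, params):
    """x: (T, input_size). Time steps are split evenly across the parameter groups."""
    T = x.shape[0]
    hidden_size = params[0][2].shape[0]
    num_groups = len(params)
    h = np.zeros(hidden_size)
    states = []
    for t in range(T):
        W, U, b = params[min(t * num_groups // T, num_groups - 1)]
        h = np.tanh(W @ x[t] + U @ h + b)   # stand-in for the MPU update
        states.append(h)
    return np.stack(states)                 # (T, hidden_size)

def hybrid_param_rnn(x, hidden_sizes=(64, 64), num_groups=4, seed=0):
    rng = np.random.default_rng(seed)
    layer_outputs = []
    inp = x
    for hidden_size in hidden_sizes:
        params = make_layer_params(rng, num_groups, inp.shape[1], hidden_size)
        inp = run_layer(inp, params)
        layer_outputs.append(inp)
    # "Stacked hidden layer states": concatenate every layer's states as the output.
    return np.concatenate(layer_outputs, axis=1)  # (T, sum(hidden_sizes))

if __name__ == "__main__":
    seq = np.random.default_rng(1).standard_normal((20, 8))  # toy pen-trajectory features
    print(hybrid_param_rnn(seq).shape)  # (20, 128)
```

Sharing one weight set per group of time steps, rather than one set per step, is one plausible reading of "parameter changes when calculating the iteration at the temporal dimension" that keeps the parameter count small; the paper may partition parameters differently.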

Citations (3)