Hybrid Neural Models For Sequence Modelling: The Best Of Three Worlds (1909.07102v1)
Published 16 Sep 2019 in cs.LG and stat.ML
Abstract: We propose a neural architecture that combines the main characteristics of the most successful neural models of recent years: bidirectional RNNs, the encoder-decoder structure, and the Transformer model. Evaluation on three sequence labelling tasks yields results close to the state of the art on all tasks, and better than it on some of them, showing the pertinence of this hybrid architecture for this kind of task.
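The abstract names the three ingredients (bidirectional RNNs, encoder-decoder, Transformer-style attention) but gives no implementation details. A minimal sketch of one plausible composition for sequence labelling is shown below, assuming a BiLSTM encoder whose states are refined by a Transformer self-attention layer before a token-level classifier; all layer names, sizes, and the exact wiring are illustrative assumptions, not the paper's model.

```python
# Hypothetical sketch of a hybrid tagger: BiLSTM contextualization
# followed by Transformer-style self-attention. Sizes and composition
# are assumptions for illustration, not the paper's architecture.
import torch
import torch.nn as nn

class HybridTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional RNN: contextualizes each token from both directions.
        self.birnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                             bidirectional=True)
        # Transformer-style self-attention over the BiLSTM states.
        self.attn_layer = nn.TransformerEncoderLayer(
            d_model=2 * hidden_dim, nhead=4, batch_first=True)
        # Token-level classification head for sequence labelling.
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)       # (batch, seq, emb_dim)
        x, _ = self.birnn(x)            # (batch, seq, 2 * hidden_dim)
        x = self.attn_layer(x)          # self-attention refinement
        return self.classifier(x)       # (batch, seq, num_tags)

# Usage: tag a batch of two 5-token sequences.
model = HybridTagger(vocab_size=10000, num_tags=9)
tokens = torch.randint(0, 10000, (2, 5))
logits = model(tokens)                  # shape: (2, 5, 9)
```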