
Read, Tag, and Parse All at Once, or Fully-neural Dependency Parsing (1609.03441v2)

Published 12 Sep 2016 in cs.CL

Abstract: We present a dependency parser implemented as a single deep neural network that reads orthographic representations of words and directly generates dependencies and their labels. Unlike typical approaches to parsing, the model does not require part-of-speech (POS) tagging of the sentences. With proper regularization and additional supervision achieved with multitask learning, we reach state-of-the-art performance on Slavic languages from the Universal Dependencies treebank: with no linguistic features other than characters, our parser is as accurate as a transition-based system trained on perfect POS tags.
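To illustrate the abstract's core idea, that each word is represented from its characters alone, with no POS tags, and head-dependent arcs are then scored directly, here is a deliberately toy sketch. The fixed hash-style `char_encode` and dot-product `arc_score` are hypothetical stand-ins for the paper's learned deep network; they show only the data flow, not the actual model.

```python
# Toy sketch of tagging-free, character-only dependency parsing.
# The encoder and scorer below are hypothetical placeholders for
# the learned neural components described in the paper.

def char_encode(word, dim=8):
    """Map a word to a vector using only its characters (no POS tags)."""
    vec = [0.0] * dim
    for i, ch in enumerate(word):
        vec[(ord(ch) + i) % dim] += 1.0
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]

def arc_score(head_vec, dep_vec):
    """Score a candidate head -> dependent arc (dot product here)."""
    return sum(h * d for h, d in zip(head_vec, dep_vec))

def greedy_heads(words):
    """For each word, pick the highest-scoring head; index 0 is ROOT."""
    vecs = [char_encode("<ROOT>")] + [char_encode(w) for w in words]
    heads = []
    for d in range(1, len(vecs)):
        best = max((h for h in range(len(vecs)) if h != d),
                   key=lambda h: arc_score(vecs[h], vecs[d]))
        heads.append(best)
    return heads

# One head index per word; 0 denotes the artificial ROOT node.
print(greedy_heads(["kot", "pije", "mleko"]))
```

A real system would replace the hand-rolled encoder with trained character-level recurrent layers and decode arcs jointly rather than greedily, but the pipeline shape is the same: characters in, dependencies out.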

Authors (3)
  1. Jan Chorowski (29 papers)
  2. Paweł Rychlikowski (9 papers)
  3. Michał Zapotoczny (2 papers)
Citations (5)
