
Enhanced Universal Dependency Parsing with Automated Concatenation of Embeddings (2107.02416v1)

Published 6 Jul 2021 in cs.CL and cs.LG

Abstract: This paper describes the system used in the submission from the SHANGHAITECH team to the IWPT 2021 Shared Task. Our system is a graph-based parser with the technique of Automated Concatenation of Embeddings (ACE). Because recent work found that better word representations can be obtained by concatenating different types of embeddings, we use ACE to automatically find a better concatenation of embeddings for the task of enhanced universal dependencies. According to official results averaged over 17 languages, our system ranks 2nd out of 9 teams.
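The core idea behind ACE is to build each word representation by concatenating a selected subset of several candidate embedding types (e.g. word, character, and contextual embeddings), where the subset is chosen automatically. The sketch below illustrates only the concatenation-with-selection step under stated assumptions; it is not the authors' implementation, and the names (embed_word, embed_char, embed_bert, mask, concat_selected_embeddings) are hypothetical.

```python
# Illustrative sketch only: concatenating several embedding types per token
# and selecting a subset with a binary mask, in the spirit of ACE.
import torch

def concat_selected_embeddings(embeddings, mask):
    """Concatenate the embeddings whose mask entry is 1 along the feature axis.

    embeddings: list of tensors, each of shape (batch, seq_len, dim_i)
    mask: list of 0/1 ints, one per embedding type (e.g. chosen by a controller)
    """
    selected = [e for e, m in zip(embeddings, mask) if m == 1]
    return torch.cat(selected, dim=-1)

# Toy usage with random tensors standing in for word, character, and
# contextual (e.g. BERT) embeddings of a 5-token sentence.
batch, seq_len = 2, 5
embed_word = torch.randn(batch, seq_len, 100)
embed_char = torch.randn(batch, seq_len, 50)
embed_bert = torch.randn(batch, seq_len, 768)

mask = [1, 0, 1]  # one candidate concatenation: word + contextual embeddings
word_repr = concat_selected_embeddings([embed_word, embed_char, embed_bert], mask)
print(word_repr.shape)  # torch.Size([2, 5, 868])
```

In ACE, the selection mask is not fixed by hand but searched over automatically (the paper's cited technique trains a controller to propose concatenations and rewards those that improve task performance); the concatenated representation then feeds the graph-based parser.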

Authors (4)
  1. Xinyu Wang (186 papers)
  2. Zixia Jia (15 papers)
  3. Yong Jiang (195 papers)
  4. Kewei Tu (75 papers)
Citations (5)
