
A Re-ranking Model for Dependency Parser with Recursive Convolutional Neural Network (1505.05667v1)

Published 21 May 2015 in cs.CL, cs.LG, and cs.NE

Abstract: In this work, we address the problem of modeling all the nodes (words or phrases) in a dependency tree with dense representations. We propose a recursive convolutional neural network (RCNN) architecture to capture syntactic and compositional-semantic representations of phrases and words in a dependency tree. Unlike the original recursive neural network, we introduce convolution and pooling layers, which can model a variety of compositions via feature maps and select the most informative compositions via pooling. Based on the RCNN, we use a discriminative model to re-rank a $k$-best list of candidate dependency parse trees. Experiments show that the RCNN is very effective at improving state-of-the-art dependency parsing on both English and Chinese datasets.
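The mechanism the abstract describes can be illustrated with a minimal, hypothetical sketch (this is not the authors' implementation; all parameter names, shapes, and the toy trees below are illustrative stand-ins, and the weights are random rather than discriminatively trained as in the paper): a composition step convolves a head's vector with each child's vector to produce feature maps, max-pooling keeps the most informative composition per feature, and a scoring vector ranks the candidate trees in a k-best list.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding size (hypothetical)

# Hypothetical parameters: a convolution filter over (head, child) pairs
# and a scoring vector. In the paper these are learned discriminatively;
# here they are random stand-ins.
W = rng.normal(scale=0.1, size=(D, 2 * D))
v = rng.normal(scale=0.1, size=D)

def compose(head_vec, child_vecs):
    """Convolve the head with each child, then max-pool over children."""
    maps = [np.tanh(W @ np.concatenate([head_vec, c])) for c in child_vecs]
    return np.max(maps, axis=0)  # keep the most informative composition per feature

def tree_score(tree, embed):
    """Score a candidate tree via its root representation.

    `tree` maps a head word to its list of dependents; leaves fall back
    to their word embeddings. The paper's scoring is richer; this only
    sketches the recursive composition + scoring idea.
    """
    def rep(node):
        kids = tree.get(node, [])
        if not kids:
            return embed[node]
        return compose(embed[node], [rep(k) for k in kids])
    root = next(iter(tree))  # assume the root is listed first
    return float(v @ rep(root))

# Toy re-ranking of a 2-best list of candidate trees for "dogs chase cats"
embed = {w: rng.normal(size=D) for w in ["chase", "dogs", "cats"]}
kbest = [
    {"chase": ["dogs", "cats"]},           # both arguments attach to the verb
    {"chase": ["dogs"], "dogs": ["cats"]}, # wrong attachment of "cats"
]
best = max(kbest, key=lambda t: tree_score(t, embed))
```

With trained parameters, `best` would be the candidate whose compositions the model scores highest; with the random weights above it simply demonstrates the data flow.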

Authors (4)
  1. Chenxi Zhu (9 papers)
  2. Xipeng Qiu (257 papers)
  3. Xinchi Chen (15 papers)
  4. Xuanjing Huang (287 papers)
Citations (47)
