Transition-Based Dependency Parsing
12 papers with code • 0 benchmarks • 0 datasets
Most implemented papers
Transition-Based Dependency Parsing with Stack Long Short-Term Memory
We propose a technique for learning representations of parser states in transition-based dependency parsers.
Efficient Structured Inference for Transition-Based Parsing with Neural Networks and Error States
Transition-based approaches based on local classification are attractive for dependency parsing due to their simplicity and speed, despite producing results slightly below the state-of-the-art.
A Novel Neural Network Model for Joint POS Tagging and Graph-based Dependency Parsing
We present a novel neural network model that learns POS tagging and graph-based dependency parsing jointly.
Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set
We first present a minimal feature set for transition-based dependency parsing, continuing a recent trend started by Kiperwasser and Goldberg (2016a) and Cross and Huang (2016a) of using bi-directional LSTM features.
Distilling Knowledge for Search-based Structured Prediction
Many natural language processing tasks can be modeled as structured prediction and solved as a search problem.
Joint Learning of POS and Dependencies for Multilingual Universal Dependency Parsing
This paper describes the system of team LeisureX in the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies.
Tree-Stack LSTM in Transition Based Dependency Parsing
We introduce tree-stack LSTM to model the state of a transition-based parser with recurrent neural networks.
Bidirectional Transition-Based Dependency Parsing
Traditionally, a transition-based dependency parser processes an input sentence and predicts a sequence of parsing actions in a left-to-right manner.
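The action-sequence view described above can be sketched with the classic arc-standard transition system, which the papers in this list build on. This is a minimal illustrative example, not code from any of the papers; the toy sentence and the hand-written oracle action sequence are assumptions for demonstration.

```python
# Minimal sketch of the arc-standard transition system for
# transition-based dependency parsing (illustrative only).

def parse(words, actions):
    """Apply a sequence of arc-standard actions to a sentence.

    Returns {dependent_index: head_index}; index 0 is the ROOT token,
    word indices are 1-based.
    """
    stack = [0]                               # ROOT starts on the stack
    buffer = list(range(1, len(words) + 1))   # remaining input words
    heads = {}
    for act in actions:
        if act == "SHIFT":        # move the next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif act == "LEFT-ARC":   # second-from-top becomes dependent of top
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        elif act == "RIGHT-ARC":  # top becomes dependent of second-from-top
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads

# "She reads books": She <- reads -> books, with reads attached to ROOT
words = ["She", "reads", "books"]
actions = ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC", "RIGHT-ARC"]
print(parse(words, actions))  # {1: 2, 3: 2, 2: 0}
```

A learned parser replaces the hand-written action list with a classifier (or, as in the stack LSTM and tree-stack LSTM papers above, a recurrent representation of the stack and buffer) that predicts the next action at each step.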