Dependency Parsing as Head Selection
Xingxing Zhang, Jianpeng Cheng, Mirella Lapata
Institute for Language, Cognition and Computation
University of Edinburgh
[email protected]
April 6, 2017
Zhang et al. (Univ. of Edinburgh) DeNSe: Dependency Neural Selection April 6, 2017 1 / 18
Dependency Parsing
Dependency Parsing is the task of transforming a sentence S = (root, w1, w2, ..., wN) into a directed tree originating out of root.
Parsing Algorithms
Transition-based Parsing
Graph-based Parsing
Our parser is neither Transition-based nor Graph-based (during training)
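Because the output tree assigns each word exactly one incoming arc, it can be stored as a simple head array. A minimal sketch of this representation, using the illustrative sentence "kids love candy":

```python
# A dependency tree as a head array: heads[i] is the index of the head
# of word i+1, with index 0 denoting the artificial root token.
sentence = ["root", "kids", "love", "candy"]

# "love" is the root of the tree; "kids" and "candy" both attach to "love".
heads = [2, 0, 2]  # heads of kids, love, candy

# Recover the (head, dependent) arcs from the array.
arcs = [(sentence[h], sentence[i + 1]) for i, h in enumerate(heads)]
print(arcs)  # [('love', 'kids'), ('root', 'love'), ('love', 'candy')]
```

The directed-tree constraint from the definition above corresponds to this array containing exactly one 0 entry and no cycles among the head pointers.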
Transition-based Parsing
Data Structure
Buffer, Stack, Arc Set
Parsing:
Choose an action from:
SHIFT
REDUCE-Left
REDUCE-Right
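The buffer/stack/arc-set machinery can be sketched as follows. This is a toy arc-standard parser driven by a hand-written gold action sequence; a real transition-based parser would choose each action with a classifier, and the example sentence and indices are purely illustrative:

```python
# Minimal arc-standard transition parser: a buffer of remaining words,
# a stack of partially processed words, and a growing set of arcs.
def parse(words, actions):
    buffer = list(range(1, len(words) + 1))  # word indices, 1-based
    stack = [0]                              # index 0 is the artificial root
    arcs = set()                             # collected (head, dependent) arcs
    for action in actions:
        if action == "SHIFT":                # move next buffer word onto stack
            stack.append(buffer.pop(0))
        elif action == "REDUCE-Left":        # top heads second-top
            dep = stack.pop(-2)
            arcs.add((stack[-1], dep))
        elif action == "REDUCE-Right":       # second-top heads top
            dep = stack.pop()
            arcs.add((stack[-1], dep))
    return arcs

# "kids love candy": love heads kids and candy; root heads love.
words = ["kids", "love", "candy"]
gold = ["SHIFT", "SHIFT", "REDUCE-Left", "SHIFT", "REDUCE-Right", "REDUCE-Right"]
print(parse(words, gold))  # {(2, 1), (2, 3), (0, 2)}
```

Parsing finishes when the buffer is empty and only root remains on the stack, at which point the arc set is the dependency tree.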
Graph-based Parsing
A Sentence → A Directed Complete Graph
(Graphs from Kübler et al., 2009)
Parsing: Finding Maximum Spanning Tree
Chu-Liu-Edmonds algorithm (Chu and Liu, 1965)
Eisner algorithm (Eisner, 1996)
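The greedy first step of Chu-Liu-Edmonds, plus the cycle check that decides whether more work is needed, can be sketched as below. The score matrix is an invented toy example, and the cycle-contraction step of the full algorithm is omitted:

```python
# Greedy first step of the Chu-Liu-Edmonds MST algorithm: give every word
# its highest-scoring head, then check for cycles. (The full algorithm
# contracts each cycle into a single node and repeats; omitted here.)
def greedy_heads(score):
    """score[h][d] is the score of arc h -> d; node 0 is the root."""
    n = len(score)
    return [max((h for h in range(n) if h != d), key=lambda h: score[h][d])
            for d in range(1, n)]  # head of word d, for d = 1..n-1

def find_cycle(heads):
    """Return a node lying on a cycle, or None if the graph is a tree."""
    n = len(heads) + 1
    for start in range(1, n):
        seen, v = set(), start
        while v != 0:            # follow head pointers up toward the root
            if v in seen:
                return v         # revisited a node: v lies on a cycle
            seen.add(v)
            v = heads[v - 1]
    return None

# Toy scores for root, "kids", "love", "candy": arcs out of "love" dominate.
score = [[0, 1, 9, 1],
         [0, 0, 3, 2],
         [0, 8, 0, 8],
         [0, 2, 4, 0]]
heads = greedy_heads(score)
print(heads)              # [2, 0, 2]: love -> kids, root -> love, love -> candy
print(find_cycle(heads))  # None: the greedy choice is already a tree
```

When the greedy graph does contain a cycle, contracting it and recursing is what guarantees a maximum spanning tree rather than just a maximum set of arcs.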
Recent Advances
Mostly replacing discrete features with Neural Network features.
Transition-based Parsers
Feed-Forward NN features (Chen and Manning, 2014)
Bi-LSTM features (Kiperwasser and Goldberg, 2016)
Stack LSTM: Buffer, Stack and Action Sequences modeled by Stack-LSTMs (Dyer et al., 2015)
Graph-based Parsers
Tensor Decomposition features (Lei et al., 2014)
Feed-Forward NN features (Pei et al., 2015)
Bi-LSTM features (Kiperwasser and Goldberg, 2016)
Do we need a transition system or graph algorithm?
root kids love candy
An important fact: Every word has only one head!
Why not just learn to select the head?
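Learning to select the head turns parsing into N independent classification problems: word wi picks its head from {root, w1, ..., wN} \ {wi}. A minimal sketch of that decision, with a toy score matrix standing in for DeNSe's neural scoring function: each word takes a softmax over all candidate heads and keeps the argmax.

```python
import math

# Head selection: every word independently picks its most probable head.
# The score matrix is a toy stand-in for a learned neural scorer.
def select_heads(score):
    """score[h][d]: score for node h heading node d; node 0 is root."""
    n = len(score)
    heads = []
    for d in range(1, n):
        cands = [h for h in range(n) if h != d]        # every node but d itself
        z = sum(math.exp(score[h][d]) for h in cands)  # softmax normaliser
        heads.append(max(cands, key=lambda h: math.exp(score[h][d]) / z))
    return heads

# Toy scores for root, "kids", "love", "candy".
score = [[0, 1, 9, 1],
         [0, 0, 3, 2],
         [0, 8, 0, 8],
         [0, 2, 4, 0]]
print(select_heads(score))  # [2, 0, 2]: each word selected exactly one head
```

Note that N independent argmax choices are not guaranteed to form a tree, which is why the earlier slide qualifies the claim with "(during training)": no transition system or graph algorithm is needed while learning, even if decoding may still have to enforce treeness.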