Transcript
1
Quasi-Synchronous Grammars
Based on two key observations in MT:
translated sentences often have some isomorphic syntactic structure, but not usually in its entirety;
the strictness of the isomorphism may vary across words or syntactic rules.
Key idea: unlike some synchronous grammars (e.g., SCFG), which are stricter and more rigid, QG defines a monolingual grammar for the target tree, "inspired" by the source tree.
2
Quasi-Synchronous Grammars
In other words, we model the generation of the target tree, influenced by the source tree (and their alignment).
QA can be thought of as extremely free monolingual translation.
The linkage between question and answer trees in QA is looser than in MT, which gives QG a bigger edge.
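This idea can be sketched formally (a rough parameterization for illustration, not necessarily the paper's exact one): the target tree t and a hidden alignment a are generated by a monolingual dependency grammar whose local probabilities are conditioned on the aligned source-tree nodes:

```latex
P(t, a \mid s) \;=\; \prod_{(p \to c) \in t} P\bigl(c,\, a(c) \bigm| p,\, a(p),\, s\bigr)
```

Each target dependency edge (parent p, child c) is scored together with the source nodes a(p) and a(c) that its words align to, so the source tree s influences, but does not rigidly constrain, the target derivation.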
3
Model
Works on labeled dependency parse trees.
Learns the hidden structure (the alignment between the Q and A trees) by summing over ALL possible alignments.
One particular alignment tells us both the syntactic configurations and the word-to-word semantic correspondences.
An example…
[Figure: labeled dependency parse trees for the question "Who is the leader of France?" and the answer "Bush met French president Jacques Chirac," with an example alignment between them. Nodes carry POS tags (WP, VB, DT, NN, NNP, VBD, JJ) and named-entity labels (qword, person, location); edges carry dependency labels (root, subj, obj, det, of, with, nmod). The alignment contributes factors such as P(root | root), P(noNE | noNE), and P(VB | VBD).]
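To make "summing over ALL possible alignments" concrete, here is a toy brute-force version (the association scores are hypothetical, not the paper's learned model): every answer word aligns either to a question word or to NULL, and we sum the product of per-word scores over every possible alignment.

```python
from itertools import product

def alignment_score(q_word, a_word):
    # Hypothetical lexical association; a real model would learn this.
    return 2.0 if q_word.lower() == a_word.lower() else 0.1

def total_score(question, answer):
    # Each answer word may align to NULL (None) or to any question word.
    options = [None] + question
    total = 0.0
    for alignment in product(options, repeat=len(answer)):
        score = 1.0
        for a_word, q_word in zip(answer, alignment):
            score *= 0.05 if q_word is None else alignment_score(q_word, a_word)
        total += score
    return total

q = ["who", "is", "the", "leader", "of", "France"]
a = ["France", "president", "Jacques", "Chirac"]
print(total_score(q, a))  # ≈ 0.7003
```

Because each answer word's choice is independent in this toy scorer, the sum happens to factorize into per-word sums; the real model couples alignments through the tree structure, which is where dynamic programming comes in.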
Our model makes local Markov assumptions to allow efficient computation via dynamic programming (details in the paper): given its parent, a word is independent of all other words (including its siblings).
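A minimal sketch of that dynamic program (the tree and the local factor `toy_local` are hypothetical, not the paper's parameterization): because a word's alignment depends only on its parent's alignment, each node's subtree can be summed out independently, filling a table of (node, alignment) values bottom-up instead of enumerating all |options|^n full alignments.

```python
def inside(node, children, options, local):
    """Return {a: summed score of node's subtree, given node aligns to a}."""
    child_tables = [(c, inside(c, children, options, local))
                    for c in children.get(node, [])]
    table = {}
    for a in options:
        score = 1.0
        for child, ctable in child_tables:
            # Sum out the child's alignment, conditioned only on a (Markov).
            score *= sum(local(child, ca, a) * v for ca, v in ctable.items())
        table[a] = score
    return table

# Toy dependency tree for the answer "Bush met French president Jacques Chirac".
children = {
    "met": ["Bush", "Chirac"],
    "Chirac": ["president"],
    "president": ["French"],
}

# Hypothetical local factor: aligning a word to NULL (None) is penalized.
def toy_local(word, align, parent_align):
    return 0.5 if align is None else 1.0

options = [None, "who", "leader", "France"]  # question words + NULL
root_table = inside("met", children, options, toy_local)
total = sum(toy_local("met", a, None) * v for a, v in root_table.items())
print(total)  # → 525.21875 (= 3.5 ** 5: five words, each summing to 3.5)
```

Each (node, alignment) pair is computed exactly once, so with n answer words and m alignment options the cost is O(n·m²), versus mⁿ for brute-force enumeration.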