Page 1

An Extended Model of Natural Logic

Bill MacCartney and Christopher D. Manning
NLP Group
Stanford University
8 January 2009

Page 2

Natural language inference (NLI)

• Aka recognizing textual entailment (RTE)

• Does premise P justify an inference to hypothesis H?
• An informal, intuitive notion of inference: not strict logic

• Emphasis on variability of linguistic expression

• Necessary to the goal of natural language understanding (NLU)

• Can also enable semantic search, question answering, …

P  Every firm polled saw costs grow more than expected, even after adjusting for inflation.

H  Every big company in the poll reported cost increases.   →  yes

(With Some in place of Every, the answer becomes no.)

Introduction • Semantic Relations • Joins • Lexical Relations • Projectivity • Implicatives • Inference • Evaluation • Conclusion

Page 3

NLI: a spectrum of approaches

From robust but shallow, to deep but brittle:

• lexical/semantic overlap (Jijkoun & de Rijke 2005)
• patterned relation extraction (Romano et al. 2006)
• semantic graph matching (MacCartney et al. 2006; Hickl et al. 2006)
• FOL & theorem proving (Bos & Markert 2006)

Problem with the shallow approaches: imprecise; easily confounded by negation, quantifiers, conditionals, factive & implicative verbs, etc.

Problem with FOL: hard to translate NL to FOL: idioms, anaphora, ellipsis, intensionality, tense, aspect, vagueness, modals, indexicals, reciprocals, propositional attitudes, scope ambiguities, anaphoric adjectives, non-intersective adjectives, temporal & causal relations, unselective quantifiers, adverbs of quantification, donkey sentences, generic determiners, comparatives, phrasal verbs, …

Solution? Natural logic (this work).


Page 4

What is natural logic? (≠ natural deduction)

• Characterizes valid patterns of inference via surface forms
  • precise, yet sidesteps difficulties of translating to FOL

• A long history
  • traditional logic: Aristotle's syllogisms, scholastics, Leibniz, …
  • modern natural logic begins with Lakoff (1970)
  • van Benthem & Sánchez Valencia (1986-91): monotonicity calculus
  • Nairn et al. (2006): an account of implicatives & factives

• We introduce a new theory of natural logic
  • extends monotonicity calculus to account for negation & exclusion
  • incorporates elements of Nairn et al.'s model of implicatives


Page 5

16 elementary set relations

Assign sets x, y to one of 16 relations, depending on the emptiness or non-emptiness of each of the four partitions of the universe: x ∩ y, x − y, y − x, and the complement of x ∪ y.



Page 6

16 elementary set relations

Among the 16 are the relations labeled x ≡ y, x ⊏ y, x ⊐ y, x ^ y, x | y, x ‿ y, and x # y.

But 9 of 16 are degenerate: either x or y is either empty or universal.

I.e., they correspond to semantically vacuous expressions, which are rare outside logic textbooks.

We therefore focus on the remaining seven relations.


Page 7

The set of 7 basic semantic relations

symbol   name                                     example
x ≡ y    equivalence                              couch ≡ sofa
x ⊏ y    forward entailment (strict)              crow ⊏ bird
x ⊐ y    reverse entailment (strict)              European ⊐ French
x ^ y    negation (exhaustive exclusion)          human ^ nonhuman
x | y    alternation (non-exhaustive exclusion)   cat | dog
x ‿ y    cover (exhaustive non-exclusion)         animal ‿ nonhuman
x # y    independence                             hungry # hippo

Relations are defined for all semantic types: tiny ⊏ small, hover ⊏ fly, kick ⊏ strike, this morning ⊏ today, in Beijing ⊏ in China, everyone ⊏ someone, all ⊏ most ⊏ some.
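To make the set-theoretic definitions concrete, here is a minimal Python sketch (my own illustration, not the authors' code) that classifies two sets into one of the seven relations, assuming both are non-empty proper subsets of a finite universe; the toy sets in the demo are hypothetical.

```python
def basic_relation(x, y, universe):
    # Assumes x and y are non-empty proper subsets of the universe
    # (the degenerate cases are exactly the 9 relations set aside above).
    x, y, u = set(x), set(y), set(universe)
    if x == y:
        return '≡'                      # equivalence
    if x < y:
        return '⊏'                      # forward entailment (proper subset)
    if x > y:
        return '⊐'                      # reverse entailment (proper superset)
    disjoint = not (x & y)
    exhaustive = (x | y) == u
    if disjoint and exhaustive:
        return '^'                      # negation: exhaustive exclusion
    if disjoint:
        return '|'                      # alternation: non-exhaustive exclusion
    if exhaustive:
        return '‿'                      # cover: exhaustive non-exclusion
    return '#'                          # independence

u = {'alice', 'fido', 'felix', 'rock'}          # toy universe
human, nonhuman = {'alice'}, {'fido', 'felix', 'rock'}
animal = {'alice', 'fido', 'felix'}
print(basic_relation(human, nonhuman, u))       # ^   human ^ nonhuman
print(basic_relation(animal, nonhuman, u))      # ‿   animal ‿ nonhuman
print(basic_relation({'felix'}, animal, u))     # ⊏   cat ⊏ animal
```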


Page 8

Joining semantic relations

If x R y and y S z, what semantic relation holds between x and z? It is given by the join of the two relations, written R ⋈ S.

Example: fish | human and human ^ nonhuman; what holds between fish and nonhuman? (fish ⊏ nonhuman, since | ⋈ ^ = ⊏)

Some joins are straightforward:
⊏ ⋈ ⊏ = ⊏
⊐ ⋈ ⊐ = ⊐
^ ⋈ ^ = ≡
≡ ⋈ R = R
R ⋈ ≡ = R


Page 9

Some joins yield unions of relations!

x | y             y | z             x ? z
couch | table     table | sofa      couch ≡ sofa
pistol | knife    knife | gun       pistol ⊏ gun
dog | cat         cat | terrier     dog ⊐ terrier
rose | orchid     orchid | daisy    rose | daisy
woman | frog      frog | Eskimo     woman # Eskimo

What is | ⋈ | ?   | ⋈ | = {≡, ⊏, ⊐, |, #}
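This union can be verified by brute force over a small universe; the following Python sketch (mine, not from the paper) enumerates triples of non-degenerate sets and collects every relation that can link x to z. A larger universe may be needed to populate some cells of the full table.

```python
from itertools import combinations

U = frozenset(range(5))   # a small toy universe

def nondegenerate_subsets(u):
    elems = sorted(u)
    for r in range(1, len(elems)):          # skip the empty set and the universe
        for c in combinations(elems, r):
            yield frozenset(c)

def rel(x, y, u=U):
    # classify a pair of non-degenerate sets into one of the 7 basic relations
    if x == y:  return '≡'
    if x < y:   return '⊏'
    if x > y:   return '⊐'
    disjoint, exhaustive = not (x & y), (x | y) == u
    if disjoint and exhaustive:  return '^'
    if disjoint:                 return '|'
    if exhaustive:               return '‿'
    return '#'

def join(R, S):
    subs = list(nondegenerate_subsets(U))
    return {rel(x, z) for x in subs for y in subs for z in subs
            if rel(x, y) == R and rel(y, z) == S}

print(join('|', '|'))   # expected: {'≡', '⊏', '⊐', '|', '#'}
print(join('^', '^'))   # expected: {'≡'}
```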


Page 10

The complete join table

Of the 49 possible joins, 32 yield a basic relation; 17 yield unions of relations.

Larger unions convey less information, which limits the power of inference.

In practice, any union which contains # can be approximated by #.


Page 11

Lexical semantic relations

An atomic edit e (DEL, INS, or SUB) maps a compound expression x to e(x). The semantic relation β(x, e(x)) will depend on:
1. the lexical semantic relation generated by e: β(e)
2. other properties of the context x in which e is applied

Example: suppose x is red car
• If e is SUB(car, convertible), then β(e) is ⊐
• If e is DEL(red), then β(e) is ⊏

Crucially, β(e) depends solely on the lexical items in e, independent of the context x.

But how are lexical semantic relations determined?


Page 12

Lexical semantic relations: SUBs

β(SUB(x, y)) = β(x, y)

For open-class terms, use a lexical resource (e.g. WordNet):
• ≡ for synonyms: sofa ≡ couch, forbid ≡ prohibit
• ⊏ for hypo-/hypernyms: crow ⊏ bird, frigid ⊏ cold, soar ⊏ rise
• | for antonyms and coordinate terms: hot | cold, cat | dog
• ≡ or | for proper nouns: USA ≡ United States, JFK | FDR
• # for most other pairs: hungry # hippo

Closed-class terms may require special handling
• Quantifiers: all ⊏ some, some ^ no, no | all, at least 4 ‿ at most 6
• See paper for discussion of pronouns, prepositions, …
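As a rough illustration of the WordNet lookup for SUB edits (my own sketch, not the NatLog implementation; it handles synonyms, hypernyms, and antonyms only, and skips coordinate terms and proper nouns):

```python
# Requires: pip install nltk; then nltk.download('wordnet') once.
from nltk.corpus import wordnet as wn

def sub_relation(w1, w2):
    """Guess the lexical relation for SUB(w1, w2) from WordNet."""
    syns1, syns2 = wn.synsets(w1), wn.synsets(w2)
    if not syns1 or not syns2:
        return '#'
    for s1 in syns1:
        for s2 in syns2:
            if s1 == s2:
                return '≡'                                   # shared synset: synonyms
            if s2 in s1.closure(lambda s: s.hypernyms()):
                return '⊏'                                   # w1 is a hyponym of w2
            if s1 in s2.closure(lambda s: s.hypernyms()):
                return '⊐'                                   # w1 is a hypernym of w2
    for s1 in syns1:
        for lemma in s1.lemmas():
            if any(a.synset() in syns2 for a in lemma.antonyms()):
                return '|'                                   # antonyms: alternation
    return '#'

print(sub_relation('sofa', 'couch'))   # expected: ≡
print(sub_relation('crow', 'bird'))    # expected: ⊏
print(sub_relation('hot', 'cold'))     # expected: |
```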


Page 13

Lexical semantic relations: DELs & INSs

Generic (default) case: β(DEL(·)) = ⊏, β(INS(·)) = ⊐
• Examples: red car ⊏ car, sing ⊐ sing off-key
• Even quite long phrases: car parked outside since last week ⊏ car
• Applies to intersective modifiers, conjuncts, independent clauses, …
• This heuristic underlies most approaches to RTE!
  • Does P subsume H? Deletions OK; insertions penalized.

Special cases
• Negation: didn't sleep ^ did sleep
• Implicatives & factives (e.g. refuse to, admit that): discussed later
• Non-intersective adjectives: former spy | spy, alleged spy # spy
• Auxiliaries etc.: is sleeping ≡ sleeps, did sleep ≡ slept
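The default-plus-special-cases logic can be sketched as a simple lookup; everything below (function name and toy lexicons alike) is hypothetical and only meant to mirror the bullets above.

```python
NEGATION_WORDS = {"not", "n't", "didn't", "no"}        # toy list
NON_INTERSECTIVE = {"former": '|', "alleged": '#'}     # toy list

def beta_edit(edit_type, text, sub_relation=None):
    """Lexical relation for an atomic edit; edit_type is 'DEL', 'INS', or 'SUB'."""
    if edit_type == 'SUB':
        return sub_relation                    # e.g. from a WordNet lookup
    if text in NEGATION_WORDS:
        return '^'                             # didn't sleep ^ did sleep
    if text in NON_INTERSECTIVE:
        return NON_INTERSECTIVE[text]          # former spy | spy, alleged spy # spy
    return '⊏' if edit_type == 'DEL' else '⊐'  # generic default

print(beta_edit('DEL', 'red'))       # ⊏   red car ⊏ car
print(beta_edit('INS', 'off-key'))   # ⊐   sing ⊐ sing off-key
print(beta_edit('DEL', "didn't"))    # ^
```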


Page 14

The impact of semantic composition

How are semantic relations affected by semantic composition? Given x related to y, what relation holds between f(x) and f(y)?

The monotonicity calculus provides a partial answer. If f has monotonicity UP, DOWN, or NON, then β(x, y) is projected as follows:

monotonicity   ⊏   ⊐   #
UP             ⊏   ⊐   #
DOWN           ⊐   ⊏   #
NON            #   #   #

But how are the other relations (|, ^, ‿) projected?

Page 15

A typology of projectivity

Projectivity signatures: a generalization of monotonicity classes. Each projectivity signature is a map from semantic relations to semantic relations. In principle there are 7^7 possible signatures, but few are actually realized.

The projectivity signature of negation:
≡ ↦ ≡   not happy ≡ not glad
⊏ ↦ ⊐   didn't kiss ⊐ didn't touch
⊐ ↦ ⊏   not ill ⊏ not seasick
^ ↦ ^   not human ^ not nonhuman
| ↦ ‿   not French ‿ not German
‿ ↦ |   not more than 4 | not less than 6
# ↦ #   isn't swimming # isn't hungry


Page 16

A typology of projectivity

The projectivity signature of intersective modification:
⊏ ↦ ⊏
⊐ ↦ ⊐
^ ↦ |   live human | live nonhuman
| ↦ |   French wine | Spanish wine
‿ ↦ #   metallic pipe # nonferrous pipe
# ↦ #

See paper for the projectivity of various quantifiers and verbs.
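Concretely, a projectivity signature is just a finite map over the seven relations. A small sketch (mine) encoding the two signatures shown on these slides:

```python
NEGATION = {'≡': '≡', '⊏': '⊐', '⊐': '⊏', '^': '^', '|': '‿', '‿': '|', '#': '#'}
INTERSECTIVE_MOD = {'≡': '≡', '⊏': '⊏', '⊐': '⊐', '^': '|', '|': '|', '‿': '#', '#': '#'}

# French | German          →  not French ‿ not German
print(NEGATION['|'])             # ‿
# human ^ nonhuman         →  live human | live nonhuman
print(INTERSECTIVE_MOD['^'])     # |
# metallic ‿ nonferrous    →  metallic pipe # nonferrous pipe
print(INTERSECTIVE_MOD['‿'])     # #
```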


Page 17

Projecting through multiple levels

[Figure: composition trees for "nobody can enter without a shirt" and "nobody can enter without clothes", with @ marking function application]

Propagate the semantic relation between atoms upward, according to the projectivity class of each node on the path to the root:

nobody can enter without a shirt ⊏ nobody can enter without clothes
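A minimal sketch (my own) of this propagation: project the lexical relation through each context on the path from the edited atom to the root. The maps here cover only the containment relations needed for this example; the full behaviour of "without" and "nobody" requires their complete projectivity signatures.

```python
UP   = {'≡': '≡', '⊏': '⊏', '⊐': '⊐', '#': '#'}   # upward-monotone context
DOWN = {'≡': '≡', '⊏': '⊐', '⊐': '⊏', '#': '#'}   # downward-monotone context

def project(path, relation):
    """Project a relation upward through contexts, listed from atom to root."""
    for context in path:
        relation = context[relation]
    return relation

# nobody [ can [ enter [ without [ a shirt ] ] ] ]
# lexical relation at the edited atom: a shirt ⊏ clothes
path = [DOWN,  # without    (downward monotone)
        UP,    # enter, can (upward monotone)
        DOWN]  # nobody     (downward monotone on its scope)

print(project(path, '⊏'))
# ⊏  →  nobody can enter without a shirt ⊏ nobody can enter without clothes
```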


Page 18

Implicatives & factives [Nairn et al. 06]

              signature   example
implicatives  + / –       he managed to escape
              + / o       he was forced to sell
              o / –       he was permitted to live
implicatives  – / +       he forgot to pay
              – / o       he refused to fight
              o / +       he hesitated to ask
factives      + / +       he admitted that he knew
              – / –       he pretended he was sick
              o / o       he wanted to fly

9 signatures, per the implications (+, –, or o) carried in positive and negative contexts.


Page 19

Implicatives & factives

              signature   example                                    β(DEL)   β(INS)
implicatives  + / –       he managed to escape ≡ he escaped          ≡        ≡
              + / o       he was forced to sell ⊏ he sold            ⊏        ⊐
              o / –       he was permitted to live ⊐ he lived        ⊐        ⊏
implicatives  – / +       he forgot to pay ^ he paid                 ^        ^
              – / o       he refused to fight | he fought            |        |
              o / +       he hesitated to ask ‿ he asked             ‿        ‿
factives      + / +       he admitted that he knew ⊏ he knew         ⊏        ⊐
              – / –       he pretended he was sick | he was sick     |        |
              o / o       he wanted to fly # he flew                 #        #

We can specify the relation generated by DEL or INS for each signature.


Room for variation w.r.t. infinitives, complementizers, passivization, etc.
Some are more intuitive when negated: he didn't hesitate to ask | he didn't ask.
Factives are not fully explained: he didn't admit that he knew | he didn't know.
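The table above can be encoded directly as a lookup from signature to the relations generated by deletion and insertion; a sketch (mine):

```python
SIGNATURE_RELATIONS = {     # signature: (relation for DEL, relation for INS)
    '+/-': ('≡', '≡'),      # managed to
    '+/o': ('⊏', '⊐'),      # was forced to
    'o/-': ('⊐', '⊏'),      # was permitted to
    '-/+': ('^', '^'),      # forgot to
    '-/o': ('|', '|'),      # refused to
    'o/+': ('‿', '‿'),      # hesitated to
    '+/+': ('⊏', '⊐'),      # admitted that (factive)
    '-/-': ('|', '|'),      # pretended that
    'o/o': ('#', '#'),      # wanted to
}

def beta_implicative(edit_type, signature):
    """Relation generated by DEL or INS of a verb carrying the given signature."""
    on_delete, on_insert = SIGNATURE_RELATIONS[signature]
    return on_delete if edit_type == 'DEL' else on_insert

print(beta_implicative('DEL', 'o/+'))   # ‿   he hesitated to ask ‿ he asked
print(beta_implicative('DEL', '-/o'))   # |   he refused to fight | he fought
```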

Page 20

Putting it all together

1. Find a sequence of edits e1, …, en which transforms p into h. Define x0 = p, xn = h, and xi = ei(xi–1) for i ∈ [1, n].

2. For each atomic edit ei:
   a. Determine the lexical semantic relation β(ei).
   b. Project β(ei) upward through the semantic composition tree of expression xi–1 to find the atomic semantic relation β(xi–1, xi).

3. Join the atomic semantic relations across the sequence of edits:
   β(p, h) = β(x0, xn) = β(x0, x1) ⋈ … ⋈ β(xi–1, xi) ⋈ … ⋈ β(xn–1, xn)

Limitations: we need to find an appropriate edit sequence connecting p and h; the ⋈ operation tends toward less-informative semantic relations; there is no general mechanism for combining multiple premises.

Less deductive power than FOL: can't handle e.g. De Morgan's laws.
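A condensed sketch of this procedure (mine, with the lexical and projection steps hard-coded by hand and only the join-table entries needed for the trace), using the example on the next slide:

```python
from functools import reduce

# Partial join table: only the entries needed for this trace. '≡' is the
# identity for ⋈; the full 7×7 table (with unions) is in the paper.
JOIN = {('|', '^'): '⊏', ('⊏', '⊏'): '⊏'}

def join(r, s):
    if r == '≡': return s
    if s == '≡': return r
    return JOIN[(r, s)]

# Each step: (edit, lexical relation β(e), atomic relation after projection).
# The projection is done by hand here: e.g. DEL(hesitate to) generates ‿,
# which the downward context of "didn't" projects to |.
steps = [
    ("DEL(hesitate to)",        '‿', '|'),
    ("DEL(didn't)",             '^', '^'),
    ("SUB(Prozac, medication)", '⊏', '⊏'),
]

relation = reduce(join, (atomic for _, _, atomic in steps), '≡')
print(relation)   # ⊏  →  the premise entails the hypothesis: yes
```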


Page 21

An example

P  The doctor didn't hesitate to recommend Prozac.
H  The doctor recommended medication.   →  yes

i   ei                         xi                                                lex   atom   join
0                              The doctor didn't hesitate to recommend Prozac.
1   DEL(hesitate to)           The doctor didn't recommend Prozac.               ‿     |      |
2   DEL(didn't)                The doctor recommended Prozac.                    ^     ^      ⊏
3   SUB(Prozac, medication)    The doctor recommended medication.                ⊏     ⊏      ⊏   → yes

Page 22

Different edit orders?

i ei lex atom join

1 DEL(hesitate to) ‿ | |

2 DEL(didn’t) ^ ^ ⊏

3 SUB(Prozac, medication) ⊏ ⊏ ⊏

i ei lex atom join

1 DEL(didn’t) ^ ^ ^

2 DEL(hesitate to) ‿ ‿ ⊏

3 SUB(Prozac, medication) ⊏ ⊏ ⊏

i ei lex atom join

1 SUB(Prozac, medication) ⊏ ⊏ ⊏

2 DEL(hesitate to) ‿ | |

3 DEL(didn’t) ^ ^ ⊏

i ei lex atom join

1 DEL(hesitate to) ‿ | |

2 SUB(Prozac, medication) ⊏ ⊐ |

3 DEL(didn’t) ^ ^ ⊏

i ei lex atom join

1 DEL(didn’t) ^ ^ ^

2 SUB(Prozac, medication) ⊏ ⊐ |

3 DEL(hesitate to) ‿ ‿ ⊏

i ei lex atom join

1 SUB(Prozac, medication) ⊏ ⊏ ⊏

2 DEL(didn’t) ^ ^ |

3 DEL(hesitate to) ‿ ‿ ⊏

Intermediate steps may vary; the final result is typically (though not necessarily) the same.


Page 23

Implementation & evaluation

The NatLog system: an implementation of this model in code.
For implementation details, see [MacCartney & Manning 2008].

Evaluation on the FraCaS test suite
• 183 NLI problems, nine sections, three-way classification
• Accuracy 70% overall; 87% on "relevant" sections (60% coverage)
• Precision 89% overall: rarely predicts entailment wrongly

Evaluation on the RTE3 test suite
• Longer, more natural premises; greater diversity of inference types
• NatLog alone has mediocre accuracy (59%) but good precision
• Hybridization with a broad-coverage RTE system yields gains of 4%


Page 24


Conclusion

Natural logic is not a universal solution for NLI
• Many types of inference are not amenable to the natural logic approach
• Our inference method faces many limitations on deductive power

More work to be done in fleshing out our account
• Establishing projectivity signatures for more quantifiers, verbs, etc.
• Better incorporating presuppositions

But our model of natural logic fills an important niche
• Precise reasoning about negation, antonymy, quantifiers, implicatives, …
• Sidesteps the myriad difficulties of full semantic interpretation
• Practical value demonstrated on the FraCaS and RTE3 test suites


:-) Thanks! Questions?