Dependency Parsing-based QA System using RDF and SPARQL
Fariz [email protected]
Motivation
BACKGROUND
QA in General
• Finding an answer to natural language questions based on documents/facts
• Instead of returning documents, it gives answers directly
• Factoid questions:
  – Who can dance Tango?
  – What did I eat this morning?
  – When was Mahatma Gandhi born?
Dependency Parsing
RDF & SPARQL
• RDF Data:
  :book1 :title "SPARQL Tutorial" .
• SPARQL Query:
  SELECT ?title
  WHERE { :book1 :title ?title . }
WordNet
• Large lexical database of English words
• Words are grouped into synsets
• Relations among synsets: synonymy, antonymy, hyponymy, meronymy, troponymy
DBpedia
• Knowledge base of Wikipedia in RDF
• Data at dbpedia.org/resource/Italy:
DEVELOPMENT
[Architecture: NL text (facts) and NL text (questions) each pass through a dependency parser; the facts go to the RDFizer, which produces RDF backed by an OWL ontology, and the questions go to the SPARQLizer, which produces SPARQL queries.]
Facts Population
1. We parse the natural language facts using the Stanford dependency parser. The result will be typed dependencies.
2. The typed dependencies are then translated into RDF format using RDFizer. The RDFizer is built using Java with Apache Jena as the Semantic Web library.
3. The resulting RDF is then combined with an OWL ontology that contains WordNet and DBpedia axioms to infer new facts.
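The RDFizer step above can be sketched as follows. This is a minimal illustrative Python sketch, not the actual Java/Jena implementation; the function name `rdfize` and the regex are assumptions.

```python
import re

NS = "http://example.org/sentence/"  # sentence namespace used throughout the slides

def rdfize(typed_deps):
    """Turn Stanford typed dependencies, e.g. nsubj(bought-2, Aliana-1),
    into (subject, predicate, object) triples under the sentence
    namespace, lowercasing tokens and stripping word indices.
    Hypothetical sketch of the RDFizer; the real system uses Apache Jena."""
    triples = []
    for rel, gov, dep in re.findall(
            r"(\w+)\(([^,]+)-\d+,\s*([^)]+)-\d+\)", typed_deps):
        triples.append((NS + gov.lower(), NS + rel, NS + dep.lower()))
    return triples

deps = ("[nsubj(bought-2, Aliana-1), root(ROOT-0, bought-2), "
        "det(car-4, a-3), dobj(bought-2, car-4)]")
for triple in rdfize(deps):
    print(triple)
```

Each dependency relation becomes an RDF predicate, and the governor/dependent words become subject and object, matching the RDF shown in the detailed example later.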
Query Execution
1. We parse the natural language questions using the Stanford dependency parser. The result will be typed dependencies.
2. We then translate the typed dependencies into SPARQL query format.
3. The SPARQL query is then executed over populated RDF data from the result of Facts Population.
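The SPARQLizer step can be sketched in the same spirit: a hypothetical Python sketch in which wh-words become the query variable and every other token becomes a prefixed name (the function name and wh-word list are assumptions; the real translator is part of the Java system).

```python
import re

def sparqlize(typed_deps):
    """Build a SPARQL SELECT query from a question's typed dependencies:
    wh-words (who/what/whom) become the variable ?x, other tokens become
    prefixed names under the sentence namespace."""
    wh_words = {"who", "what", "whom"}

    def term(token):
        token = token.lower()
        return "?x" if token in wh_words else ":" + token

    patterns = []
    for rel, gov, dep in re.findall(
            r"(\w+)\(([^,]+)-\d+,\s*([^)]+)-\d+\)", typed_deps):
        patterns.append(f"{term(gov)} :{rel} {term(dep)} .")
    return "SELECT ?x WHERE { " + " ".join(patterns) + " }"

question_deps = ("[nsubj(purchased-2, Who-1), root(ROOT-0, purchased-2), "
                 "det(vehicle-4, a-3), dobj(purchased-2, vehicle-4)]")
print(sparqlize(question_deps))
```

The output mirrors the SPARQL form of the question shown in the detailed example: one triple pattern per dependency, with `Who` replaced by `?x`.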
Background Knowledge: WordNet
Synonymy:
<http://example.org/sentence/buy> owl:sameAs
    <http://example.org/sentence/purchase> .
Background Knowledge: WordNet
Hyponymy:
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {:vehicle ?y ?z} WHERE {:car ?y ?z}
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}
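The effect of these two CONSTRUCT rules can be emulated on plain triples — a hypothetical sketch (the real system runs the rules over the RDF store; the function name is an assumption):

```python
def apply_hyponym_rule(triples, hyponym, hypernym):
    """Emulate the two CONSTRUCT rules: every triple that mentions the
    hyponym (e.g. :car) in subject or object position also holds for
    its hypernym (e.g. :vehicle)."""
    inferred = set(triples)
    for s, p, o in triples:
        if s == hyponym:
            inferred.add((hypernym, p, o))
        if o == hyponym:
            inferred.add((s, p, hypernym))
    return inferred

facts = {(":bought", ":dobj", ":car")}
for t in sorted(apply_hyponym_rule(facts, ":car", ":vehicle")):
    print(t)
```

This is what later lets the question "Who purchased a vehicle?" match the fact "Aliana bought a car."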
Background Knowledge: WordNet
Troponymy:
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {:move ?y ?z} WHERE {:run ?y ?z}
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {?x ?y :move} WHERE {?x ?y :run}
Background Knowledge: DBpedia
<http://example.org/sentence/italy> owl:sameAs <http://dbpedia.org/resource/Italy> .
IMPLEMENTATION AND RESULT
Program
• Java-based
• Reuse Apache Jena: Semantic Web library
• Reuse Stanford Parser: typed dependencies
• MorphAdorner: lemmatization and verb conjugation
A Detailed Example (1)
• Fact: Aliana bought a car
• Question: Who purchased a vehicle?
A Detailed Example (2)
• Typed dependencies:
  [nsubj(bought-2, Aliana-1), root(ROOT-0, bought-2), det(car-4, a-3), dobj(bought-2, car-4)]
A Detailed Example (3)
RDF:
<http://example.org/sentence/car> <http://example.org/sentence/det> <http://example.org/sentence/a> .
<http://example.org/sentence/root> <http://example.org/sentence/root> <http://example.org/sentence/bought> .
<http://example.org/sentence/bought> <http://example.org/sentence/dobj> <http://example.org/sentence/car> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/aliana> .
A Detailed Example (4)
We already have the following triple in the knowledge base:
:bought owl:sameAs :purchased .
and also the following rules:
PREFIX : <http://example.org/sentence/>
CONSTRUCT {:vehicle ?y ?z} WHERE {:car ?y ?z}
PREFIX : <http://example.org/sentence/>
CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}
A Detailed Example (5)
Inferred facts:
<http://example.org/sentence/purchased> <http://example.org/sentence/dobj> <http://example.org/sentence/vehicle> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/aliana> .
A Detailed Example (6)
Typed dependencies of question:
[nsubj(purchased-2, Who-1), root(ROOT-0, purchased-2), det(vehicle-4, a-3), dobj(purchased-2, vehicle-4)]
A Detailed Example (7)
SPARQL form of question:
SELECT ?x WHERE {
  :vehicle :det :a .
  :purchased :nsubj ?x .
  :purchased :dobj :vehicle .
  :root :root :purchased }
Answer: "aliana"
DBpedia Integration
• By adding some background knowledge from DBpedia, one can ask more questions.
• Example of Italy data:
  :italy owl:sameAs dbpedia:Italy .
  dbpedia:Italy dbpprop:capital "Rome" .
  dbpedia:Enzo_Ferrari dbpedia-owl:nationality dbpedia:Italy ;
      dbpprop:deathPlace dbpedia:Maranello .
  dbpedia:Enzo_Ferrari dbpedia-owl:child dbpedia:Piero_Ferrari , dbpedia:Alfredo_Ferrari .
Example Case
• Fact = "Fariz loves Italy."
• Question = "Does Fariz love a country whose capital is Rome, which was the nationality of a person who passed away in Maranello and whose sons are Piero Ferrari and Alfredo Ferrari?"
• Thus the answer will be YES, even though we only have the single fact that Fariz loves Italy.
Example Case (cont.)
• Note that in the previous example the fact is translated automatically by the system, but the question is translated manually into the following SPARQL query:
ASK WHERE {
  :love :nsubj :fariz .
  :root :root :love .
  :love :dobj ?x .
  ?x dbpprop:capital "Rome" .
  ?y dbpedia-owl:nationality ?x ;
     dbpprop:deathPlace dbpedia:Maranello .
  ?y dbpedia-owl:child dbpedia:Piero_Ferrari , dbpedia:Alfredo_Ferrari }
How to handle negation? (1)
• Fact: I did not buy it.
• RDF:
<http://example.org/sentence/root> <http://example.org/sentence/root> <http://example.org/sentence/bought> .
<http://example.org/sentence/bought> <http://example.org/sentence/dobj> <http://example.org/sentence/it> ;
    <http://example.org/sentence/neg> <http://example.org/sentence/not> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/i> .
How to handle negation? (2)
• Question: Who bought it?
• SPARQL:
SELECT ?x WHERE {
  :bought :nsubj ?x .
  :bought :dobj :it .
  :root :root :bought .
  FILTER NOT EXISTS { [] :neg ?z . } }
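The guard for positive questions can be sketched as a small post-processing step on the generated query: append a FILTER NOT EXISTS clause so negated facts do not produce answers. A hypothetical sketch; the helper name is an assumption.

```python
def guard_against_negation(select_query):
    """Insert a FILTER NOT EXISTS clause before the closing brace of a
    positive question's SELECT query, so that facts carrying a :neg
    relation are excluded. Assumes the query ends with '}'."""
    head, _sep, _tail = select_query.rpartition("}")
    return head + "FILTER NOT EXISTS { [] :neg ?z . } }"

q = ("SELECT ?x WHERE { :bought :nsubj ?x . "
     ":bought :dobj :it . :root :root :bought . }")
print(guard_against_negation(q))
```

The same pattern also covers the tense guard on a later slide, with `[] :aux :will` in place of `[] :neg ?z`.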
How to handle negation? (3)
• Question: Who did not buy it? Answer: I.
• SPARQL:
SELECT ?x WHERE {
  :bought :dobj :it .
  :bought :neg :not .
  :bought :nsubj ?x .
  :root :root :bought }
How to handle tenses? (1)
• Fact (I will buy it):
<http://example.org/sentence/buy> <http://example.org/sentence/aux> <http://example.org/sentence/will> ;
    <http://example.org/sentence/dobj> <http://example.org/sentence/it> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/i> .
How to handle tenses? (2)
• Question: Who buys it?
• SPARQL:
SELECT ?x WHERE {
  :root :root :buys .
  :buys :nsubj ?x .
  :buys :dobj :it .
  FILTER NOT EXISTS { [] :aux :will . } }
How to handle passive sentences?
Fact: Juliet was killed by Romeo.
<http://example.org/sentence/root> <http://example.org/sentence/root> <http://example.org/sentence/killed> .
<http://example.org/sentence/killed> <http://example.org/sentence/agent> <http://example.org/sentence/romeo> ;
    <http://example.org/sentence/nsubjpass> <http://example.org/sentence/juliet> .
How to handle passive sentences?
• Ontology:
:nsubjpass owl:equivalentProperty :dobj .
:agent owl:equivalentProperty :nsubj .
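The effect of these two axioms can be emulated by materializing the equivalent triples — a plain-Python sketch of what an owl:equivalentProperty reasoner does (the mapping table and function name are assumptions):

```python
# owl:equivalentProperty is symmetric, so both directions are listed.
EQUIVALENT = {":nsubjpass": ":dobj", ":dobj": ":nsubjpass",
              ":agent": ":nsubj", ":nsubj": ":agent"}

def infer_equivalents(triples):
    """For every triple whose predicate has an equivalent property,
    add the same triple under the equivalent predicate as well."""
    inferred = set(triples)
    for s, p, o in triples:
        if p in EQUIVALENT:
            inferred.add((s, EQUIVALENT[p], o))
    return inferred

facts = {(":killed", ":agent", ":romeo"),
         (":killed", ":nsubjpass", ":juliet")}
for t in sorted(infer_equivalents(facts)):
    print(t)
```

After this step, the active-voice question query over :nsubj and :dobj matches the passive-voice fact.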
How to handle passive sentences?
• Question: Who killed Juliet?
SELECT ?x WHERE {
  :killed :nsubj ?x .
  :killed :dobj :juliet .
  :root :root :killed .
  FILTER NOT EXISTS { [] :neg ?z . } }
DEMO - A Story about Antonio
Antonio is a famous and cool doctor. Antonio has been working for 10 years. Antonio is in Italy. Antonio can dance Salsa well. Antonio loves Maria and Karina. Antonio is also loved by Karina. Antonio never cooks. But Maria always cooks. Antonio just bought a car. Antonio must fly to Indonesia tomorrow.
Conclusions
• Dependency parsing-based QA system with RDF and SPARQL
• The system is also aware of negations, tenses and passive sentences
• Improvements: a more advanced parsing method, a more efficient inference system, and richer background knowledge
APPENDIX: EXAMPLES
Working Examples
• "The Japanese girl sang the song beautifully."
• "Who sang the song beautifully?"
• "Who sang the song?"
• "Who sang?"
Working Examples
• "The Beatles do sing the song perfectly."
• "How do The Beatles sing the song?"
Working Examples
• "They should sing the song well."
• "How should they sing the song?"
Working Examples
• "I did buy the book, the pencil and the ruler yesterday."
• "What did I buy?"
Working Examples
• "Microsoft is located in Redmond."
• "What is located in Redmond?"
Working Examples
• "The man was killed by the police."
• "Who killed the man?"
Working Examples
• "Sam is genius, honest and big."
• "Who is big?"
Working Examples
• "John is rude."
• "Is John good?"
• "Is John rude?"
Working Examples
• "John, that is the founder of Microsoft and the initiator of Greenpeace movement, is genius, honest and cool."
• "Who is honest?"
Working Examples
• "Farid wants to go to Rome."
• "Who wants to go to Rome?"
• "Who wants to go?"
• "Who wants?"
Working Examples
• "Jorge ate 10 delicious apples."
• "Who ate 10 delicious apples?"
• "Who ate 10 apples?"
• "Who ate apples?"
Working Examples
• "John is a doctor."
• "Is John a doctor?"
• "Is John a teacher?"
Working Examples
• "John is a good doctor."
• "Who is John?"
• "What is John?"
Working Examples
• "John is in Alaska."
• "John is at home."
• "John is on the street."
• "Where is John?"
Working Examples
• "Apples are good for health."
• "What are good for health?"