Advanced Artificial Intelligence


Part II. Statistical NLP

Advanced Artificial Intelligence

Probabilistic Logic Learning

Wolfram Burgard, Luc De Raedt, Bernhard Nebel, Lars Schmidt-Thieme

Many slides taken from Kristian Kersting and for Logic From Peter Flach’s Simply Logical

Overview

• Expressive power of PCFGs, HMMs, BNs is still limited; first-order logic is more expressive
• Why not combine logic with probabilities? Probabilistic logic learning
• Short recap of logic (programs)
• Stochastic logic programs, which extend PCFGs
• Bayesian logic programs, which extend Bayesian nets
• Logical HMMs, which extend HMMs

Context

One of the key open questions of artificial intelligence concerns "probabilistic logic learning", i.e. the integration of probabilistic reasoning with first-order logic representations and machine learning. Sometimes called Statistical Relational Learning.

So far

We have largely been looking at probabilistic representations (BNs, HMMs, PCFGs) and at ways of learning these from data.

Now, we are going to look at their expressive power, and make traditional probabilistic representations more expressive using logic:
• Probabilistic first-order logics
• Lift BNs, HMMs, PCFGs to more expressive frameworks
• Upgrade also the underlying algorithms

London Underground example

[Figure: fragment of the London Underground map (Simply Logical, p. 3-4) showing the stations Bond Street, Green Park, Oxford Circus, Piccadilly Circus, Charing Cross, Leicester Square and Tottenham Court Road on the Jubilee, Bakerloo, Northern, Central, Piccadilly and Victoria lines.]

London Underground in Prolog (1)

connected(bond_street,oxford_circus,central).
connected(oxford_circus,tottenham_court_road,central).
connected(bond_street,green_park,jubilee).
connected(green_park,charing_cross,jubilee).
connected(green_park,piccadilly_circus,piccadilly).
connected(piccadilly_circus,leicester_square,piccadilly).
connected(green_park,oxford_circus,victoria).
connected(oxford_circus,piccadilly_circus,bakerloo).
connected(piccadilly_circus,charing_cross,bakerloo).
connected(tottenham_court_road,leicester_square,northern).
connected(leicester_square,charing_cross,northern).

Symmetric facts not shown! (Simply Logical, p. 3-4)

London Underground in Prolog (2)

Two stations are nearby if they are on the same line with at most one other station in between (symmetric facts not shown):

nearby(bond_street,oxford_circus).
nearby(oxford_circus,tottenham_court_road).
nearby(bond_street,tottenham_court_road).
nearby(bond_street,green_park).
nearby(green_park,charing_cross).
nearby(bond_street,charing_cross).
nearby(green_park,piccadilly_circus).

or better:

nearby(X,Y) :- connected(X,Y,L).
nearby(X,Y) :- connected(X,Z,L), connected(Z,Y,L).

Facts: unconditional truths. Rules/Clauses: conditional truths. Both definitions are equivalent (see the example query below).
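For instance, run against the connected/3 facts above, the clausal definition derives the same nearby facts:

?- nearby(bond_street, W).
W = oxford_circus ;
W = green_park ;
W = tottenham_court_road ;
W = charing_cross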

likes(peter,S) :- student_of(S,peter).

"Peter likes anybody who is his student."

[Slide annotations: the whole line is a clause; likes(peter,S) and student_of(S,peter) are atoms; peter is a constant and S a variable; constants and variables are terms.]

Clauses are universally quantified! :- denotes implication. (Simply Logical, pp. 8, 25)

Recursion (2)

A station is reachable from another if they are on the same line, or with one, two, ... changes:

reachable(X,Y) :- connected(X,Y,L).
reachable(X,Y) :- connected(X,Z,L1), connected(Z,Y,L2).
reachable(X,Y) :- connected(X,Z1,L1), connected(Z1,Z2,L2), connected(Z2,Y,L3).
...

or better:

reachable(X,Y) :- connected(X,Y,L).
reachable(X,Y) :- connected(X,Z,L), reachable(Z,Y).
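For instance, against the connected/3 facts above:

?- reachable(bond_street, leicester_square).
yes    % found via oxford_circus and tottenham_court_road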

Substitutions

A substitution maps variables to terms, e.g. {S -> maria}.

A substitution can be applied to a clause, e.g. likes(peter,maria) :- student_of(maria,peter).

The resulting clause is said to be an instance of the original clause, and a ground instance if it does not contain variables. Each instance of a clause is among its logical consequences. (Simply Logical, p. 26)

[Figure: the functor route builds structured terms; the term route(tottenham_court_road, route(leicester_square, noroute)) drawn as a tree (Simply Logical, p. 12).]

Structured terms (2)

reachable(X,Y,noroute) :- connected(X,Y,L).
reachable(X,Y,route(Z,R)) :- connected(X,Z,L), reachable(Z,Y,R).

?- reachable(oxford_circus,charing_cross,R).
R = route(tottenham_court_road,route(leicester_square,noroute));
R = route(piccadilly_circus,noroute);
R = route(piccadilly_circus,route(leicester_square,noroute))

Here route is a functor. [Figure: the term structure of a list built from the list functor '.': '.'(tottenham_court_road, '.'(leicester_square, [])).] (Simply Logical, p. 13-14)

Lists (3)

reachable(X,Y,[]) :- connected(X,Y,L).
reachable(X,Y,[Z|R]) :- connected(X,Z,L), reachable(Z,Y,R).

?- reachable(oxford_circus,charing_cross,R).
R = [tottenham_court_road,leicester_square];
R = [piccadilly_circus];
R = [piccadilly_circus,leicester_square]

Answering queries (1)

Query: which station is nearby Tottenham Court Road?

?- nearby(tottenham_court_road, W).

The prefix ?- means it is a query and not a fact.

The answer to the query is the substitution {W -> leicester_square}.

When nearby is defined by facts, the substitution is found by unification.

Proof tree (Fig. 1.2, p. 7)

?- nearby(tottenham_court_road, W)
     resolve with the clause  nearby(X1,Y1) :- connected(X1,Y1,L1)
     under the substitution {X1 -> tottenham_court_road, Y1 -> W}
?- connected(tottenham_court_road, W, L1)
     resolve with the fact  connected(tottenham_court_road, leicester_square, northern)
     under the answer substitution {W -> leicester_square, L1 -> northern}
[]   (the empty query: proof complete)

Recall from AI course

• Unification, to unify two different terms
• Resolution inference rule
• Refutation proofs, which derive the empty clause
• SLD-tree, which summarizes all possible proofs (left to right) for a goal

SLD-tree: one path for each proof tree

student_of(X,T) :- follows(X,C), teaches(T,C).
follows(paul,computer_science).
follows(paul,expert_systems).
follows(maria,ai_techniques).
teaches(adrian,expert_systems).
teaches(peter,ai_techniques).
teaches(peter,computer_science).

?- student_of(S,peter)
:- follows(S,C), teaches(peter,C)
which branches into:
:- teaches(peter,computer_science)   succeeds: []
:- teaches(peter,expert_systems)     fails
:- teaches(peter,ai_techniques)      succeeds: []

(Simply Logical, pp. 44-45)

The least Herbrand model

Definition:
• the set of all ground facts that are logically entailed by the program;
• all ground facts not in the LHM are false.

The LHM can be computed as follows:
• M0 := {}; M1 := { true }; i := 1
• while Mi =\= Mi-1 do
      i := i + 1;
      Mi := { true } ∪ { hθ | h :- b1, ..., bn is a clause and θ a substitution such that all biθ ∈ Mi-1 }
• the final Mi contains all true ground facts; all others are false.

Example LHM

KB:  p(a,b).    a(X,Y) :- p(X,Y).
     p(b,c).    a(X,Y) :- p(X,Z), a(Z,Y).

M0 = empty; M1 = { true }
M2 = { true, p(a,b), p(b,c) }
M3 = M2 ∪ { a(a,b), a(b,c) }
M4 = M3 ∪ { a(a,c) }
M5 = M4, so the fixpoint is reached.
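This bottom-up computation is easy to prototype. A minimal sketch in SWI-Prolog, assuming the KB is stored as cl(Head, BodyList) facts (a representation chosen here purely for illustration):

:- use_module(library(ordsets)).

% the KB above, one cl/2 fact per clause
cl(p(a,b), []).
cl(p(b,c), []).
cl(a(X,Y), [p(X,Y)]).
cl(a(X,Y), [p(X,Z), a(Z,Y)]).

% lhm(-M): M is the least Herbrand model, as an ordered set
lhm(M) :- step([], M).

step(M0, M) :-
    % collect every head whose body atoms are all in M0
    findall(H, (cl(H,B), all_in(B,M0)), Hs),
    sort(Hs, HSet),
    ord_union(M0, HSet, M1),
    ( M1 == M0 -> M = M0 ; step(M1, M) ).    % iterate to the fixpoint

all_in([], _).
all_in([A|As], M) :- member(A, M), all_in(As, M).

% ?- lhm(M).
% M = [a(a,b), a(a,c), a(b,c), p(a,b), p(b,c)]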

Stochastic Logic Programs

Recall:
• Prob. Regular Grammars
• Prob. Context-Free Grammars
• What about Prob. Turing Machines? Or Prob. Grammars?

Stochastic logic programs combine probabilistic reasoning in the style of PCFGs with the expressive power of a programming language.

Recall PCFGs

We defined the probability of a derivation as the product of the probabilities of the rules it applies, and the probability of a sentence as the sum over its derivations.

Stochastic Logic Programs

Correspondence between CFG and SLP:
• Symbols - Predicates
• Rules - Clauses
• Derivations - SLD-derivations/Proofs

So, a stochastic logic program is an annotated logic program: each clause has an associated probability label, and the sum of the probability labels for the clauses defining a particular predicate is equal to 1.

An Example

:- card(a,s)
:- rank(a), suit(s)
:- suit(s)
[]

Probability of the derivation = 1 × 0.125 × 0.25
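The annotated program itself is not reproduced in the transcript; a reconstruction consistent with the derivation above (one of eight ranks and one of four suits, as in a 32-card deck) would be:

1.0   : card(R,S) :- rank(R), suit(S).
0.125 : rank(a).     % ... one clause per rank, eight in total
0.25  : suit(s).     % ... one clause per suit: d, h, s, c

The labels of the clauses defining each predicate sum to 1, and the derivation multiplies one label per resolution step: 1 × 0.125 × 0.25.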

Example

s([the,turtle,sleeps],[]) ?
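The grammar behind this query is not reproduced in the transcript; a hedged reconstruction in the usual SLP/DCG difference-list style (all predicate names and labels here are assumptions) could be:

1.0 : s(S0,S)  :- np(S0,S1), vp(S1,S).
1.0 : np(S0,S) :- det(S0,S1), n(S1,S).
0.5 : det([the|S],S).      0.5 : det([a|S],S).
0.5 : n([turtle|S],S).     0.5 : n([tortoise|S],S).
0.5 : vp([sleeps|S],S).    0.5 : vp([runs|S],S).

Under this program, ?- s([the,turtle,sleeps],[]) has a single refutation with probability 1.0 × 1.0 × 0.5 × 0.5 × 0.5 = 0.125.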

SLPs : Key Ideas

Example: cards.
• For card(R,S) with R in {a,7,8,9,...} and S in {d,h,s,c}, no proof fails.
• For each card, there is a unique refutation.
• So each card receives probability 0.125 × 0.25 = 1/32.

Consider

same_suit(S,S) :- suit(S), suit(S).

In total there are 16 possible derivations, of which only 4 succeed, so the probabilities have to be normalized by the probability of success:
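Each derivation chooses one of the four suit clauses for each of the two suit/1 goals, so every derivation has probability 0.25 × 0.25 = 1/16. Only the 4 derivations that pick the same suit twice succeed (the repeated variable in the head forces both body atoms to agree), so the total probability of success is 4/16, and each refutation is normalized to

P(same_suit(s,s)) = (1/16) / (4/16) = 1/4.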

Another example (due to Cussens)

Questions we can ask (and answer) about SLPs

Answers

The algorithmic answers to these questions again extend those of PCFGs and HMMs. In particular:
• Tabling is used (to record probabilities of partial proofs and intermediate atoms).
• Failure-Adjusted EM (FAM) is used to solve the parameter re-estimation problem; the additional hidden variables range over the possible refutations and derivations for the observed atoms.
• Topic of recent research.
• Freiburg: learning from refutations (instead of atoms), combined with structure learning.

Sampling

PRGs, PCFGs, and SLPs can also be used for sampling sentences or ground atoms that follow from the program.

Rather straightforward. Consider SLPs:
• Probabilistically explore the SLD-tree.
• At each step, select among the possible resolvents using the probability labels attached to the clauses.
• If the derivation succeeds, return the corresponding (ground) atom.
• If the derivation fails, then restart. (A sketch follows below.)
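A minimal sampling sketch along these lines in SWI-Prolog; the pclause(Index, Label, Head, Body) representation and the predicate names are assumptions made for illustration:

:- use_module(library(random)).

% the card SLP, one indexed fact per annotated clause
pclause(1, 1.0,   card(R,S), [rank(R), suit(S)]).
pclause(2, 0.125, rank(a),   []).    % ... one fact per rank
pclause(3, 0.25,  suit(s),   []).    % ... one fact per suit

% sample(+Goal): restart until a derivation succeeds
sample(Goal) :- repeat, sample_goal([Goal]), !.

sample_goal([]).
sample_goal([A|Rest]) :-
    findall(I-P, pclause(I, P, A, _), Alts),   % clauses whose head unifies with A
    Alts \= [],
    random(U),                                 % uniform random number in (0,1)
    pick(Alts, U, I),                          % choose a clause by its label
    pclause(I, _, A, Body),                    % now unify for real, binding A
    append(Body, Rest, Goals),
    sample_goal(Goals).

% walk the cumulative distribution; falls off the end (and thus fails,
% forcing a restart) if the labels of the matching clauses sum to less than 1
pick([I-P|_], U, I) :- U < P, !.
pick([_-P|T], U, I) :- U1 is U - P, pick(T, U1, I).

% ?- sample(card(R,S)).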

Bayesian Networks [Pearl 91]

Qualitative part: a directed acyclic graph. Nodes - random variables. Edges - direct influence.

Quantitative part: a set of conditional probability distributions.

[Figure: the alarm network. Burglary and Earthquake point to Alarm, which points to JohnCalls and MaryCalls; the CPT P(A | B,E) has one row per truth assignment of B and E, with entries 0.9/0.1, 0.2/0.8, 0.9/0.1 and 0.01/0.99.]

Together they define a unique joint distribution in a compact, factored form:

P(E,B,A,M,J) = P(E) * P(B) * P(A|E,B) * P(M|A) * P(J|A)

Traditional Approaches

To answer a query, marginalize the joint distribution, e.g.

P(j) = Σ P(j|a) * P(m|a) * P(a|e,b) * P(e) * P(b)

where the sum ranges over all truth assignments to b, e, a and m (the slide enumerates the individual terms). The logical structure underlying the network can be written as a propositional logic program:

burglary.

earthquake.

alarm :- burglary, earthquake.

marycalls :- alarm.

johncalls :- alarm.


Expressiveness of Bayesian Nets

A Bayesian net defines a probability distribution over a propositional logic: essentially, the possible states (worlds) are propositional interpretations.

But propositional logic is severely limited in expressive power, therefore consider combining BNs with logic programs:
• Bayesian logic programs (BLPs)
• Actually, a BLP + some background knowledge generates a BN
• So, a BLP is a kind of BN template!


Bayesian Logic Programs (BLPs)    [Kersting, De Raedt]

Rule graph over the propositional predicates alarm/0, earthquake/0, burglary/0, maryCalls/0, johnCalls/0. Each clause, e.g.

alarm :- earthquake, burglary.

comes with a local BN fragment: here the CPT P(A | B,E) quantifying how earthquake and burglary influence alarm.

Bayesian Logic Programs (BLPs)    [Kersting, De Raedt]

Rule graph over bt/1, pc/1, mc/1. In the Bayesian clause

bt(Person) :- pc(Person), mc(Person).

bt is a predicate, Person a variable appearing as an argument, and bt(Person) an atom. Each clause carries a CPD: the table for bt(Person) given mc(Person) and pc(Person) has rows such as aa -> (1.0, 0.0, 0.0, 0.0) and ba -> (0.0, 0.0, 1.0, 0.0), and the clause for mc(Person) given mc(Mother) and pc(Mother) has rows such as aa -> (1.0, 0.0, 0.0) and ba -> (0.5, 0.5, 0.0).

Bayesian Logic Programs (BLPs)    [Kersting, De Raedt]

The full blood type program (| separates head and body of a Bayesian clause):

pc(Person) | father(Father,Person), pc(Father), mc(Father).
mc(Person) | mother(Mother,Person), pc(Mother), mc(Mother).
bt(Person) | pc(Person), mc(Person).

Bayesian Logic Programs (BLPs)

father(rex,fred).    mother(ann,fred).
father(brian,doro).  mother(utta,doro).
father(fred,henry).  mother(doro,henry).

pc(Person) | father(Father,Person), pc(Father), mc(Father).
mc(Person) | mother(Mother,Person), pc(Mother), mc(Mother).
bt(Person) | pc(Person), mc(Person).

[Figure: the Bayesian network induced over the least Herbrand model, with nodes mc(X), pc(X), bt(X) for X in {rex, ann, fred, brian, utta, doro, henry}, wired according to the family facts.]

Bayesian logic programs

Computing the ground BN (the BN that defines the semantics):
• Compute the least Herbrand model of the BLP.
• For each clause H | B1, ..., Bn with its CPD: if there is a substitution θ such that {Hθ, B1θ, ..., Bnθ} is a subset of the LHM, then the parents of Hθ include B1θ, ..., Bnθ, with the CPD specified by the clause.
• Delete the logical atoms from the BN (as their truth value is known), e.g. mother and father in the example.
• Possibly apply aggregation and combining rules.

For specific queries, only part of the resulting BN is necessary: the support net, cf. the next slides. (A worked grounding step follows below.)
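For instance, with θ = {Person -> fred, Mother -> ann}, the clause mc(Person) | mother(Mother,Person), pc(Mother), mc(Mother) yields the ground influence

mc(fred) | mother(ann,fred), pc(ann), mc(ann)

and since mother(ann,fred) is a logical atom whose truth value is known, it is deleted, leaving pc(ann) and mc(ann) as the parents of mc(fred) in the induced network.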

Procedural Semantics

[Figure: the induced ground network over the family, as above.]

P(bt(ann)) ?

Procedural Semantics

[Figure: the induced ground network over the family, as above.]

P(bt(ann), bt(fred)) ?

By Bayes' rule:

P(bt(ann) | bt(fred)) = P(bt(ann), bt(fred)) / P(bt(fred))

Combining Rules

A combining rule CR maps a set of CPDs, e.g. P(A|B) and P(A|C), onto a single combined CPD, e.g. P(A|B,C). Formally, any algorithm that combines a set of CPDs into one CPD, and whose output is empty if and only if the input is empty, qualifies.

E.g. noisy-or:
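For reference, the standard noisy-or rule: with parents B1, ..., Bn and per-parent parameters pi = P(A | Bi alone), it sets

P(A = true | B1, ..., Bn) = 1 - Π (1 - pi), the product running over those i with Bi true,

so every true parent independently gets a chance to switch A on.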

prepared(Student,Topic) | read(Student,Book), discusses(Book,Topic).

[Figure: the corresponding rule graph, with read and discusses feeding into prepared.]

Combining Partial Knowledge

prepared(Student,Topic) | read(Student,Book), discusses(Book,Topic).

The number of parents of prepared/2 varies due to read/2: whether a student has prepared a topic depends on the books she read, so e.g. prepared(s1,bn) may depend on both discusses(b1,bn) and discusses(b2,bn), while prepared(s2,bn) depends on only one of them. The clause's CPD, however, is given only for one book-topic pair; a combining rule such as noisy-or merges the contributions (see the example below).
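A hedged instance (p is an assumed per-book parameter, not taken from the slides): if the clause CPD says that one read, on-topic book makes the student prepared with probability p, then for a student who read two books discussing bn, noisy-or yields

P( prepared(s1,bn) = true ) = 1 - (1 - p)^2

while a student who read only one such book gets probability p.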

Summary BLPs

pc(Person) | father(Father,Person), pc(Father), mc(Father).
mc(Person) | mother(Mother,Person), pc(Mother), mc(Mother).
bt(Person) | pc(Person), mc(Person).

Underlying logic program (if the body holds, then the head holds, too)
+ (macro) CPDs, e.g. the table for mc(Person) given mc(Mother) and pc(Mother)
+ combining rules (noisy-or, ...)
+ consequence operator
= joint probability distribution over the least Herbrand interpretation
= conditional independencies encoded in the induced BN structure, plus local probability models.

[Figure: the rule graphs for bt/1, pc/1, mc/1 and the ground Bayesian network over the family rex, ann, fred, brian, utta, doro and henry.]

Bayesian Logic Programs - Examples

MC (a Markov chain, in pure Prolog):

% apriori nodes
nat(0).
% aposteriori nodes
nat(s(X)) | nat(X).

induces the chain  nat(0) -> nat(s(0)) -> nat(s(s(0))) -> ...

HMM:

% apriori nodes
state(0).
% aposteriori nodes
state(s(Time)) | state(Time).
output(Time) | state(Time).

induces  state(0) -> state(s(0)) -> ...  with output(0), output(s(0)), ... hanging off the states.

DBN:

% apriori nodes
n1(0).
% aposteriori nodes
n1(s(TimeSlice)) | n2(TimeSlice).
n2(TimeSlice) | n1(TimeSlice).
n3(TimeSlice) | n1(TimeSlice), n2(TimeSlice).

induces  n1(0), n2(0), n3(0), n1(s(0)), n2(s(0)), n3(s(0)), ...

Learning BLPs from Interpretations

Model(1): earthquake=yes, burglary=no, alarm=?,   marycalls=yes, johncalls=no
Model(2): earthquake=no,  burglary=no, alarm=no,  marycalls=no,  johncalls=no
Model(3): earthquake=?,   burglary=?,  alarm=yes, marycalls=yes, johncalls=yes

(data cases for the alarm network over Earthquake, Burglary, Alarm, MaryCalls, JohnCalls)

Data case: random variables + states = a (partial) Herbrand interpretation.

Bloodtype example

Model(1): pc(brian)=b, bt(ann)=a, bt(brian)=?, bt(dorothy)=a
Model(2): bt(cecily)=ab, pc(henry)=a, mc(fred)=?, bt(kim)=a, pc(bob)=b
Model(3): pc(rex)=b, bt(doro)=a, bt(brian)=?

Background: m(ann,dorothy), f(brian,dorothy), m(cecily,fred), f(henry,fred), f(fred,bob), m(kim,bob), ...

Learning BLPs from Interpretations

Parameter Estimation - BLPs

[Figure: the database D (data cases plus background facts) and the underlying logic program L are fed into the learning algorithm, which outputs the parameters; the rule graphs and CPDs for bt/1, pc/1, mc/1 as shown earlier.]

Estimate the CPD entries that best fit the data. "Best fit" here means maximum likelihood parameters:

θ* = argmax_θ P( data | logic program L, θ ) = argmax_θ log P( data | logic program L, θ )

This reduces to the problem of estimating the parameters of a Bayesian network: given structure, partially observed random variables.

Parameter Estimation – BLPs

[Figure: the blood type rule graphs together with the three data cases and the background facts shown earlier; program and data are combined ('+').]

Parameter Estimation – BLPs

[Figure: the same rule graphs and data cases once more.]

Parameter tying: all ground instances of one Bayesian clause share the same CPD.
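For instance, the ground influences mc(fred) | pc(ann), mc(ann) and mc(doro) | pc(utta), mc(utta) are both instances of the mc/1 clause, so they share one CPD, and their (expected) counts are pooled during estimation.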

EM – BLPs

EM-algorithm: iterate until convergence.

Given the logic program L and initial parameters θ0, in the current model θ(k):

Expectation: inference computes the expected counts of each clause.

Maximization: update the parameters (ML, MAP) from these expected counts.

(The E-step runs over the same data cases and background facts as before.)

Expected counts of a clause:

ec(clause) = Σ over data cases DC, Σ over ground instances GI of the clause:  P( head(GI), body(GI) | DC )

and for its body:

ec_body(clause) = Σ over data cases DC, Σ over ground instances GI:  P( body(GI) | DC )

The maximization step sets each CPD entry to the ratio of the corresponding expected counts.
