CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012
Page 1: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

CS 4100 Artificial Intelligence

Prof. C. Hafner, Class Notes Jan 26, 2012

Page 2: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Topics
• More about assignment 3
• Negation by failure and Horn Clause databases
  – Closed world assumption
• First order logic continued
• Wumpus world model using FOL

Jan 31
• A few more details about hw3
  – Test data available
  – loadInitialKB and processPercepts functions
• Return and discuss assignments 1 and 2
• Converting FOL sentences to Normal Form
• Unification
• Automated reasoning in FOL: resolution, forward chaining, backward chaining

Page 3: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Sketch of Forward Chaining Algorithm (version 1 assumes the initial KB is fully expanded)

ALGORITHM (recursive):

processPercepts(‘percepts file’)   # uses KBase -- a knowledge base of definite clauses
    for each new percept p
        PLForwardChain(p)          # use a recursive "helper function"

PLForwardChain(percept)
    if percept is already in KBase, return
    else add percept to KBase
    for r in rules where conclusion of r is not already in KBase
        if percept is a premise of r and all the other premises of r are known
            PLForwardChain(conclusion of r)

Page 4: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Sketch of Forward Chaining Algorithm (version 2 drops this assumption)

processPercepts(‘percepts file’)   # uses KBase -- a knowledge base of definite clauses
    for each new percept p
        PLForwardChain(p)          # use a recursive "helper function"

PLForwardChain(percept)
    if percept is already in KBase, return
    else add percept to KBase
    for r in rules where conclusion of r is not already in KBase
        if percept is a premise of r and allTrue(premises of r)   # all premises are known or provable
            PLForwardChain(conclusion of r)

allTrue(premises)
    for p in premises
        if p is already in KBase, continue
        if p is not provable, return false
    return true
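The sketches above are pseudocode; the following is a minimal runnable Python rendering of the same idea for a propositional (variable-free) KB. The fact names, the rule list, and the function names are illustrative assumptions, not the required interface for hw3.

# Illustrative forward chaining over definite clauses, mirroring version 2 above.
KBASE = set()                                   # known facts
RULES = [                                       # definite clauses: (set of premises, conclusion)
    ({"breeze_2_1"}, "pit_nearby_2_1"),
    ({"pit_nearby_2_1", "visited_1_1"}, "avoid_2_2"),
]

def pl_forward_chain(percept):
    # If the percept is already known, do nothing; otherwise add it and
    # fire every rule whose premises are now all in the KB.
    if percept in KBASE:
        return
    KBASE.add(percept)
    for premises, conclusion in RULES:
        if conclusion not in KBASE and percept in premises and premises <= KBASE:
            pl_forward_chain(conclusion)

def process_percepts(percepts):
    for p in percepts:
        pl_forward_chain(p)

process_percepts(["visited_1_1", "breeze_2_1"])
print(sorted(KBASE))    # the derived facts pit_nearby_2_1 and avoid_2_2 appear too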

Page 5: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Return to FOL: Meaning and truth (review)

• Sentences of FOL are true with respect to a model and an interpretation

• A model for a FOL language is a “world” of objects (domain elements) and relations among them (compare with a propositional logic model)

• An interpretation I specifies referents for:
  constant symbols → objects
  predicate symbols → relations
  function symbols → functions

• For an atomic sentence, the interpretation I(P(term1, ..., termn)) = true iff the objects I(term1), ..., I(termn) are in the relation I(P)

Page 6: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Meaning and truth in first-order logic (cont.)

• Complex sentences: truth is defined using the same truth tables, e.g. I(S1 ∧ S2) = true iff I(S1) = true and I(S2) = true

• I(∀x [S]) = true iff for every object o in the model, I(S[x/C]) = true when I(C) = o

• I(∃x [S]) = true iff there is at least one object o in the model such that I(S[x/C]) = true when I(C) = o
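To make these truth conditions concrete, here is a small illustrative Python evaluation over a finite model; the domain and the interpretation of Smart below are invented for the example, not taken from the slides.

domain = {"KingJohn", "Richard", "NUS"}         # hypothetical domain of objects
smart = {"KingJohn", "NUS"}                     # I(Smart): the set of objects that are smart

def holds_smart(o):
    return o in smart

# I(∀x Smart(x)): true iff Smart holds of every object in the model
print(all(holds_smart(o) for o in domain))      # False: Richard is not in smart

# I(∃x Smart(x)): true iff Smart holds of at least one object in the model
print(any(holds_smart(o) for o in domain))      # True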

Page 7: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Models for FOL: Example

symbols: constant, relation, function

Page 8: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Quantification examples:

• ∀<variables> <sentence>

Everyone at NU is smart: ∀x [ At(x,NU) ⇒ Smart(x) ]

• ∀x P is true in a model m iff P is true with x being each possible object in the model

• Roughly speaking, equivalent to the conjunction of all instantiations of P:

(At(KingJohn,NU) ⇒ Smart(KingJohn)) ∧ (At(Richard,NU) ⇒ Smart(Richard)) ∧ (At(NUS,NU) ⇒ Smart(NUS))

Page 9: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

A common mistake to avoid

• Typically, ⇒ is the main connective with ∀

• Common mistake: using ∧ as the main connective with ∀:
  ∀x At(x,NU) ∧ Smart(x)
  means “Everyone is at NU and everyone is smart”

Page 10: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Quantification examples (cont.)

• ∃<variables> <sentence>

• Someone at NU is smart: ∃x [ At(x,NU) ∧ Smart(x) ]

• ∃x P is true in a model m iff P is true with x being some possible object in the model

• Roughly speaking, equivalent to the disjunction of all instantiations of P:

(At(KingJohn,NU) ∧ Smart(KingJohn)) ∨ (At(Richard,NU) ∧ Smart(Richard)) ∨ (At(NU,NU) ∧ Smart(NU)) ∨ ...

Page 11: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Another common mistake to avoid

• Typically, ∧ is the main connective with ∃

• Common mistake: using ⇒ as the main connective with ∃:

∃x At(x,NU) ⇒ Smart(x)
is true if there is no one who is at NU!

Page 12: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Properties of quantifiers
• ∀x ∀y is the same as ∀y ∀x
• ∃x ∃y is the same as ∃y ∃x
• ∃x ∀y is not the same as ∀y ∃x
• ∃x ∀y Loves(y, x)
  – “There is a person who is loved by everyone in the world”
• ∀x ∃y Loves(y, x)
  – “Everyone in the world is loved by at least one person”

• Quantifier duality: each can be expressed using the other (cf. De Morgan laws)

• ∀x Likes(x,IceCream) ≡ ¬∃x ¬Likes(x,IceCream)

• ∃x Likes(x,Broccoli) ≡ ¬∀x ¬Likes(x,Broccoli)

Page 13: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Equality

• term1 = term2 is true under a given interpretation if and only if term1 and term2 refer to the same object

• E.g., definition of Sibling in terms of Parent:
  ∀x,y Sibling(x,y) ⇔ [¬(x = y) ∧ ∃m,f [¬(m = f) ∧ Parent(m,x) ∧ Parent(f,x) ∧ Parent(m,y) ∧ Parent(f,y)]]

• We will use a different notation for equality: =(x, y)
  – makes programming simpler

Page 14: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

A model M for the kinship domain
• Individuals: J K L M N O P Q R
• Functions: mom[1] : mom(N) = M
• Relations[arity]
  – fem[1] = {M, Q}
  – par[2] = {[M, N], [N, R], . . }
  – sib[2] = {[M, O], [P, J], [J, P]}

-------------------- Interpretation I --------------------

• Constants: John, Mary, Sue, Tom .. ..
  – I(Mary) = M, I(Sue) = Q, . . .
• Function symbol: Mother, I(Mother) = mom
• Relation symbols: Female, Parent, Sibling
  I(Female) = fem, I(Parent) = par, I(Sibling) = sib
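As an illustration, the model M and interpretation I above can be written down directly in Python and queried by enumeration; the sets and dicts below are an assumed encoding, not part of the slides.

individuals = {"J", "K", "L", "M", "N", "O", "P", "Q", "R"}
fem = {"M", "Q"}                               # fem[1]
par = {("M", "N"), ("N", "R")}                 # par[2]
sib = {("M", "O"), ("P", "J"), ("J", "P")}     # sib[2]
mom = {"N": "M"}                               # mom[1]
I = {"Mary": "M", "Sue": "Q"}                  # interpretation of constant symbols

# Atomic sentence Female(Mary): true iff I(Mary) is in the relation fem
print(I["Mary"] in fem)                        # True

# ∀x,y Sibling(x,y) ⇒ Sibling(y,x), checked by enumerating the finite relation
print(all((y, x) in sib for (x, y) in sib))    # False for this model: [O, M] is not in sib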

Page 15: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Using FOL
The kinship domain:

• Brothers are siblings
  ∀x,y Brother(x,y) ⇒ Sibling(x,y)

• “Sibling” is symmetric
  ∀x,y Sibling(x,y) ⇔ Sibling(y,x)

• One's mother is one's female parent
  ∀m,c =(Mother(c), m) ⇔ (Female(m) ∧ Parent(m,c))

• Some mothers are over 40 years old
  ∃m,x [ =(Mother(x), m) ∧ >(Age(m), 40) ]

Page 16: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Knowledge Engineering: Choice of Representations

• Human(Bob) vs. ISA(Bob, Human)
• Green(B21) vs. Color(B21, Green)

The choice affects the generality at which concepts can be expressed.

Inheritance rule:
  ∀x,y,z ISA(x, y) ∧ ISA(y, z) ⇒ ISA(x, z)

Two blocks are the same color:
  ∃x Color(B21, x) ∧ Color(B22, x)
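For a concrete feel for the inheritance rule, here is a tiny illustrative Python check that applies it to ground ISA facts until no new facts appear; the fact ISA(Human, Mammal) is a hypothetical addition for the example.

isa = {("Bob", "Human"), ("Human", "Mammal")}   # ground ISA facts (the second is hypothetical)

changed = True
while changed:      # apply ∀x,y,z ISA(x,y) ∧ ISA(y,z) ⇒ ISA(x,z) until a fixed point
    changed = False
    for (x, y1) in list(isa):
        for (y2, z) in list(isa):
            if y1 == y2 and (x, z) not in isa:
                isa.add((x, z))
                changed = True

print(("Bob", "Mammal") in isa)                 # True: derived by the inheritance rule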

Page 17: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Informal quiz on use of FOL to represent “common sense” knowledge

• All apples are red
• Some apples are red (“some” means at least one)
• All apples contain (some) worms
• Some apples contain (some) worms
• Every person is mortal
• Every person is male or female (but not both)

Page 18: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Wumpus world in FOL

• First step: define constants, function symbols, predicate symbols to express the facts

• Percept(data, t) means: at step t, the agent perceived data, where data is a vector:
  – [Stench, Breeze, Glitter]
  – Ex: Percept([None, Breeze, None], 2)

• At(Agent, s, t) means: the agent is at square s at step t
  – Ex: At(Agent, [2,1], 2)

Page 19: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Some Wumpus axioms

Axiom for interpreting percepts in context:
  ∀x,t At(Agent, x, t) ∧ Breeze(t) ⇒ Breezy(x)

Definitional axiom:
  ∀s,g,t Percept([s, Breeze, g], t) ⇒ Breeze(t)

Diagnostic axiom:
  ∀x Breezy(x) ⇒ ∃z Adjacent(z, x) ∧ Pit(z)

Causal axiom:
  ∀z Pit(z) ⇒ (∀x Adjacent(z, x) ⇒ Breezy(x))

World model axioms: Adjacent([1,1],[2,1]) etc.
  ∀x,y Adjacent(x, y) ⇔ Adjacent(y, x)

Page 20: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Interacting with FOL KBs
• Suppose a wumpus-world agent is using an FOL KB and perceives a smell and a breeze (but no glitter) at t=5:
  Tell(KB, Percept([Smell, Breeze, None], 5))  – use forward chaining
  Ask(KB, ∃a BestAction(a, 5))  – use backward chaining
  BC Query: does the KB entail some best action at t=5?

• Answer: {a/Shoot} ← substitution (binding list)

• Given a sentence S and a substitution σ, Sσ denotes the result of plugging σ into S; e.g.,
  S = Smarter(x,y)
  σ = {x/Sue, y/Bill}
  Sσ = Smarter(Sue, Bill)

• Ask(KB, S) returns σ such that KB ╞ Sσ

Page 21: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Knowledge engineering in FOL

1. Identify the task
2. Decide on a vocabulary of predicates, functions, and constants (a logical language L)
3. Encode general knowledge about the domain
4. Encode a description of the specific problem instance
5. Pose queries to the inference procedure and get answers
6. Debug the knowledge base
• Assemble the relevant knowledge

Page 22: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Negation by failure and CWA

A closed world is a world where every fact that is not known is false.

Real-world examples: databases

Query: Does American Airlines fly from Boston to Tampa?
If there are no DB records of such flights, the answer is NO.

(consider a Horn clause KB of food “likes”)
Query: Does Sam like cheeseburgers?
  – not a known “fact”
  – not provable by “rules”
  – there is no way to prove he does not, but we say “NO”
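A minimal illustrative Python sketch of this behavior, assuming an invented facts/rules representation: anything that is neither a stored fact nor derivable by a rule is answered NO rather than UNKNOWN.

facts = {"likes(john, pizza)", "likes(mary, pizza)", "likes(sam, icecream)"}
rules = [({"likes(john, pizza)"}, "likes(john, spaghetti)")]   # (premises, conclusion)

def provable(query):
    if query in facts:
        return True
    return any(conclusion == query and all(provable(p) for p in premises)
               for premises, conclusion in rules)

def ask(query):
    # Closed world assumption: "not provable" is reported as NO, not UNKNOWN
    return "YES" if provable(query) else "NO"

print(ask("likes(john, spaghetti)"))     # YES (derived by the rule)
print(ask("likes(sam, cheeseburgers)"))  # NO (not provable, hence false under the CWA)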

Page 23: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Conversion to CNF

• Everyone who loves all animals is loved by someone:
  ∀x [ ∀y [Animal(y) ⇒ Loves(x,y)] ⇒ ∃y Loves(y,x) ]

• 1. Eliminate biconditionals and implications
  ∀x [ ¬∀y [¬Animal(y) ∨ Loves(x,y)] ∨ ∃y Loves(y,x) ]

• 2. Move ¬ inwards: ¬∀x p ≡ ∃x ¬p, ¬∃x p ≡ ∀x ¬p

  ∀x [ ∃y ¬(¬Animal(y) ∨ Loves(x,y)) ∨ ∃y Loves(y,x) ]
  ∀x [ ∃y [Animal(y) ∧ ¬Loves(x,y)] ∨ ∃y Loves(y,x) ]

No more negated quantifiers


Page 24: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Conversion to CNF contd.

3. Rename variables: each quantifier should use a different one
  ∀x [ ∃y [Animal(y) ∧ ¬Loves(x,y)] ∨ ∃y Loves(y,x) ]
  ∀x1 [ ∃y1 [Animal(y1) ∧ ¬Loves(x1,y1)] ∨ ∃y2 Loves(y2,x1) ]

4. Skolemize: each existential variable is replaced by a Skolem function of the enclosing universally quantified variables:
  ∀x1 [ (Animal(F1(x1)) ∧ ¬Loves(x1,F1(x1))) ∨ Loves(F2(x1),x1) ]

5. Drop universal quantifiers:
  (Animal(F1(x)) ∧ ¬Loves(x,F1(x))) ∨ Loves(F2(x),x)

6. Distribute ∨ over ∧:
  (Animal(F1(x)) ∨ Loves(F2(x),x)) ∧ (¬Loves(x,F1(x)) ∨ Loves(F2(x),x))

Page 25: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Class exercise

Page 26: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Inference in FOL – Chapter 9

• Theoretical foundations
  – Inference by universal and existential instantiation
  – Unification
  – Resolution viewed as Generalized Modus Ponens

• Practical implementation (forward and backward chaining)

Page 27: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Notation

A substitution is a set of variable-term pairs: {x/term, y/term, . . . }, often referred to using the symbol θ [theta]. No variable can occur more than once.

For any term or formula A: Subst(θ, A), also written Aθ, is the result of replacing each variable in A with the corresponding term.

A term is a constant symbol, a variable symbol, or a function symbol applied to 0 or more terms.

Def: A ground term is a term with no variables
Def: A ground sentence is a sentence with no free variables
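An illustrative Python version of Subst(θ, A), using an assumed encoding in which terms and formulas are nested tuples and variables are strings beginning with ‘?’:

def subst(theta, a):
    # Replace every variable occurring in term/formula a using substitution theta.
    if isinstance(a, str):
        return theta.get(a, a) if a.startswith("?") else a
    return tuple(subst(theta, part) for part in a)

theta = {"?x": "Sue", "?y": "Bill"}
print(subst(theta, ("Smarter", "?x", "?y")))   # ('Smarter', 'Sue', 'Bill')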

Page 28: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Inference by Universal Instantiation (UI)
• Every instantiation of a universally quantified sentence is entailed by it:

      ∀v α
  ----------------
  Subst({v/g}, α)

  for any variable v and ground term g

• E.g., ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) yields:
  King(John) ∧ Greedy(John) ⇒ Evil(John)
  King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
  King(Father(John)) ∧ Greedy(Father(John)) ⇒ Evil(Father(John))
  . . .

Page 29: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Inference by Existential Instantiation (EI)
• For any sentence α, variable v, and constant symbol k that does not appear elsewhere in the knowledge base:

      ∃v α
  ----------------
  Subst({v/k}, α)

• E.g., ∃x Crown(x) ∧ OnHead(x, John) yields:

  Crown(C1) ∧ OnHead(C1, John)

  provided C1 is a new constant symbol, called a Skolem constant

Page 30: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

To implement universal instantiation

• All humans are mortal
• Jack is human
----------------------------
• Jack is mortal

R1. ∀x Human(x) ⇒ Mortal(x)
F1. Human(Jack)

Let R1 be p ⇒ q, and let F1 be p'.
Modus ponens for FOL: if p and p' unify, conclude q'.
Unify means we can match p and p' by a substitution.
We then apply the same substitution to q to get q'.

Page 31: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Unification
• Def: Two formulas A and B unify if there is a substitution θ such that Aθ = Bθ.
• Ex: To unify A = Knows(John, x) and B = Knows(y, Mary):
  θ = {y/John, x/Mary}
• θ is not unique!
• To unify A = Knows(John, x) and B = Knows(y, z):
  θ1 = {y/John, x/z} or θ2 = {y/John, x/Sue, z/Sue}
• The first unifier is more general than the second.

• There is a single most general unifier (MGU) that is unique up to renaming of variables.
  MGU = {y/John, x/z} or {y/John, z/x}

Page 32: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Most general unifier

• Def: If θ1 is a unifier for formulas A and B, it is a MOST GENERAL UNIFIER (MGU) iff:
  – There is no other unifier θ2 for A and B such that Aθ2 subsumes Aθ1

• A formula F subsumes a formula G if there is a non-trivial substitution σ such that Fσ = G

• Aθ1 = Knows(John, z)    Aθ2 = Knows(John, Sue)
  Aθ1 subsumes Aθ2, therefore θ2 = {y/John, x/Sue, z/Sue} is not a MGU.

  Note: what is σ?

Page 33: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Class Exercise: Unification Examples

p                   q                       θ
Knows(John, x)      Knows(John, Jane)
Knows(John, x)      Knows(y, Barak)
Knows(John, x)      Knows(y, Mother(y))
Knows(John, x)      Knows(x, Barak)

• Unify(α, β) = θ if αθ = βθ

Page 34: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Unification

• Unify(α, β) = θ if αθ = βθ

p                   q                       θ
Knows(John, x)      Knows(John, Jane)       {x/Jane}
Knows(John, x)      Knows(y, Barak)
Knows(John, x)      Knows(y, Mother(y))
Knows(John, x)      Knows(x, Barak)

Page 35: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Unification

• Unify(α, β) = θ if αθ = βθ

p                   q                       θ
Knows(John, x)      Knows(John, Jane)       {x/Jane}
Knows(John, x)      Knows(y, Barak)         {x/Barak, y/John}
Knows(John, x)      Knows(y, Mother(y))
Knows(John, x)      Knows(x, Barak)

Page 36: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Unification

• Unify(α, β) = θ if αθ = βθ

p                   q                       θ
Knows(John, x)      Knows(John, Jane)       {x/Jane}
Knows(John, x)      Knows(y, Barak)         {x/Barak, y/John}
Knows(John, x)      Knows(y, Mother(y))     {y/John, x/Mother(John)}
Knows(John, x)      Knows(x, Barak)

Page 37: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Unification
• Unify(α, β) = θ if αθ = βθ

p                   q                       θ
Knows(John, x)      Knows(John, Jane)       {x/Jane}
Knows(John, x)      Knows(y, Barak)         {x/Barak, y/John}
Knows(John, x)      Knows(y, Mother(y))     {y/John, x/Mother(John)}
Knows(John, x)      Knows(x, Barak)         {fail}

Page 38: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

The unification algorithm (Fig. 9.1)

Page 39: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

The unification algorithm (cont.)
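The algorithm figure (AIMA Fig. 9.1) was not captured in the transcript. The following is a compact illustrative Python sketch in its spirit, reusing the assumed tuple encoding from the Notation slide (variables are strings beginning with ‘?’); it is a sketch, not the textbook's exact pseudocode.

def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def unify(x, y, theta=None):
    # Return a most general unifier extending theta, or None on failure.
    if theta is None:
        theta = {}
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
            if theta is None:
                return None
        return theta
    return None

def unify_var(var, x, theta):
    if var in theta:
        return unify(theta[var], x, theta)
    if is_var(x) and x in theta:
        return unify(var, theta[x], theta)
    if occurs_in(var, x, theta):          # occur check
        return None
    return {**theta, var: x}

def occurs_in(var, x, theta):
    if var == x:
        return True
    if is_var(x) and x in theta:
        return occurs_in(var, theta[x], theta)
    if isinstance(x, tuple):
        return any(occurs_in(var, xi, theta) for xi in x)
    return False

print(unify(("Knows", "John", "?x"), ("Knows", "?y", ("Mother", "?y"))))
# {'?y': 'John', '?x': ('Mother', '?y')}; composing the bindings gives x/Mother(John)
print(unify(("Knows", "John", "?x"), ("Knows", "?x", "Barak")))
# None: this pair fails, matching the {fail} row above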

Page 40: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Application to reasoning

Modus ponens says:
  Given p ⇒ q and p
  Conclude: q

In FOL:
  Given p ⇒ q and p' (where p and p' unify by θ)
  Conclude: qθ

Suppose KB includes:
  ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
  King(John)

This won’t quite work, since here we have p1 ∧ p2 ⇒ q

Page 41: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Generalized Modus Ponens (GMP)
(follows from the resolution rule for FOL)

  p1', p2', … , pn', (p1 ∧ p2 ∧ … ∧ pn ⇒ q)
  ------------------------------------------    where pi'θ = piθ for all i
                     qθ

  p1' is King(John)         p1 is King(x)
  p2' is Greedy(y)          p2 is Greedy(x)
  θ is {x/John, y/John}     q is Evil(x)
  qθ is Evil(John)

• GMP is used with a KB of definite clauses (exactly one positive literal)

• All variables are assumed universally quantified

• How do we get Greedy(y) in our KB?

Page 42: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Soundness of GMP
• Need to show that

  p1', …, pn', (p1 ∧ … ∧ pn ⇒ q) ╞ qθ

  provided that pi'θ = piθ for all i

• Lemma: For any sentence p, we have p ╞ pθ by UI

  1. (p1 ∧ … ∧ pn ⇒ q) ╞ (p1 ∧ … ∧ pn ⇒ q)θ = (p1θ ∧ … ∧ pnθ ⇒ qθ)

  2. p1', …, pn' ╞ p1' ∧ … ∧ pn' ╞ (p1' ∧ … ∧ pn')θ = p1'θ ∧ … ∧ pn'θ

  3. From 1 and 2, and since pi'θ = piθ, qθ follows by ordinary Modus Ponens

Note: you should know the definitions of a sound inference procedure and a complete inference procedure.

Page 43: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Forward Chaining in FOL (with the “explicit knowledge” assumption)

• Assume percepts do not contain variables (they may contain “generated symbol” constants)

• Example: you see an unfamiliar dog in the building:
  Percept: Isa(G33, Dog)
  Assume KB includes: Isa(x, Dog) ⇒ Isa(x, Animal)

• Add the new percept to the KB if not already believed
• If the percept UNIFIES with a rule premise (by some θ) and all the other premises pθ are believed, add qθ to the KB.
  θ is {x/G33} and qθ is Isa(G33, Animal)
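A small self-contained Python sketch of that step, assuming an invented tuple representation (variables are ‘?’-prefixed strings); it handles this single-premise rule and is a simplification of full forward chaining, which would compose substitutions across premises.

kb = set()
rules = [((("Isa", "?x", "Dog"),), ("Isa", "?x", "Animal"))]   # (premises, conclusion)

def match(pattern, fact, theta):
    # Match one premise pattern against a ground fact, extending theta.
    if len(pattern) != len(fact):
        return None
    for p, f in zip(pattern, fact):
        if p.startswith("?"):
            if theta.get(p, f) != f:
                return None
            theta = {**theta, p: f}
        elif p != f:
            return None
    return theta

def tell(percept):
    if percept in kb:
        return
    kb.add(percept)
    for premises, conclusion in rules:
        for premise in premises:
            theta = match(premise, percept, {})
            if theta is None:
                continue
            others = [q for q in premises if q is not premise]
            if all(any(match(q, f, dict(theta)) is not None for f in kb) for q in others):
                tell(tuple(theta.get(t, t) for t in conclusion))

tell(("Isa", "G33", "Dog"))
print(kb)   # contains ('Isa', 'G33', 'Dog') and the derived ('Isa', 'G33', 'Animal')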

Page 44: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Backward Chaining

• Given a definite clause KB and a query q':
  – For any fact q in the KB that unifies with q', return θ (or “YES” if θ = { })
  – For any rule in the KB whose conclusion q unifies with q' by some θ:
    • If the rule’s premises p1θ . . . pnθ can all be proved with a resulting substitution θ', return COMPOSE(θ, θ')
  – If no facts or rules result in a substitution, return “NO”

• Note this can return multiple answers!

Page 45: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Simple examples
• KB: Likes(John, Pizza)
      Likes(Mary, Pizza)
      Likes(Sam, IceCream)

• Query: Likes(John, Pizza)   return YES
• Query: Likes(Sam, Pizza)    return NO (justified by CWA)
• Query: Likes(x, Pizza)      return a list of substitutions: { {x/John}, {x/Mary} }

Page 46: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

More complex examples

• Add to KB: Likes(y, Pizza) ⇒ Likes(y, Spaghetti)
• Query: Likes(John, Spaghetti)
• Query: Likes(Sam, Spaghetti)
• Query: Likes(x, Spaghetti)
• Query: Likes(x, y)
  Set of substitutions:
  { {x/John, y/Pizza}, {x/Mary, y/Pizza}, {x/Sam, y/IceCream}, {x/John, y/Spaghetti}, {x/Mary, y/Spaghetti} }

Page 47: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Backward chaining algorithm

SUBST(COMPOSE(θ1, θ2), p) = SUBST(θ2, SUBST(θ1, p))
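The FOL-BC-Ask figure itself was not captured in the transcript. Below is a compact illustrative Python backward chainer over definite clauses, applied to the Likes KB from the two preceding slides; the tuple encoding, the missing occur check, and the omission of standardizing apart are simplifying assumptions.

FACTS = [("Likes", "John", "Pizza"), ("Likes", "Mary", "Pizza"),
         ("Likes", "Sam", "IceCream")]
RULES = [([("Likes", "?y", "Pizza")], ("Likes", "?y", "Spaghetti"))]   # (premises, conclusion)

def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def unify(x, y, theta):
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify(theta[x], y, theta) if x in theta else {**theta, x: y}
    if is_var(y):
        return unify(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
        return theta
    return None

def subst(theta, t):
    if is_var(t):
        return subst(theta, theta[t]) if t in theta else t
    if isinstance(t, tuple):
        return tuple(subst(theta, u) for u in t)
    return t

def bc_ask(goal, theta):
    # Yield every substitution that proves the goal; no answers means NO (by the CWA).
    goal = subst(theta, goal)
    for fact in FACTS:                        # facts that unify with the goal
        t2 = unify(goal, fact, theta)
        if t2 is not None:
            yield t2
    for premises, conclusion in RULES:        # rules whose conclusion unifies with the goal
        t2 = unify(goal, conclusion, theta)
        if t2 is not None:
            yield from bc_and(premises, t2)

def bc_and(goals, theta):
    if not goals:
        yield theta
    else:
        for t2 in bc_ask(goals[0], theta):
            yield from bc_and(goals[1:], t2)

print(list(bc_ask(("Likes", "John", "Pizza"), {})))      # [{}]  -> YES
print(list(bc_ask(("Likes", "Sam", "Spaghetti"), {})))   # []    -> NO (by the CWA)
for answer in bc_ask(("Likes", "?x", "Spaghetti"), {}):
    print(subst(answer, "?x"))                           # John, then Mary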

Page 48: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Backward Chaining: Example knowledge base

• The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.

• Prove that Col. West is a criminal

Page 49: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Example knowledge base contd.

... it is a crime for an American to sell weapons to hostile nations:
  American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)

Nono ... has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
  Owns(Nono, M1) and Missile(M1)

... all of its missiles were sold to it by Colonel West:
  Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)

Missiles are weapons:
  Missile(x) ⇒ Weapon(x)

An enemy of America counts as “hostile”:
  Enemy(x,America) ⇒ Hostile(x)

West, who is American ...:
  American(West)

The country Nono, an enemy of America ...:
  Enemy(Nono,America)

Page 50: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Backward chaining example (Pages 50-57 develop the proof tree step by step; the figures were not captured in the transcript)


Page 58: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Properties of backward chaining
• Depth-first recursive proof search: space is linear in the size of the proof

• Incomplete due to infinite loops
  – fix by checking the current goal against every goal on the stack

• Inefficient due to repeated subgoals (both success and failure)
  – fix using caching of previous results (extra space)

• Widely used for logic programming

Page 59: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Resolution: brief summary

• Full first-order version:

  l1 ∨ ··· ∨ lk,          m1 ∨ ··· ∨ mn
  --------------------------------------------------------------------------
  (l1 ∨ ··· ∨ li-1 ∨ li+1 ∨ ··· ∨ lk ∨ m1 ∨ ··· ∨ mj-1 ∨ mj+1 ∨ ··· ∨ mn)θ

  where Unify(li, ¬mj) = θ for some i, j

• The two clauses are assumed to be standardized apart so that they share no variables. For example,

  ¬Rich(x) ∨ Unhappy(x)
  Rich(Ken)
  ------------------------
  Unhappy(Ken)

  with θ = {x/Ken}

• Apply resolution steps to CNF(KB ∧ ¬α); complete for FOL
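For concreteness, here is a small illustrative Python rendering of a single resolution step applied to the Rich/Unhappy example; clauses are lists of literals, negative literals are ('not', atom), variables are ‘?’-prefixed strings, and the occur check and standardizing apart are omitted. This encoding is an assumption for the sketch, not part of the course code.

def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def unify(x, y, theta):
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify(theta[x], y, theta) if x in theta else {**theta, x: y}
    if is_var(y):
        return unify(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
        return theta
    return None

def subst(theta, t):
    if is_var(t):
        return subst(theta, theta[t]) if t in theta else t
    if isinstance(t, tuple):
        return tuple(subst(theta, u) for u in t)
    return t

def resolve(c1, c2):
    # Return all resolvents obtained by unifying a literal of c1 with the
    # complement of a literal of c2 (or vice versa).
    resolvents = []
    for li in c1:
        for mj in c2:
            if li[0] == "not" and mj[0] != "not":
                theta = unify(li[1], mj, {})
            elif mj[0] == "not" and li[0] != "not":
                theta = unify(li, mj[1], {})
            else:
                continue
            if theta is not None:
                rest = [l for l in c1 if l is not li] + [m for m in c2 if m is not mj]
                resolvents.append([subst(theta, l) for l in rest])
    return resolvents

# ¬Rich(x) ∨ Unhappy(x)  and  Rich(Ken)  resolve to Unhappy(Ken) with θ = {x/Ken}
c1 = [("not", ("Rich", "?x")), ("Unhappy", "?x")]
c2 = [("Rich", "Ken")]
print(resolve(c1, c2))   # [[('Unhappy', 'Ken')]]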

Page 60: CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 26, 2012.

Resolution proof: definite clauses