Proof Methods for Propositional Logic
Russell and Norvig Chapter 7
CS440, Fall 2015
Logical equivalence
- Two sentences are logically equivalent iff they are true in the same models: α ≡ β iff α ⊨ β and β ⊨ α.
Validity and satisfiability
- A sentence is valid (a tautology) if it is true in all models.
  e.g., True, A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B
- Validity is connected to inference via the Deduction Theorem:
  KB ⊨ α if and only if (KB ⇒ α) is valid.
- A sentence is satisfiable if it is true in some model.
  e.g., A ∨ B
- A sentence is unsatisfiable if it is false in all models.
  e.g., A ∧ ¬A
- Satisfiability is connected to inference via the following:
  KB ⊨ α if and only if (KB ∧ ¬α) is unsatisfiable (known as proof by contradiction; a model-checking sketch follows).
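Since KB ⊨ α holds exactly when KB ∧ ¬α is unsatisfiable, entailment can be checked mechanically by enumerating all models over the symbols. A minimal Python sketch, assuming sentences are encoded as Boolean functions of a model (the encoding and the name tt_entails are illustrative choices, not from the slides):

from itertools import product

def tt_entails(kb, alpha, symbols):
    """Check KB |= alpha by enumerating every model over `symbols`.
    kb and alpha map a model (dict: symbol -> bool) to True/False."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False  # a model of KB in which alpha is false
    return True

# Example: KB = {A => B, A} entails B.
kb = lambda m: (not m["A"] or m["B"]) and m["A"]
print(tt_entails(kb, lambda m: m["B"], ["A", "B"]))  # True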
Inference rules
- Modus Ponens: from A ⇒ B and A, infer B.
  Example: "raining implies soggy courts", "raining"; infer: "soggy courts".
Example
- KB: {A ⇒ B, B ⇒ C, A}
- Is C entailed? Yes.

  Given         Rule           Inferred
  A ⇒ B, A      Modus Ponens   B
  B ⇒ C, B      Modus Ponens   C
Inference rules (cont.)
- Modus Tollens: from A ⇒ B and ¬B, infer ¬A.
  Example: "raining implies soggy courts", "courts not soggy"; infer: "not raining".
- And-elimination: from A ∧ B, infer A.
Reminder: The Wumpus World
- Performance measure
  - gold: +1000, death: -1000
  - -1 per step, -10 for using the arrow
- Environment
  - Squares adjacent to the wumpus are smelly
  - Squares adjacent to a pit are breezy
  - Glitter iff gold is in the same square
  - Shooting kills the wumpus if you are facing it
  - Shooting uses up the only arrow
  - Grabbing picks up the gold if in the same square
- Sensors: Stench, Breeze, Glitter, Bump
- Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
- Recall that when we were at (2,1) we could not decide on a safe move, so we backtracked and explored (1,2), which yielded ¬B1,2. This yields ¬P2,2 ∧ ¬P1,3.
Example: convert B1,1 ⇔ (P1,2 ∨ P2,1) to CNF:
1. Eliminate ⇔: (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
2. Eliminate ⇒: (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)
3. Move ¬ inwards using de Morgan's rules and double-negation: (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)
4. Apply the distributive law (∨ over ∧) and flatten: (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
Converting to CNF
- Every sentence can be converted to CNF (a code sketch follows the list):
  1. Replace α ⇔ β with (α ⇒ β) ∧ (β ⇒ α)
  2. Replace α ⇒ β with (¬α ∨ β)
  3. Move ¬ "inward":
     1. Replace ¬(¬α) with α
     2. Replace ¬(α ∧ β) with (¬α ∨ ¬β)
     3. Replace ¬(α ∨ β) with (¬α ∧ ¬β)
  4. Replace (α ∨ (β ∧ γ)) with (α ∨ β) ∧ (α ∨ γ)
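The four steps translate directly into a recursive procedure. A compact Python sketch, assuming formulas are encoded as strings (symbols) and nested tuples (the encoding and the name to_cnf are my own, not the course's):

def to_cnf(s):
    """Convert a formula to CNF. A formula is a string (symbol) or a tuple:
    ('not', p), ('and', p, q), ('or', p, q), ('=>', p, q), ('<=>', p, q)."""
    if isinstance(s, str):
        return s
    op = s[0]
    if op == '<=>':                          # step 1: eliminate biconditional
        p, q = s[1], s[2]
        return to_cnf(('and', ('=>', p, q), ('=>', q, p)))
    if op == '=>':                           # step 2: eliminate implication
        return to_cnf(('or', ('not', s[1]), s[2]))
    if op == 'not':                          # step 3: move negation inward
        p = s[1]
        if isinstance(p, str):
            return s                         # already a literal
        if p[0] == 'not':
            return to_cnf(p[1])              # double negation
        if p[0] == 'and':                    # de Morgan
            return to_cnf(('or', ('not', p[1]), ('not', p[2])))
        if p[0] == 'or':                     # de Morgan
            return to_cnf(('and', ('not', p[1]), ('not', p[2])))
        return to_cnf(('not', to_cnf(p)))    # rewrite =>/<=> inside, retry
    if op == 'and':
        return ('and', to_cnf(s[1]), to_cnf(s[2]))
    if op == 'or':                           # step 4: distribute ∨ over ∧
        p, q = to_cnf(s[1]), to_cnf(s[2])
        if not isinstance(p, str) and p[0] == 'and':
            return ('and', to_cnf(('or', p[1], q)), to_cnf(('or', p[2], q)))
        if not isinstance(q, str) and q[0] == 'and':
            return ('and', to_cnf(('or', p, q[1])), to_cnf(('or', p, q[2])))
        return ('or', p, q)

# The example above: B1,1 <=> (P1,2 or P2,1)
print(to_cnf(('<=>', 'B11', ('or', 'P12', 'P21'))))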
Converting to CNF (II)
- While converting expressions, note that:
  - ((α ∨ β) ∨ γ) is equivalent to (α ∨ β ∨ γ)
  - ((α ∧ β) ∧ γ) is equivalent to (α ∧ β ∧ γ)
- Why does this algorithm work?
  - Because ⇒ and ⇔ are eliminated
  - Because ¬ is always directly attached to literals
  - Because what is left must be ∧'s and ∨'s, and ∨ can be distributed over ∧ to produce CNF clauses
Using resolution
- Even if our KB entails a sentence α, resolution is not guaranteed to produce α.
- To get around this we use proof by contradiction, i.e., show that KB ∧ ¬α is unsatisfiable.
- Resolution is complete with respect to proof by contradiction.
- When resolution yields the empty clause, we have derived a contradiction: the empty clause is False, since a disjunction is True only if at least one of its disjuncts is true.
Automated Theorem Proving
- How do we automate the inference process?
  - Step 1: Assume the negation of the consequent and add it to the knowledge base.
  - Step 2: Convert the KB to CNF, i.e., a collection of disjunctive clauses.
  - Step 3: Repeatedly apply resolution until:
    - it produces an empty clause (contradiction), in which case the consequent is proven, or
    - no more terms can be resolved, in which case the consequent cannot be proven.
Resolution Algorithm
function PL-RESOLUTION(KB, α) returns true or false
  inputs: KB, the knowledge base, a sentence in propositional logic
          α, the query, a sentence in propositional logic
  clauses ← the set of clauses in the CNF representation of KB ∧ ¬α
  new ← {}
  loop do
    for each pair of clauses Ci, Cj in clauses do
      resolvents ← PL-RESOLVE(Ci, Cj)
      if resolvents contains the empty clause then return true
      new ← new ∪ resolvents
    if new ⊆ clauses then return false
    clauses ← clauses ∪ new
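A direct Python transcription of PL-RESOLUTION, under the assumption that a clause is a frozenset of literal strings and a negative literal is prefixed with '~' (an encoding of my choosing, not the slides'):

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def pl_resolve(ci, cj):
    """Return all resolvents of clauses ci and cj."""
    return [(ci - {lit}) | (cj - {negate(lit)})
            for lit in ci if negate(lit) in cj]

def pl_resolution(kb_clauses, negated_query_clauses):
    """True iff resolution derives the empty clause from KB and ~alpha."""
    clauses = set(kb_clauses) | set(negated_query_clauses)
    new = set()
    while True:
        for ci, cj in combinations(clauses, 2):
            for r in pl_resolve(ci, cj):
                if not r:                    # empty clause: contradiction
                    return True
                new.add(frozenset(r))
        if new <= clauses:                   # nothing new: cannot prove alpha
            return False
        clauses |= new

# KB = {A => B, B => C, A}; query C (add ~C, look for a contradiction).
kb = [frozenset({'~A', 'B'}), frozenset({'~B', 'C'}), frozenset({'A'})]
print(pl_resolution(kb, [frozenset({'~C'})]))  # True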
Another example
- If it rains, I get wet.
- If I'm wet, I get mad.
- Given that I'm not mad, prove that it's not raining.
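One way the refutation can go, writing R for "raining", W for "wet", and M for "mad" (symbol names are mine): the KB in clause form is {¬R ∨ W, ¬W ∨ M, ¬M}, and negating the goal ¬R adds the clause R. Resolving R with ¬R ∨ W gives W; resolving W with ¬W ∨ M gives M; resolving M with ¬M gives the empty clause, so ¬R is entailed.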
Inference for Horn clauses
- Horn Form (special form of CNF): KB = conjunction of Horn clauses.
  Horn clause = disjunction of literals of which at most one is positive.
  Example: C ∨ ¬B ∨ ¬A
- Modus Ponens is a natural way to make inferences in Horn KBs (recall α ⇒ β is equivalent to ¬α ∨ β):
  from α1, …, αn and α1 ∧ … ∧ αn ⇒ β, infer β.
- Successive application of Modus Ponens leads to algorithms that are sound and complete, and run in time linear in the size of the KB.
Forward chaining
- Idea: fire any rule whose premises are satisfied in the KB and add its conclusion to the KB, until the query is found.
[Figures: a forward chaining example traced step by step on an AND-OR graph; images omitted from the transcript.]
Forward chaining algorithm
Forward chaining is sound and complete for Horn KBs.

function PL-FC-ENTAILS?(KB, q) returns true or false
  inputs: KB, the knowledge base, a set of propositional definite clauses
          q, the query, a propositional symbol
  count ← a table, where count[c] is the number of symbols in c's premise
  inferred ← a table, where inferred[s] is initially false for all symbols
  agenda ← a queue of symbols, initially symbols known to be true in KB
  while agenda is not empty do
    p ← POP(agenda)
    if p = q then return true
    if inferred[p] = false then
      inferred[p] ← true
      for each clause c in KB where p is in c.PREMISE do
        decrement count[c]
        if count[c] = 0 then add c.CONCLUSION to agenda
  return false
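The same algorithm in Python, assuming a definite clause is encoded as a (premises, conclusion) pair of symbol strings (my encoding, not the slides'):

def pl_fc_entails(clauses, facts, q):
    """Forward chaining over definite clauses.
    clauses: list of (premises, conclusion); facts: symbols known true."""
    count = {i: len(prem) for i, (prem, _) in enumerate(clauses)}
    inferred = set()
    agenda = list(facts)
    while agenda:
        p = agenda.pop()
        if p == q:
            return True
        if p not in inferred:
            inferred.add(p)
            for i, (premises, conclusion) in enumerate(clauses):
                if p in premises:
                    count[i] -= 1            # one fewer premise outstanding
                    if count[i] == 0:
                        agenda.append(conclusion)
    return False

# The book's running example: P=>Q, L^M=>P, B^L=>M, A^P=>L, A^B=>L; facts A, B.
rules = [(['P'], 'Q'), (['L', 'M'], 'P'), (['B', 'L'], 'M'),
         (['A', 'P'], 'L'), (['A', 'B'], 'L')]
print(pl_fc_entails(rules, ['A', 'B'], 'Q'))  # True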
Backward chaining
- aka goal-directed reasoning
- Idea: work backwards from the query q:
  - check if q is known already, or
  - prove by backward chaining all premises of some rule concluding q
- Avoid loops: check if a new subgoal is already on the goal stack.
- Avoid repeated work: check if a new subgoal has already been proved true, or has already failed. (A sketch follows this list.)
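A recursive Python sketch of this idea, using the same (premises, conclusion) clause encoding as above and a goal stack to avoid loops (memoization of proved and failed subgoals is omitted for brevity):

def pl_bc_entails(clauses, facts, q, stack=frozenset()):
    """Backward chaining: try to prove q from facts and definite clauses."""
    if q in facts:
        return True                          # q is known already
    if q in stack:
        return False                         # q is already a pending subgoal: loop
    for premises, conclusion in clauses:
        if conclusion == q and all(
                pl_bc_entails(clauses, facts, p, stack | {q})
                for p in premises):
            return True
    return False

rules = [(['P'], 'Q'), (['L', 'M'], 'P'), (['B', 'L'], 'M'),
         (['A', 'P'], 'L'), (['A', 'B'], 'L')]
print(pl_bc_entails(rules, {'A', 'B'}, 'Q'))  # True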
[Figures: a backward chaining example traced step by step; images omitted from the transcript.]
Forward vs. backward chaining
- FC is data-driven; it may do lots of work that is irrelevant to the goal.
- BC is goal-driven, appropriate for problem-solving.
  e.g., What courses do I need to take to graduate? How do I get into a PhD program?
- The complexity of BC can be much less than linear in the size of the KB.
Efficient propositional inference
Two families of efficient algorithms for satisfiability:
- Complete backtracking search algorithms:
  - DPLL algorithm (Davis, Putnam, Logemann, Loveland)
- Local search algorithms:
  - WalkSAT algorithm (a sketch follows this list):
    - Start with a random assignment.
    - At each iteration pick an unsatisfied clause and pick a symbol in the clause to flip; alternate between:
      - picking the symbol that minimizes the number of unsatisfied clauses, and
      - picking a random symbol.
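A Python sketch of WalkSAT as described above, with a noise parameter p deciding between the random and the greedy flip (the clause encoding and all names are my own illustrative choices):

import random

def walksat(clauses, p=0.5, max_flips=10000):
    """clauses: list of frozensets of literals such as 'A' or '~A'.
    Returns a satisfying model, or None if none is found in max_flips."""
    symbols = {lit.lstrip('~') for c in clauses for lit in c}
    model = {s: random.choice([True, False]) for s in symbols}

    def satisfied(clause):
        return any(model[l.lstrip('~')] != l.startswith('~') for l in clause)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return model                     # all clauses satisfied
        clause = random.choice(unsat)
        if random.random() < p:              # random-walk move
            sym = random.choice(sorted(clause)).lstrip('~')
        else:                                # greedy move: flip that breaks least
            def broken(s):
                model[s] = not model[s]
                n = sum(not satisfied(c) for c in clauses)
                model[s] = not model[s]
                return n
            sym = min((l.lstrip('~') for l in clause), key=broken)
        model[sym] = not model[sym]
    return None

# (A or B) and (~A or B): satisfiable with B = True.
print(walksat([frozenset({'A', 'B'}), frozenset({'~A', 'B'})]))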
Summary
- Logical agents apply inference to a knowledge base to derive new information and make decisions.
- Basic concepts of logic:
  - syntax: formal structure of sentences
  - semantics: truth of sentences with respect to models
  - entailment: truth of one sentence given another
  - inference: deriving sentences from other sentences
  - soundness: derivations produce only entailed sentences
  - completeness: derivations can produce all entailed sentences
- Resolution is complete for propositional logic.
- Forward and backward chaining are linear-time and complete for Horn clauses.
- Propositional logic lacks expressive power.