3/7/2012
Jurusan Teknik Informatika – Universitas Widyatama
Reasoning (First Order Logic)
Pertemuan (Session): 6
Dosen Pembina (Instructors): Sriyani Violina, Danang Junaedi, Susetyo Bagas Baskoro

Overview
• Description
• First-order logic (review)
– Properties, relations, functions, quantifiers, …
– Terms, sentences, wffs, axioms, theories, proofs, …
• Extensions to first-order logic
• Logical agents
– Reflex agents
– Representing change: situation calculus, frame problem
– Preferences on actions
– Goal-based agents

Deskripsi (Description)
• This session covers how to solve problems using reasoning techniques.
• The reasoning method discussed in this session is first-order logic.

First-order logic
• First-order logic (FOL) models the world in terms of
– Objects, which are things with individual identities
– Properties of objects that distinguish them from other objects
– Relations that hold among sets of objects
– Functions, which are a subset of relations where there is only one “value” for any given “input”
• Examples:
– Objects: students, lectures, companies, cars, …
– Properties: blue, oval, even, large, …
– Relations: brother-of, bigger-than, outside, part-of, has-color, occurs-after, owns, visits, precedes, …
– Functions: father-of, best-friend, second-half, one-more-than, …
• Constant symbols, which represent individuals in the world
– Mary
– 3
– Green
• Predicate symbols, which map individuals to truth values
– greater(5,3)
– green(Grass)
– color(Grass, Green)
Fungsi (Functions)
• Function symbols, which map individuals to individuals
– father-of(Mary) = John
– color-of(Sky) = Blue
• Functions can also be used in combination with predicates.
FOL Provides
• Variable symbols
– E.g., x, y, foo
• Connectives
– Same as in PL: not (¬), and (∧), or (∨), implies (⇒), if and only if (biconditional ⇔)
• Quantifiers
– Universal: ∀x or (Ax)
– Existential: ∃x or (Ex)
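On a small, explicitly listed universe the two quantifiers reduce to Python's `all` and `any`. This is only an illustrative finite-domain sketch (real FOL domains may be infinite); the universe and predicates below are made up.

```python
universe = [1, 2, 3, 4, 5]

def forall(pred, domain):
    """Universal quantifier (Ax) P(x), restricted to a finite domain."""
    return all(pred(x) for x in domain)

def exists(pred, domain):
    """Existential quantifier (Ex) P(x), restricted to a finite domain."""
    return any(pred(x) for x in domain)

print(forall(lambda x: x > 0, universe))       # True: every element is positive
print(exists(lambda x: x % 2 == 0, universe))  # True: some element is even
```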
Contoh (Example)
• Given the following two statements:
1. Jono likes Rebeca
2. Dani also likes Rebeca
• In FOL, the two statements above become:
1. Suka(Jono, Rebeca)
2. Suka(Dani, Rebeca)
• In the two predicates above, two people like Rebeca. To state that there is jealousy between them, we add a statement: if x likes y and z also likes y, then x and z will not like each other, or in FOL:
– Suka(x, y) ∧ Suka(z, y) ⇒ ¬Suka(x, z)
• The knowledge implicit in this predicate calculus rule is: if two people like the same person, then those two people must not like each other (they dislike each other).
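The jealousy rule above can be sketched as one forward-chaining step over a small fact base. This is an illustrative sketch, not code from the lecture; the fact base matches the Jono/Dani example.

```python
# Facts: Suka(Jono, Rebeca) and Suka(Dani, Rebeca), stored as (liker, liked) pairs.
likes = {("Jono", "Rebeca"), ("Dani", "Rebeca")}

def inferred_dislikes(likes):
    """Apply Suka(x,y) ∧ Suka(z,y) ⇒ ¬Suka(x,z): collect every pair (x, z)
    of distinct people who like the same person y."""
    out = set()
    for (x, y1) in likes:
        for (z, y2) in likes:
            if y1 == y2 and x != z:
                out.add((x, z))
    return out

print(sorted(inferred_dislikes(likes)))
# [('Dani', 'Jono'), ('Jono', 'Dani')] — each is inferred not to like the other
```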
Sentences are built from terms and atoms
• A term (denoting a real-world individual) is a constant symbol, a variable symbol, or an n-place function of n terms. x and f(x1, ..., xn) are terms, where each xi is a term.
A term with no variables is a ground term
• An atomic sentence (which has value true or false) is an n-place predicate of n terms
• A complex sentence is formed from atomic sentences connected by the logical connectives: ¬P, P∨Q, P∧Q, P ⇒ Q, P ⇔ Q, where P and Q are sentences
• A quantified sentence adds quantifiers ∀ and ∃
• A well-formed formula (wff) is a sentence containing no “free” variables. That is, all variables are “bound” by universal or existential quantifiers.
(∀x)P(x,y) has x bound as a universally quantified variable, but y is free.
X is above Y iff X is on directly on top of Y or there is a pile of one or more other objects directly on top of one another starting with X and ending with Y.
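The Above definition can be sketched as a recursive check over a direct "on" relation. This is an illustrative sketch under the assumption that "on" is given as a finite set of (upper, lower) pairs; the block names are made up.

```python
# Direct "on" facts: A on B, B on C, C on the table.
on = {("A", "B"), ("B", "C"), ("C", "Table")}

def above(x, y, on):
    """X is above Y iff X is directly on Y, or X is directly on some Z
    that is itself above Y (the pile case in the definition)."""
    if (x, y) in on:
        return True
    return any(above(z, y, on) for (w, z) in on if w == x)

print(above("A", "C", on))   # True: A -> B -> C
print(above("C", "A", on))   # False
```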
• Systems that reason with causal rules are called model-based reasoning systems.
– Diagnostic rules infer the presence of hidden properties directly from the percept-derived information. We have already seen two diagnostic rules:
(∀ l,s) At(Agent,l,s) ∧ Breeze(s) ⇒ Breezy(l)
(∀ l,s) At(Agent,l,s) ∧ Stench(s) ⇒ Smelly(l)
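The two diagnostic rules above amount to one forward-chaining step: given the agent's location and its percepts in the current situation, conclude the hidden properties of that square. A minimal sketch (the percept names follow the rules; everything else is illustrative):

```python
def diagnose(location, percepts):
    """Apply the two diagnostic rules: from At(Agent, l, s) plus a Breeze
    or Stench percept in s, conclude Breezy(l) or Smelly(l)."""
    conclusions = set()
    if "Breeze" in percepts:
        conclusions.add(("Breezy", location))
    if "Stench" in percepts:
        conclusions.add(("Smelly", location))
    return conclusions

print(sorted(diagnose((1, 2), {"Breeze", "Stench"})))
# [('Breezy', (1, 2)), ('Smelly', (1, 2))]
```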
Representing change: the frame problem
• Frame axioms: if property x doesn’t change as a result of applying action a in state s, then it stays the same.
– On(x, z, s) ∧ Clear(x, s) ⇒ On(x, table, Result(Move(x, table), s)) ∧ ¬On(x, z, Result(Move(x, table), s))
– On(y, z, s) ∧ y ≠ x ⇒ On(y, z, Result(Move(x, table), s))
– The proliferation of frame axioms becomes very cumbersome in complex domains.
The frame problem II
• Successor-state axiom: a general statement that characterizes every way in which a particular predicate can become true: either it can be made true, or it can already be true and not be changed:
– On(x, table, Result(a, s)) ⇔ [On(x, z, s) ∧ Clear(x, s) ∧ a = Move(x, table)] ∨ [On(x, table, s) ∧ a ≠ Move(x, z)]
• In complex worlds, where you want to reason about longer chains of action, even these types of axioms are too cumbersome.
– Planning systems use special-purpose inference methods to reason about the expected state of the world at any point in time during a multi-step plan.
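The successor-state axiom for On(x, table, …) can be sketched directly: after action a in state s, x is on the table iff a moved x to the table (and x was clear), or x was already on the table and a did not move x away. The state encoding (a dict mapping each block to its supporting surface) and the action tuples are illustrative assumptions.

```python
def on_table_after(x, state, action):
    """Evaluate the successor-state axiom for On(x, table, Result(action, state))."""
    clear = x not in state.values()                 # Clear(x, s): nothing sits on x
    moved_to_table = action == ("Move", x, "table")  # a = Move(x, table)
    moved_away = (action[0] == "Move" and action[1] == x
                  and action[2] != "table")          # a = Move(x, z), z != table
    return (moved_to_table and clear) or (state.get(x) == "table" and not moved_away)

state = {"A": "B", "B": "table"}                    # A on B, B on the table
print(on_table_after("A", state, ("Move", "A", "table")))  # True: A is clear and moved
print(on_table_after("B", state, ("Move", "A", "table")))  # True: B stays where it was
print(on_table_after("B", state, ("Move", "B", "C")))      # False: B was moved away
```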
Qualification problem
• Qualification problem:
– How can you possibly characterize every single effect of an action, or every single exception that might occur?
– When I put my bread into the toaster, and push the button, it will become toasted after two minutes, unless…
• The toaster is broken, or…
• The power is out, or…
• I blow a fuse, or…
• A neutron bomb explodes nearby and fries all electrical components, or…
• A meteor strikes the earth, and the world as we know it ceases to exist, or…
Ramification problem
• Similarly, it’s just about impossible to characterize every side effect of every action, at every possible level of detail:
– When I put my bread into the toaster, and push the button, the bread will become toasted after two minutes, and…
• The crumbs that fall off the bread onto the bottom of the toaster oven tray will also become toasted, and…
• Some of the aforementioned crumbs will become burnt, and…
• The outside molecules of the bread will become “toasted,” and…
• The inside molecules of the bread will remain more “breadlike,” and…
• The toasting process will release a small amount of humidity into the air because of evaporation, and…
• The heating elements will become a tiny fraction more likely to burn out the next time I use the toaster, and…
• The electricity meter in the house will move up slightly, and…
Knowledge engineering!
• Modeling the “right” conditions and the “right” effects at the “right” level of abstraction is very difficult
• Knowledge engineering (creating and maintaining knowledge bases for intelligent reasoning) is an entire field of investigation
• Many researchers hope that automated knowledge acquisition and machine learning tools can fill the gap:
– Our intelligent systems should be able to learn about the conditions and effects, just like we do!
– Our intelligent systems should be able to learn when to pay attention to, or reason about, certain aspects of processes, depending on the context!
Preferences among actions
• A problem with the Wumpus world knowledge base that we have built so far is that it is difficult to decide which action is best among a number of possibilities.
• For example, to decide between a forward and a grab, axioms describing when it is OK to move to a square would have to mention glitter.
• This is not modular!
• We can solve this problem by separating facts about actions from facts about goals. This way our agent can be reprogrammed just by asking it to achieve different goals.
Preferences among actions
• The first step is to describe the desirability of actions independently of each other.
• In doing this we will use a simple scale: actions can be Great, Good, Medium, Risky, or Deadly.
• We use this action quality scale in the following way.
• Until it finds the gold, the basic strategy for our agent is:
– Great actions include picking up the gold when found and climbing out of the cave with the gold.
– Good actions include moving to a square that’s OK and hasn’t been visited yet.
– Medium actions include moving to a square that is OK and has already been visited.
– Risky actions include moving to a square that is not known to be deadly or OK.
– Deadly actions are moving into a square that is known to have a pit or a Wumpus.
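The action-quality scale above can be sketched as a ranking over candidate moves: classify each reachable square, then pick the action with the best rank. The square statuses and classification logic are illustrative assumptions, not the lecture's actual agent code.

```python
# Lower rank = better, following the Great > Good > Medium > Risky > Deadly scale.
RANK = {"Great": 0, "Good": 1, "Medium": 2, "Risky": 3, "Deadly": 4}

def classify(square):
    """Classify a move into a square using the lecture's quality scale."""
    if square["known_deadly"]:
        return "Deadly"                    # known pit or Wumpus
    if not square["known_ok"]:
        return "Risky"                     # not known to be deadly or OK
    return "Medium" if square["visited"] else "Good"

candidates = {
    (1, 2): {"known_ok": True,  "known_deadly": False, "visited": False},
    (2, 1): {"known_ok": True,  "known_deadly": False, "visited": True},
    (2, 2): {"known_ok": False, "known_deadly": False, "visited": False},
}
best = min(candidates, key=lambda sq: RANK[classify(candidates[sq])])
print(best)   # (1, 2): the unvisited OK square is the Good option
```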
Goal-based agents
• Once the gold is found, it is necessary to change strategies, so now we need a new set of action values.
• We could encode this as a rule:
– (∀s) Holding(Gold, s) ⇒ GoalLocation([1,1], s)
• We must now decide how the agent will work out a sequence of actions to accomplish the goal.
• Three possible approaches are:
– Inference: good versus wasteful solutions
– Search: make a problem with operators and a set of states
– Planning: to be discussed later
Example: Hoofers Club
• Problem statement: Tony, Shi-Kuo and Ellen belong to the Hoofers Club. Every member of the Hoofers Club is either a skier or a mountain climber or both. No mountain climber likes rain, and all skiers like snow. Ellen dislikes whatever Tony likes and likes whatever Tony dislikes. Tony likes rain and snow.
• Query: Is there a member of the Hoofers Club who is a mountain climber but not a skier?
Example: Hoofers Club…
• Translation into FOL sentences
• Let S(x) mean x is a skier, M(x) mean x is a mountain climber, and L(x,y) mean x likes y, where the domain of the first variable is Hoofers Club members, and the domain of the second variable is snow and rain. We can now translate the above English sentences into the following FOL wffs:
1. (Ax) S(x) v M(x)
2. ¬(Ex) M(x) ^ L(x, Rain)
3. (Ax) S(x) => L(x, Snow)
4. (Ay) L(Ellen, y) <=> ¬L(Tony, y)
5. L(Tony, Rain)
6. L(Tony, Snow)
7. Query: (Ex) M(x) ^ ¬S(x)
8. Negation of the query: ¬(Ex) M(x) ^ ¬S(x)
Example: Hoofers Club…
• Conversion to clause form
1. S(x1) v M(x1)
2. ¬M(x2) v ¬L(x2, Rain)
3. ¬S(x3) v L(x3, Snow)
4. ¬L(Tony, x4) v ¬L(Ellen, x4)
5. L(Tony, x5) v L(Ellen, x5)
6. L(Tony, Rain)
7. L(Tony, Snow)
8. Negation of the query: ¬M(x7) v S(x7)
Example: Hoofers Club…
• Resolution refutation proof

Clause 1   Clause 2   Resolvent            MGU (i.e., theta)
8          1          9. S(x1)             {x7/x1}
9          3          10. L(x1, Snow)      {x3/x1}
10         4          11. ¬L(Tony, Snow)   {x4/Snow, x1/Ellen}
11         7          12. False            {}
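One resolution step with unification is enough to reproduce the first row of the table: resolving ¬M(x7) v S(x7) against S(x1) v M(x1) on the complementary M literals under the MGU {x7/x1}. The sketch below is simplified and illustrative: literals are (sign, predicate, argument) triples with a single argument, and variables are any name starting with a lowercase letter.

```python
def unify_term(a, b, theta):
    """Unify two terms, each a variable (lowercase) or a constant."""
    a, b = theta.get(a, a), theta.get(b, b)
    if a == b:
        return theta
    if a[0].islower():                      # a is a variable: bind it to b
        return {**theta, a: b}
    if b[0].islower():                      # b is a variable: bind it to a
        return {**theta, b: a}
    return None                             # two distinct constants: failure

def resolve(c1, c2):
    """Clauses are frozensets of (sign, pred, arg) literals; return all resolvents."""
    out = []
    for (s1, p1, a1) in c1:
        for (s2, p2, a2) in c2:
            if p1 == p2 and s1 != s2:       # complementary literals
                theta = unify_term(a1, a2, {})
                if theta is not None:
                    rest = (c1 - {(s1, p1, a1)}) | (c2 - {(s2, p2, a2)})
                    out.append(frozenset((s, p, theta.get(a, a))
                                         for (s, p, a) in rest))
    return out

neg_query = frozenset({("-", "M", "x7"), ("+", "S", "x7")})   # clause 8
clause1 = frozenset({("+", "S", "x1"), ("+", "M", "x1")})      # clause 1
print(resolve(neg_query, clause1))
# the single resolvent is S(x1), matching step 9 of the table
```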
Penjelasan (Explanation)
• Clause 8 and clause 1: ¬M(x7) v S(x7) and S(x1) v M(x1) yield S(x1), where x1 = x7 {resolution: from A ∨ B and ¬B, infer A} …(9)
• Clause 9 and clause 3: S(x1) and ¬S(x3) v L(x3, Snow) yield L(x1, Snow), where x1 = x3 {from A ∨ B and ¬B, infer A} …(10)
• Clause 10 and clause 4: L(x1, Snow) and ¬L(Tony, x4) v ¬L(Ellen, x4), where x1 = Ellen and x4 = Snow, become L(Ellen, Snow) and ¬L(Tony, Snow) v ¬L(Ellen, Snow), which yield ¬L(Tony, Snow) {from A ∨ B and ¬B, infer A} …(11)
• Clause 11 and clause 7: ¬L(Tony, Snow) and L(Tony, Snow) leave no literals at all, i.e. the empty clause (False), so the negated query is refuted and the query is proved.
• Conclusion: the refutation succeeds, but it does not yet name the member; answer extraction recovers it.
Example: Hoofers Club…
• Answer extraction
• Answer to the query: Ellen!
• Here clause 8 carries the query along as an answer literal: ¬M(x7) v S(x7) v (M(x7) ^ ¬S(x7))

Clause 1   Clause 2   Resolvent                                     MGU (i.e., theta)
8          1          9. S(x1) v (M(x1) ^ ¬S(x1))                   {x7/x1}
9          3          10. L(x1, Snow) v (M(x1) ^ ¬S(x1))            {x3/x1}
10         4          11. ¬L(Tony, Snow) v (M(Ellen) ^ ¬S(Ellen))   {x4/Snow, x1/Ellen}
11         7          12. M(Ellen) ^ ¬S(Ellen)                      {}
Penjelasan (Explanation)
• Clause 8 and clause 1: ¬M(x7) v S(x7) v (M(x7) ^ ¬S(x7)) and S(x1) v M(x1) yield S(x1) v (M(x1) ^ ¬S(x1)), where x1 = x7 {from A ∨ B and ¬B, infer A} …(9)
• Clause 9 and clause 3: S(x1) v (M(x1) ^ ¬S(x1)) and ¬S(x3) v L(x3, Snow) yield L(x1, Snow) v (M(x1) ^ ¬S(x1)), where x1 = x3 {from A ∨ B and ¬B, infer A} …(10)
• Clause 10 and clause 4: L(x1, Snow) v (M(x1) ^ ¬S(x1)) and ¬L(Tony, x4) v ¬L(Ellen, x4), where x1 = Ellen and x4 = Snow, become L(Ellen, Snow) v (M(Ellen) ^ ¬S(Ellen)) and ¬L(Tony, Snow) v ¬L(Ellen, Snow), which yield (M(Ellen) ^ ¬S(Ellen)) v ¬L(Tony, Snow) {from A ∨ B and ¬B, infer A} …(11)
• Clause 11 and clause 7: (M(Ellen) ^ ¬S(Ellen)) v ¬L(Tony, Snow) and L(Tony, Snow) yield M(Ellen) ^ ¬S(Ellen) {from A ∨ B and ¬B, infer A}
• Conclusion: Ellen
Latihan Praktikum V (Lab Exercise V)
1. Tom n Jerry
– Problem statement: Tom is a cat; Jerry is a mouse. Tom and Jerry are cartoon characters. All cats and mice are animals. Every cat either likes or hates mice. Every creature only tries to hurt things it does not like.
– Query: Does Tom try to hurt Jerry?
2. Lop-lopan
– Problem statement: Everyone who loves all animals is loved by someone. Anyone who kills an animal is loved by no one. Jack loves all animals. Either Jack or Curiosity killed the cat, who is named Tuna.
– Query: Did Curiosity kill the cat?
• Translate the problem statements above into FOL, convert them to clause form, then carry out a resolution refutation proof.
• Type: group work (3-5 people). Deadline: 14 March 2012.
Tugas Rumah V (Homework V)
• Find or build an implementation that applies first-order logic, then send your analysis to the designated e-mail address.
• Type: group work (3-5 people)
• Deadline: 14 March 2012, 24:00 mail-server time