Dec 18, 2015


UNCERTAINTY (KETIDAKPASTIAN)

Yeni Herdiyeni – http://ilkom.fmipa.ipb.ac.id/yeni

“Uncertainty is defined as the lack of the exact knowledge that would enable us to reach a perfectly reliable conclusion”

Topics

Definition of Uncertainty
Basic Concepts of Probability
Certainty Factors
Bayesian Reasoning
Fuzzy Logic
Neural Networks
Genetic Algorithms

Uncertainty

Will it rain or not?

Expert System in Nursing

For example, a nursing diagnosis may require knowing whether a patient has suffered from a certain allergy.

The patient may not know or remember the answer to this question, so the information needed to ensure a correct diagnosis is unavailable.

This means that decisions will often be based on incomplete or uncertain data. Clearly, this may result in uncertain conclusions.

Ideas

Basic Concepts of Probability

Certainty Factors

Bayesian Reasoning

Fuzzy Logic
Neural Networks
Genetic Algorithms

Applications

Financial markets: stock market
Games: gambling
Weather forecasting
Risk management
etc.

Case Study

PASS : An Expert System with Certainty Factors for Predicting Student Success

Variables: gender, specialisation, grade

Grade:
Moderate: >= 10 and < 12.5
Good: >= 12.5 and < 15.5
Very Good: >= 15.5 and < 18.5
Excellent: >= 18.5 and <= 20
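As an illustrative sketch, the grade bands above can be encoded directly; the function name and the handling of scores below 10 are assumptions, not part of PASS itself.

```python
def grade_category(score):
    """Map a score on the 0-20 scale to the PASS grade bands.

    Hypothetical helper; scores below 10 fall outside the listed bands.
    """
    if 18.5 <= score <= 20:
        return "Excellent"
    if 15.5 <= score < 18.5:
        return "Very Good"
    if 12.5 <= score < 15.5:
        return "Good"
    if 10 <= score < 12.5:
        return "Moderate"
    return None  # outside the bands defined above

print(grade_category(13.0))  # prints Good
```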

Uncertainty

Information is often incomplete, inconsistent, or uncertain.

Sources of uncertainty:
Weak implications
Imprecise language: ambiguous, imprecise terms
Unknown data
Combining the differing views of experts

Sources of Uncertain Knowledge: Imprecise Language

Term                     Mean value         Mean value
                         Ray Simpson (1944) Milton Hakel (1968)
Always                   99                 100
Very often               88                 87
Usually                  85                 79
Often                    78                 74
Generally                78                 74
Frequently               73                 72
Rather often             65                 72
About as often as not    50                 50
Now and then             20                 34
Sometimes                20                 29
Occasionally             20                 28
Once in a while          15                 22
Not often                13                 16
Usually not              10                 16
Seldom                   10                 9
Hardly ever              7                  8
Very seldom              6                  7
Rarely                   5                  5
Almost never             3                  2
Never                    0                  0

Uncertainty

In evidence: 'if pressure gauge is high then liquid is boiling'.
Electrical components are notoriously faulty; the pressure gauge may actually be stuck.

In inferring a conclusion: 'if patient has a sore throat then patient has tonsillitis'.
A doctor may infer such a conclusion, but it would not be an absolute, binary one.

Vagueness of language: 'if the car is a Porsche then it is fast'.
What do we mean when using the term fast?

Representation of Uncertainty

Certainty theory
Probabilistic theory of evidence (Bayes' theorem)
Fuzzy logic
Neural networks
Genetic algorithms

Certainty Theory


A certainty factor (cf) is a number that measures the expert's belief.

The maximum value of a certainty factor is +1.0 (definitely true) and the minimum is -1.0 (definitely false).

Two aspects:
Certainty in the evidence: the evidence can have a certainty factor attached.
Certainty in the rule.

Note: cf values are not probabilities, but informal measures of confidence.

Certainty Factors

Term Certainty Factor

Definitely not -1.0

Almost certainly not -0.8

Probably not -0.6

Maybe not -0.4

Unknown -0.2 to +0.2

Maybe +0.4

Probably +0.6

Almost certainly +0.8

Definitely +1.0

Uncertain terms and their interpretation

Expert Systems with Certainty Theory

In expert systems with certainty factors, the knowledge base consists of a set of rules that have the following syntax:

IF <evidence> THEN <hypothesis> {cf}

where cf represents the belief in hypothesis H given that evidence E has occurred.

The certainty factor assigned by a rule is propagated through the reasoning chain

Now belief in hypothesis H given that evidence E has occurred, is: cf (H,E) = cf (E) * cf(Rule)

For example:
IF sky is clear
THEN the forecast is sunny {cf 0.8}

and the current certainty factor of "sky is clear" is 0.5, so
cf(H,E) = 0.5 * 0.8 = 0.4

Example

The certainty that the forecast is sunny = 0.4
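A minimal sketch of this propagation step in Python (the function name is assumed for illustration):

```python
def cf_propagate(cf_evidence, cf_rule):
    """cf(H, E) = cf(E) * cf(Rule): belief in H given uncertain evidence E."""
    return cf_evidence * cf_rule

# "sky is clear" is known with cf 0.5; the rule carries cf 0.8
print(cf_propagate(0.5, 0.8))  # 0.4
```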

Multiple Antecedents

So if a certainty is attached to the evidence, what happens when we have rules with more than one piece of evidence?

With conjunctions (i.e. AND): use the minimum cf of the evidence.
With disjunctions (i.e. OR): use the maximum cf of the evidence.

Conjunctive Antecedents - Example

Conjunctive rules:

cf(H, E1 AND E2 AND … En) = min{cf(E1), cf(E2), …, cf(En)} * cf(Rule)

IF there are dark clouds E1

AND the wind is stronger E2

THEN it will rain {cf 0.8}

So assume that cf(E1) = 0.5 and cf(E2) = 0.9, then

cf(H, E) = min{0.5, 0.9} * 0.8 = 0.4

Disjunctive Antecedents - Example

Disjunctive rules:

cf(H, E1 OR E2 OR … En) = max{cf(E1), cf(E2), …, cf(En)} * cf(Rule)

IF there are dark clouds E1

OR the wind is stronger E2

THEN it will rain {cf 0.8}

Again assume that cf(E1) = 0.5 and cf(E2) = 0.9, then

cf(H, E) = max{0.5, 0.9} * 0.8 = 0.72
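The two combination rules above can be sketched together; the helper names are assumptions for illustration.

```python
def cf_conjunction(cf_rule, *cfs):
    """cf(H, E1 AND ... AND En) = min{cf(Ei)} * cf(Rule)."""
    return min(cfs) * cf_rule

def cf_disjunction(cf_rule, *cfs):
    """cf(H, E1 OR ... OR En) = max{cf(Ei)} * cf(Rule)."""
    return max(cfs) * cf_rule

# dark clouds cf 0.5, strong wind cf 0.9, rule cf 0.8
print(round(cf_conjunction(0.8, 0.5, 0.9), 2))  # 0.4
print(round(cf_disjunction(0.8, 0.5, 0.9), 2))  # 0.72
```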

What if two pieces of evidence lead to the same conclusion?

Common sense suggests that, if two pieces of evidence from different sources support the same hypothesis then confidence in this hypothesis should increase more than if only one piece of evidence had been obtained.

Similarly Concluded Rules

Rule 1: IF A is X THEN C is Z {cf 0.8}

Rule 2: IF B is Y THEN C is Z {cf 0.6}

Similarly Concluded Rules - Example

Two rules can lead to same conclusion:

IF weatherperson predicts rain (E1)
THEN it will rain {cf1 0.7}

IF farmer predicts rain (E2)
THEN it will rain {cf2 0.9}

Assume cf(E1) = 1.0 and cf(E2) = 1.0.

cf1(H, E1) = cf(E1) * cf(Rule1) = 1.0 * 0.7 = 0.7

cf2(H, E2) = cf(E2) * cf(Rule2) = 1.0 * 0.9 = 0.9

Similarly Concluded Rules

If we obtain supporting evidence for a hypothesis from two sources, then we should feel more confident in the conclusion.

So:

cf_combine(cf1, cf2) = cf1 + cf2 * (1 - cf1)
                     = 0.7 + 0.9 * (1 - 0.7)
                     = 0.97
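A sketch of this combination step, restricted to the both-positive case that the example uses (the full certainty-factor calculus has separate branches for negative cfs, which these slides do not cover):

```python
def cf_combine(cf1, cf2):
    """Combine two positive certainty factors supporting the same hypothesis.

    cf_combine = cf1 + cf2 * (1 - cf1); only the both-positive case is handled.
    """
    assert cf1 >= 0 and cf2 >= 0, "negative cfs need the other combination branches"
    return cf1 + cf2 * (1 - cf1)

print(round(cf_combine(0.7, 0.9), 2))  # 0.97
```

Note that the result never exceeds 1.0 and grows with each additional piece of supporting evidence, which matches the intuition stated above.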

Certainty Theory

So certainty theory allows us to reason with uncertainty in: the evidence, and in inferring the conclusion, i.e. the hypothesis.

But what about vagueness of language: 'if the car is a Porsche then it is fast'?

For this we use Fuzzy Logic

Probability

21st September 2006, Bogdan L. Vrusias © 2006

Basic Probability Theory

The concept of probability has a long history that goes back thousands of years when words like “probably”, “likely”, “maybe”, “perhaps” and “possibly” were introduced into spoken languages. However, the mathematical theory of probability was formulated only in the 17th century.

The probability of an event is the proportion of cases in which the event occurs. Probability can also be defined as a scientific measure of chance.


Basic Probability Theory

Probability can be expressed mathematically as a numerical index ranging from zero (an absolute impossibility) to unity (an absolute certainty).

Most events have a probability index strictly between 0 and 1, which means that each event has at least two possible outcomes: a favourable outcome, or success, and an unfavourable outcome, or failure.

P(success) = (number of successes) / (number of possible outcomes)

P(failure) = (number of failures) / (number of possible outcomes)

Basic Probability Theory

If s is the number of times success can occur, and f is the number of times failure can occur, then

p = P(success) = s / (s + f)

q = P(failure) = f / (s + f)

and p + q = 1.

If we throw a coin, the probability of getting a head will be equal to the probability of getting a tail. In a single throw, s = f = 1, and therefore the probability of getting a head (or a tail) is 0.5.
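The definitions of p and q can be sketched directly (function names assumed):

```python
def p_success(s, f):
    """p = s / (s + f): proportion of outcomes that are successes."""
    return s / (s + f)

def p_failure(s, f):
    """q = f / (s + f): proportion of outcomes that are failures."""
    return f / (s + f)

# single coin throw: s = f = 1
print(p_success(1, 1))                   # 0.5
print(p_success(1, 1) + p_failure(1, 1)) # 1.0, i.e. p + q = 1
```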

The Axioms of Probability

0 <= P(A) <= 1
P(True) = 1, P(False) = 0
P(A or B) = P(A) + P(B) - P(A and B)

Theorems from the Axioms

0 <= P(A) <= 1, P(True) = 1, P(False) = 0
P(A or B) = P(A) + P(B) - P(A and B)

From these we can prove: P(not A) = P(~A) = 1 - P(A)

[Venn diagram: events A and B, with regions P(A or B) and P(A and B)]

Copyright © Andrew W. Moore

Another important theorem

0 <= P(A) <= 1, P(True) = 1, P(False) = 0
P(A or B) = P(A) + P(B) - P(A and B)

From these we can prove: P(A) = P(A ^ B) + P(A ^ ~B)

How?

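One way to see why P(A) = P(A ^ B) + P(A ^ ~B): the worlds where A holds split into those where B also holds and those where it does not. A numeric check on a toy joint distribution (the numbers here are made up purely for illustration):

```python
# toy joint distribution over (A, B); probabilities sum to 1
joint = {
    (True, True): 0.2,    # P(A ^ B)
    (True, False): 0.3,   # P(A ^ ~B)
    (False, True): 0.1,
    (False, False): 0.4,
}

# marginal P(A): sum over all worlds where A is true
p_a = sum(p for (a, _), p in joint.items() if a)
print(p_a == joint[(True, True)] + joint[(True, False)])  # True
```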

Conditional Probability

Let A be an event in the world and B be another event. Suppose that events A and B are not mutually exclusive, but occur conditionally on the occurrence of the other.

The probability that event A will occur if event B occurs is called the conditional probability.

Conditional probability is denoted mathematically as p(A|B), in which the vertical bar represents "given" and the complete probability expression is interpreted as "the conditional probability of event A occurring given that event B has occurred".

p(A|B) = (number of times A and B can occur) / (number of times B can occur)


Conditional Probability

P(A|B) = Fraction of worlds in which B is true that also have A true

H = "have a headache"
F = "coming down with flu"

P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2

"Headaches are rare and flu is rarer, but if you're coming down with flu there's a 50-50 chance you'll have a headache."


Conditional Probability

H = "have a headache"
F = "coming down with flu"

P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2

P(H|F) = fraction of flu-afflicted worlds in which you have a headache
       = (#worlds with flu and headache) / (#worlds with flu)
       = (area of "H and F" region) / (area of "F" region)
       = P(H ^ F) / P(F)


Definition of Conditional Probability

P(A|B) = P(A ^ B) / P(B)

Corollary: The Chain Rule

P(A ^ B) = P(A|B) * P(B)

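The definition and the chain rule can be sketched with the headache/flu numbers used throughout these slides; P(H ^ F) = 1/80 follows from the chain rule, and the function names are assumptions.

```python
def conditional(p_a_and_b, p_b):
    """Definition: P(A|B) = P(A ^ B) / P(B)."""
    return p_a_and_b / p_b

def chain(p_a_given_b, p_b):
    """Chain rule: P(A ^ B) = P(A|B) * P(B)."""
    return p_a_given_b * p_b

p_f = 1 / 40         # P(F)
p_h_given_f = 1 / 2  # P(H|F)

p_h_and_f = chain(p_h_given_f, p_f)  # P(H ^ F) = 1/80
print(p_h_and_f)                     # 0.0125
print(conditional(p_h_and_f, p_f))   # recovers P(H|F) = 0.5
```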

Probabilistic Inference

H = "have a headache"
F = "coming down with flu"

P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2

One day you wake up with a headache. You think: “Drat! 50% of flus are associated with headaches so I must have a 50-50 chance of coming down with flu”

Is this reasoning good?


Probabilistic Inference

H = "have a headache"
F = "coming down with flu"

P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2

P(F ^ H) = …

P(F|H) = …


Conditional Probability

The number of times A and B can occur, or the probability that both A and B will occur, is called the joint probability of A and B. It is represented mathematically as p(A∩B). The number of ways B can occur is the probability of B, p(B), and thus

p(A|B) = p(A∩B) / p(B)

Similarly, the conditional probability of event B occurring given that event A has occurred equals

p(B|A) = p(B∩A) / p(A)


Conditional Probability

Hence

p(A∩B) = p(A|B) × p(B)

and

p(B∩A) = p(B|A) × p(A)

Since p(A∩B) = p(B∩A), substituting the last equation into the equation

p(A|B) = p(A∩B) / p(B)

yields the Bayesian rule:

p(A|B) = p(B|A) × p(A) / p(B)


What we just did…

P(B|A) = P(A ^ B) / P(A) = P(A|B) * P(B) / P(A)

This is Bayes' Rule.

Bayes, Thomas (1763). An essay towards solving a problem in the doctrine of chances. Philosophical Transactions of the Royal Society of London, 53:370-418.

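As a sketch, Bayes' rule applied to the earlier headache/flu numbers answers the inference question posed above (the function name is an assumption):

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# P(F|H) = P(H|F) * P(F) / P(H) = 0.5 * (1/40) / (1/10)
print(bayes(0.5, 1 / 40, 1 / 10))  # 0.125
```

So despite the 50-50 intuition, the chance of having flu given a headache is only 12.5%, because flu is four times rarer than headaches to begin with.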


Papers

Uncertainty Management in Expert System

PASS : An Expert System with Certainty Factors for Predicting Student Success

FuzzyCLIPS

Next Week

Bayesian Inference
Fuzzy Logic
