PROBABILISTIC INFERENCE

Addison Parrott

Dec 15, 2015
Page 1: PROBABILISTIC INFERENCE. AGENDA: Conditional probability; Independence; Intro to Bayesian Networks.

PROBABILISTIC INFERENCE

Page 2:

AGENDA

Conditional probability Independence Intro to Bayesian Networks

Page 3:

REMEMBER: PROBABILITY NOTATION LEAVES VALUES IMPLICIT

P(A∨B) = P(A) + P(B) − P(A∧B) means

P(A=a ∨ B=b) = P(A=a) + P(B=b) − P(A=a ∧ B=b)

for all a ∈ Val(A) and b ∈ Val(B)

A and B are random variables. A=a and B=b are events.

An equation over random variables compactly represents many equations over combinations of events

Page 4:

CONDITIONAL PROBABILITY

P(A,B) = P(A|B) P(B) = P(B|A) P(A)

P(A|B) is the posterior probability of A given knowledge of B

Axiomatic definition: P(A|B) = P(A,B)/P(B)

Page 5:

CONDITIONAL PROBABILITY

P(A|B) is the posterior probability of A given knowledge of B

“For each value of b: given that I know B=b, what do I believe about A?”

If a new piece of information C arrives, the agent’s new belief (if it obeys the rules of probability) should be P(A|B,C)

Page 6:

CONDITIONAL DISTRIBUTIONS

State          P(state)
 C,  T,  P     0.108
 C,  T, ¬P     0.012
 C, ¬T,  P     0.072
 C, ¬T, ¬P     0.008
¬C,  T,  P     0.016
¬C,  T, ¬P     0.064
¬C, ¬T,  P     0.144
¬C, ¬T, ¬P     0.576

P(Cavity|Toothache) = P(Cavity ∧ Toothache) / P(Toothache) =

(0.108+0.012)/(0.108+0.012+0.016+0.064) = 0.6

Interpretation: After observing Toothache, the patient is no longer an “average” one, and the prior probability (0.2) of Cavity is no longer valid

P(Cavity|Toothache) is calculated by keeping the ratios of the probabilities of the 4 states in which Toothache holds unchanged, and normalizing their sum to 1
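As a minimal sketch, the conditioning step above can be reproduced directly from the slide's joint distribution over (Cavity, Toothache, PCatch), stored as a dict keyed by tuples of booleans:

```python
# Joint distribution from the slide; True/False stand for C/¬C, T/¬T, P/¬P.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def p_cavity_given_toothache(joint):
    # P(Cavity | Toothache) = P(Cavity ∧ Toothache) / P(Toothache)
    p_t = sum(p for (c, t, pc), p in joint.items() if t)
    p_c_and_t = sum(p for (c, t, pc), p in joint.items() if c and t)
    return p_c_and_t / p_t

print(p_cavity_given_toothache(joint))  # ≈ 0.6, as computed on the slide
```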

Page 7:

UPDATING THE BELIEF STATE

The patient walks in the dentist's door

Let the dentist D now observe evidence E: Toothache holds with probability 0.8 (e.g., “the patient says so”)

How should D update its belief state?

State          P(state)
 C,  T,  P     0.108
 C,  T, ¬P     0.012
 C, ¬T,  P     0.072
 C, ¬T, ¬P     0.008
¬C,  T,  P     0.016
¬C,  T, ¬P     0.064
¬C, ¬T,  P     0.144
¬C, ¬T, ¬P     0.576

Page 8:

UPDATING THE BELIEF STATE

P(Toothache|E) = 0.8. We want to compute

P(C∧T∧P|E) = P(C∧P|T,E) P(T|E)

Since E is not directly related to the cavity or the probe catch, we consider that C and P are independent of E given T, hence: P(C∧P|T,E) = P(C∧P|T)

P(C∧T∧P|E) = P(C∧P∧T) P(T|E) / P(T)

State          P(state)
 C,  T,  P     0.108
 C,  T, ¬P     0.012
 C, ¬T,  P     0.072
 C, ¬T, ¬P     0.008
¬C,  T,  P     0.016
¬C,  T, ¬P     0.064
¬C, ¬T,  P     0.144
¬C, ¬T, ¬P     0.576

Page 9:

UPDATING THE BELIEF STATE

P(Toothache|E) = 0.8. We want to compute

P(C∧T∧P|E) = P(C∧P|T,E) P(T|E)

Since E is not directly related to the cavity or the probe catch, we consider that C and P are independent of E given T, hence: P(C∧P|T,E) = P(C∧P|T)

P(C∧T∧P|E) = P(C∧P∧T) P(T|E) / P(T)

State          P(state)
 C,  T,  P     0.108
 C,  T, ¬P     0.012
 C, ¬T,  P     0.072
 C, ¬T, ¬P     0.008
¬C,  T,  P     0.016
¬C,  T, ¬P     0.064
¬C, ¬T,  P     0.144
¬C, ¬T, ¬P     0.576

The rows in which Toothache holds should be scaled to sum to 0.8

The rows in which ¬Toothache holds should be scaled to sum to 0.2

Page 10:

UPDATING THE BELIEF STATE

P(Toothache|E) = 0.8. We want to compute

P(C∧T∧P|E) = P(C∧P|T,E) P(T|E)

Since E is not directly related to the cavity or the probe catch, we consider that C and P are independent of E given T, hence: P(C∧P|T,E) = P(C∧P|T)

P(C∧T∧P|E) = P(C∧P∧T) P(T|E) / P(T)

State          P(state)   P(state|E)
 C,  T,  P     0.108      0.432
 C,  T, ¬P     0.012      0.048
 C, ¬T,  P     0.072      0.018
 C, ¬T, ¬P     0.008      0.002
¬C,  T,  P     0.016      0.064
¬C,  T, ¬P     0.064      0.256
¬C, ¬T,  P     0.144      0.036
¬C, ¬T, ¬P     0.576      0.144

The rows in which Toothache holds should be scaled to sum to 0.8

The rows in which ¬Toothache holds should be scaled to sum to 0.2
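A hedged sketch of the update described above: scale the Toothache rows so they sum to P(T|E) = 0.8 and the ¬Toothache rows so they sum to 0.2, leaving the ratios within each group unchanged:

```python
# Joint distribution from the slide, keyed by (Cavity, Toothache, PCatch).
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def update_soft_evidence(joint, p_t_given_e):
    # Rescale each group (T rows, ¬T rows) by its new total probability.
    p_t = sum(p for (c, t, pc), p in joint.items() if t)  # prior P(T) = 0.2
    return {
        (c, t, pc): p * (p_t_given_e / p_t if t else (1 - p_t_given_e) / (1 - p_t))
        for (c, t, pc), p in joint.items()
    }

post = update_soft_evidence(joint, 0.8)
print(post[(True, True, True)])   # 0.108 * 0.8/0.2 ≈ 0.432, matching the table
print(post[(True, False, True)])  # 0.072 * 0.2/0.8 ≈ 0.018
```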

Page 11:

ISSUES

If a state is described by n propositions, then a belief state contains 2^n states (possibly, some have probability 0)

Modeling difficulty: many numbers must be entered in the first place

Computational issue: memory size and time

Page 12:

INDEPENDENCE OF EVENTS

Two events A=a and B=b are independent if P(A=a ∧ B=b) = P(A=a) P(B=b)

hence P(A=a|B=b) = P(A=a). Knowing B=b doesn’t give you any information about whether A=a is true

Page 13:

INDEPENDENCE OF RANDOM VARIABLES

Two random variables A and B are independent if

P(A,B) = P(A) P(B)

hence P(A|B) = P(A). Knowing B doesn’t give you any information about A

[This equality has to hold for all combinations of values that A and B can take on, i.e., all events A=a and B=b are independent]

Page 14:

SIGNIFICANCE OF INDEPENDENCE

If A and B are independent, then P(A,B) = P(A) P(B)

=> The joint distribution over A and B can be defined as a product of the distribution of A and the distribution of B

Rather than storing a big probability table over all combinations of A and B, store two much smaller probability tables!

To compute P(A=a ∧ B=b), just look up P(A=a) and P(B=b) in the individual tables and multiply them together
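In storage terms, a minimal sketch: keep two small tables P(A) and P(B) (the numbers here are made up for illustration) and multiply on demand instead of storing the full joint table:

```python
# Hypothetical marginals for two independent Boolean variables.
p_a = {True: 0.3, False: 0.7}
p_b = {True: 0.6, False: 0.4}

def p_joint(a, b):
    # Valid only under the independence assumption P(A,B) = P(A) P(B).
    return p_a[a] * p_b[b]

print(p_joint(True, False))  # 0.3 * 0.4
```

Two tables of 2 entries each replace a joint table of 4 entries; the saving grows quickly with more variables.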

Page 15:

CONDITIONAL INDEPENDENCE

Two random variables A and B are conditionally independent given C, if

P(A, B|C) = P(A|C) P(B|C)

hence P(A|B,C) = P(A|C). Once you know C, learning B doesn’t give you any information about A

[again, this has to hold for all combinations of values that A,B,C can take on]

Page 16:

SIGNIFICANCE OF CONDITIONAL INDEPENDENCE

Consider Rainy, Thunder, and RoadsSlippery

Ostensibly, thunder doesn’t have anything directly to do with slippery roads…

But they happen together more often when it rains, so they are not independent…

So it is reasonable to believe that Thunder and RoadsSlippery are conditionally independent given Rainy

So if I want to estimate whether or not I will hear thunder, I don’t need to think about the state of the roads, just whether or not it’s raining!

Page 17:

Toothache and PCatch are independent given Cavity, but this relation is hidden in the numbers! [Quiz]

Bayesian networks explicitly represent independence among propositions to reduce the number of probabilities defining a belief state

State          P(state)
 C,  T,  P     0.108
 C,  T, ¬P     0.012
 C, ¬T,  P     0.072
 C, ¬T, ¬P     0.008
¬C,  T,  P     0.016
¬C,  T, ¬P     0.064
¬C, ¬T,  P     0.144
¬C, ¬T, ¬P     0.576

Page 18:

BAYESIAN NETWORK

Notice that Cavity is the “cause” of both Toothache and PCatch, and represent the causality links explicitly

Give the prior probability distribution of Cavity

Give the conditional probability tables of Toothache and PCatch given Cavity

     Cavity
     /    \
Toothache  PCatch

5 probabilities, instead of 7

P(C∧T∧P) = P(T∧P|C) P(C) = P(T|C) P(P|C) P(C)

P(Cavity) = 0.2

          Cavity   ¬Cavity
P(T|C)    0.6      0.1
P(P|C)    0.9      0.2
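The factored form can be checked numerically. One caveat: reading the CPTs off the joint table on the earlier slides gives P(PCatch|¬Cavity) = (0.016+0.144)/0.8 = 0.2 (the printed 0.02 appears to be a transcription slip), and with that value the five stored numbers reproduce all eight joint entries exactly:

```python
# The five stored numbers: P(C), P(T|C), P(T|¬C), P(P|C), P(P|¬C).
# P(P|¬C) = 0.2 is derived from the joint table, not the printed 0.02.
p_c = 0.2
p_t = {True: 0.6, False: 0.1}    # P(Toothache=true | Cavity=c)
p_p = {True: 0.9, False: 0.2}    # P(PCatch=true    | Cavity=c)

def joint(c, t, pc):
    # P(C∧T∧P) = P(T|C) P(P|C) P(C), the network's factorization.
    return ((p_t[c] if t else 1 - p_t[c])
            * (p_p[c] if pc else 1 - p_p[c])
            * (p_c if c else 1 - p_c))

print(joint(True, True, True))     # 0.6 * 0.9 * 0.2 ≈ 0.108, as in the table
print(joint(False, False, False))  # 0.9 * 0.8 * 0.8 ≈ 0.576, as in the table
```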

Page 19:

CONDITIONAL PROBABILITY TABLES

     Cavity
     /    \
Toothache  PCatch

P(C∧T∧P) = P(T∧P|C) P(C) = P(T|C) P(P|C) P(C)

P(Cavity) = 0.2

          Cavity   ¬Cavity
P(T|C)    0.6      0.1
P(¬T|C)   0.4      0.9

          Cavity   ¬Cavity
P(P|C)    0.9      0.2

Columns sum to 1

If X takes n values, just store n − 1 entries

Page 20:

SIGNIFICANCE OF CONDITIONAL INDEPENDENCE

Consider Grade(CS101), Intelligence, and SAT

Ostensibly, the grade in a course doesn’t have a direct relationship with SAT scores

But good students are more likely to get good SAT scores, so they are not independent…

It is reasonable to believe that Grade(CS101) and SAT are conditionally independent given Intelligence

Page 21:

BAYESIAN NETWORK

Explicitly represent independence among propositions

Notice that Intelligence is the “cause” of both Grade and SAT, and the causality is represented explicitly

     Intel.
     /    \
  Grade    SAT

P(I=x)
high   0.3
low    0.7

6 probabilities, instead of 11

P(I,G,S) = P(G,S|I) P(I) = P(G|I) P(S|I) P(I)

P(G=x|I)   I=low   I=high
‘A’        0.2     0.74
‘B’        0.34    0.17
‘C’        0.46    0.09

P(S=x|I)   I=low   I=high
low        0.95    0.2
high       0.05    0.8
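As a sketch, the factorization P(I,G,S) = P(G|I) P(S|I) P(I) with the tables above can be checked for consistency: multiplying the three tables yields a distribution that sums to 1.

```python
# CPTs from the slide.
p_i = {'high': 0.3, 'low': 0.7}
p_g = {'low':  {'A': 0.2,  'B': 0.34, 'C': 0.46},
       'high': {'A': 0.74, 'B': 0.17, 'C': 0.09}}
p_s = {'low':  {'low': 0.95, 'high': 0.05},
       'high': {'low': 0.2,  'high': 0.8}}

def joint(i, g, s):
    # P(I,G,S) = P(G|I) P(S|I) P(I)
    return p_g[i][g] * p_s[i][s] * p_i[i]

total = sum(joint(i, g, s) for i in p_i for g in 'ABC' for s in ('low', 'high'))
print(total)                       # ≈ 1.0
print(joint('high', 'A', 'high'))  # 0.74 * 0.8 * 0.3 ≈ 0.178
```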

Page 22:

SIGNIFICANCE OF BAYESIAN NETWORKS

If we know that variables are conditionally independent, we should be able to decompose the joint distribution to take advantage of it

Bayesian networks are a way of efficiently factoring the joint distribution into conditional probabilities

And also building complex joint distributions from smaller models of probabilistic relationships

But…

What knowledge does the BN encode about the distribution?

How do we use a BN to compute probabilities of variables that we are interested in?

Page 23:

A MORE COMPLEX BN

Burglary     Earthquake      (causes)
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls      (effects)

Directed acyclic graph

Intuitive meaning of arc from x to y:

“x has direct influence on y”

Page 24:

A MORE COMPLEX BN

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

Size of the CPT for a node with k parents: 2^k

10 probabilities, instead of 31
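The parameter counts can be sketched directly: a Boolean node with k parents needs 2^k independent numbers (one per setting of its parents), while the full joint over n Boolean variables needs 2^n − 1.

```python
# Number of parents of each node in the burglary network.
parents = {'Burglary': 0, 'Earthquake': 0, 'Alarm': 2,
           'JohnCalls': 1, 'MaryCalls': 1}

bn_count = sum(2 ** k for k in parents.values())   # 1 + 1 + 4 + 2 + 2
joint_count = 2 ** len(parents) - 1                # full joint over 5 Booleans
print(bn_count, joint_count)  # 10 31
```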

Page 25:

WHAT DOES THE BN ENCODE?

Each of the beliefs JohnCalls and MaryCalls is independent of Burglary and Earthquake given Alarm or ¬Alarm

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

For example, John does not observe any burglaries directly

P(B∧J) ≠ P(B) P(J)
P(B∧J|A) = P(B|A) P(J|A)

Page 26:

WHAT DOES THE BN ENCODE?

The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm

For instance, the reasons why John and Mary may not call if there is an alarm are unrelated

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B∧J|A) = P(B|A) P(J|A)
P(J∧M|A) = P(J|A) P(M|A)

A node is independent of its non-descendants given its parents

Page 27:

WHAT DOES THE BN ENCODE?

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

A node is independent of its non-descendants given its parents

Burglary and Earthquake are independent

The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm

For instance, the reasons why John and Mary may not call if there is an alarm are unrelated

Page 28:

LOCALLY STRUCTURED WORLD

A world is locally structured (or sparse) if each of its components interacts directly with relatively few other components

In a sparse world, the CPTs are small and the BN contains far fewer probabilities than the full joint distribution

If the # of entries in each CPT is bounded by a constant, i.e., O(1), then the # of probabilities in a BN is linear in n (the # of propositions) instead of 2^n for the joint distribution

Page 29:

EQUATIONS INVOLVING RANDOM VARIABLES GIVE RISE TO CAUSALITY RELATIONSHIPS

C = A ∨ B,  C = max(A,B), …

Constrains the joint probability P(A,B,C)

Nicely encoded as a causality relationship:

   A     B
    \   /
     C

Conditional probability given by the equation rather than a CPT

Page 30:

NAÏVE BAYES MODELS

P(Cause, Effect1, …, Effectn) = P(Cause) Πi P(Effecti | Cause)

          Cause
        /   |   \
 Effect1 Effect2 … Effectn

Page 31:

BAYES’ RULE AND OTHER PROBABILITY MANIPULATIONS

P(A,B) = P(A|B) P(B) = P(B|A) P(A)

P(A|B) = P(B|A) P(A) / P(B)

Gives us a way to manipulate distributions, e.g. P(B) = Σa P(B|A=a) P(A=a)

Can derive P(A|B) and P(B) using only P(B|A) and P(A)

Page 32:

NAÏVE BAYES CLASSIFIER

P(Class, Feature1, …, Featuren) = P(Class) Πi P(Featurei | Class)

          Class
        /   |   \
 Feature1 Feature2 … Featuren

P(C|F1,…,Fk) = P(C, F1,…,Fk) / P(F1,…,Fk)
             = 1/Z P(C) Πi P(Fi|C)

Given features, what class?

Spam / Not Spam

English / French / Latin

Word occurrences
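A toy instance of the classifier above. The class prior and per-word probabilities are made-up numbers, and only the observed words are scored (a simplification; the full model would also include factors for absent words):

```python
from math import prod

# Hypothetical model: P(Class) and P(word | Class).
p_class = {'spam': 0.4, 'ham': 0.6}
p_word = {'spam': {'offer': 0.7, 'meeting': 0.1},
          'ham':  {'offer': 0.1, 'meeting': 0.5}}

def classify(words):
    # Unnormalized scores P(Class) * Π_i P(word_i | Class), then divide by Z.
    scores = {c: p_class[c] * prod(p_word[c][w] for w in words) for c in p_class}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

post = classify(['offer'])
print(post['spam'])  # 0.28 / (0.28 + 0.06) ≈ 0.82
```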

Page 33:

BUT DOES A BN REPRESENT A BELIEF STATE?

IN OTHER WORDS, CAN WE COMPUTE THE FULL JOINT DISTRIBUTION OF THE PROPOSITIONS FROM IT?

Page 34:

CALCULATION OF JOINT PROBABILITY

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

P(J∧M∧A∧¬B∧¬E) = ??

Page 35:

P(J∧M∧A∧¬B∧¬E)
= P(J∧M|A,¬B,¬E) P(A∧¬B∧¬E)
= P(J|A,¬B,¬E) P(M|A,¬B,¬E) P(A∧¬B∧¬E)    (J and M are independent given A)

P(J|A,¬B,¬E) = P(J|A)    (J and B, and J and E, are independent given A)

P(M|A,¬B,¬E) = P(M|A)

P(A∧¬B∧¬E) = P(A|¬B,¬E) P(¬B|¬E) P(¬E)
           = P(A|¬B,¬E) P(¬B) P(¬E)    (B and E are independent)

P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

Page 36:

CALCULATION OF JOINT PROBABILITY

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
               = 0.9 × 0.7 × 0.001 × 0.999 × 0.998
               ≈ 0.00062
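The chain of products above can be evaluated numerically as a sketch; ¬B and ¬E enter as the complements 1 − P(B) and 1 − P(E):

```python
# CPT values from the slide.
p_b, p_e = 0.001, 0.002
p_a = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
p_j = {True: 0.90, False: 0.05}   # P(JohnCalls | Alarm)
p_m = {True: 0.70, False: 0.01}   # P(MaryCalls | Alarm)

# P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
p = p_j[True] * p_m[True] * p_a[(False, False)] * (1 - p_b) * (1 - p_e)
print(p)  # ≈ 0.000628, i.e. the slide's rounded 0.00062
```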

Page 37:

CALCULATION OF JOINT PROBABILITY

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
               = 0.9 × 0.7 × 0.001 × 0.999 × 0.998
               ≈ 0.00062

P(x1∧x2∧…∧xn) = Πi=1,…,n P(xi | parents(Xi))    (the full joint distribution table)

Page 38:

CALCULATION OF JOINT PROBABILITY

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

P(x1∧x2∧…∧xn) = Πi=1,…,n P(xi | parents(Xi))    (the full joint distribution table)

P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
               = 0.9 × 0.7 × 0.001 × 0.999 × 0.998
               ≈ 0.00062

Since a BN defines the full joint distribution of a set of propositions, it represents a belief state

Page 39:

PROBABILISTIC INFERENCE

Assume we are given a Bayes net

Usually we aren’t interested in calculating the probability of a setting of all of the variables

For some variables we observe their values directly: observed variables

For others we don’t care: nuisance variables

How can we enforce observed values and ignore nuisance variables, while strictly adhering to the rules of probability? Probabilistic inference problems

Page 40:

PROBABILITY MANIPULATION REVIEW…

Three fundamental operations:

Conditioning

Marginalization

Applying (conditional) independence assumptions

Page 41:

TOP-DOWN INFERENCE

Suppose we want to compute P(Alarm)

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

Page 42:

TOP-DOWN INFERENCE

Suppose we want to compute P(Alarm)

1. P(Alarm) = Σb,e P(A,b,e)
2. P(Alarm) = Σb,e P(A|b,e) P(b) P(e)

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

Page 43:

TOP-DOWN INFERENCE

Suppose we want to compute P(Alarm)

1. P(Alarm) = Σb,e P(A,b,e)
2. P(Alarm) = Σb,e P(A|b,e) P(b) P(e)
3. P(Alarm) = P(A|B,E) P(B) P(E) + P(A|B,¬E) P(B) P(¬E) + P(A|¬B,E) P(¬B) P(E) + P(A|¬B,¬E) P(¬B) P(¬E)

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

Page 44:

TOP-DOWN INFERENCE

Suppose we want to compute P(Alarm)

1. P(A) = Σb,e P(A,b,e)
2. P(A) = Σb,e P(A|b,e) P(b) P(e)
3. P(A) = P(A|B,E) P(B) P(E) + P(A|B,¬E) P(B) P(¬E) + P(A|¬B,E) P(¬B) P(E) + P(A|¬B,¬E) P(¬B) P(¬E)
4. P(A) = 0.95×0.001×0.002 + 0.94×0.001×0.998 + 0.29×0.999×0.002 + 0.001×0.999×0.998 = 0.00252

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01
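The four-term enumeration above can be sketched as a single sum over the settings of B and E:

```python
# P(Alarm) = Σ_{b,e} P(A|b,e) P(b) P(e), with CPT values from the slide.
p_b, p_e = 0.001, 0.002
p_a = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}

p_alarm = sum(p_a[(b, e)]
              * (p_b if b else 1 - p_b)
              * (p_e if e else 1 - p_e)
              for b in (True, False) for e in (True, False))
print(p_alarm)  # ≈ 0.00252
```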

Page 45:

TOP-DOWN INFERENCE

Now, suppose we want to compute P(MaryCalls)

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

Page 46:

TOP-DOWN INFERENCE

Now, suppose we want to compute P(MaryCalls)

1. P(M) = P(M|A) P(A) + P(M|¬A) P(¬A)

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01

Page 47:

TOP-DOWN INFERENCE

Now, suppose we want to compute P(MaryCalls)

1. P(M) = P(M|A) P(A) + P(M|¬A) P(¬A)
2. P(M) = 0.70×0.00252 + 0.01×(1 − 0.00252) = 0.0117

Burglary     Earthquake
        \    /
        Alarm
        /    \
JohnCalls     MaryCalls

P(B) = 0.001        P(E) = 0.002

B  E    P(A|B,E)
T  T    0.95
T  F    0.94
F  T    0.29
F  F    0.001

A   P(J|A)        A   P(M|A)
T   0.90          T   0.70
F   0.05          F   0.01
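The two-term sum above, conditioning MaryCalls on Alarm / ¬Alarm, can be sketched directly (P(A) = 0.00252 is the value computed on the previous slide):

```python
# P(M) = P(M|A) P(A) + P(M|¬A) P(¬A)
p_m = {True: 0.70, False: 0.01}
p_a = 0.00252

p_mary = p_m[True] * p_a + p_m[False] * (1 - p_a)
print(p_mary)  # ≈ 0.0117
```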

Page 48:

QUERYING THE BN

The BN gives P(T|C). What about P(C|T)?

Cavity → Toothache

P(C) = 0.1

C   P(T|C)
T   0.4
F   0.01111

Page 49:

BAYES’ RULE

P(A∧B) = P(A|B) P(B) = P(B|A) P(A)

So… P(A|B) = P(B|A) P(A) / P(B)

A convenient way to manipulate probability equations

Page 50:

APPLYING BAYES’ RULE

Let A be a cause, B be an effect, and let’s say we know P(B|A) and P(A) (conditional probability tables)

What’s P(B)?

Page 51:

APPLYING BAYES’ RULE

Let A be a cause, B be an effect, and let’s say we know P(B|A) and P(A) (conditional probability tables)

What’s P(B)?

P(B) = Σa P(B, A=a)            [marginalization]

P(B, A=a) = P(B|A=a) P(A=a)    [conditional probability]

So, P(B) = Σa P(B|A=a) P(A=a)

Page 52:

APPLYING BAYES’ RULE

Let A be a cause, B be an effect, and let’s say we know P(B|A) and P(A) (conditional probability tables)

What’s P(A|B)?

Page 53:

APPLYING BAYES’ RULE

Let A be a cause, B be an effect, and let’s say we know P(B|A) and P(A) (conditional probability tables)

What’s P(A|B)?

P(A|B) = P(B|A) P(A) / P(B)         [Bayes’ rule]

P(B) = Σa P(B|A=a) P(A=a)           [last slide]

So, P(A|B) = P(B|A) P(A) / [Σa P(B|A=a) P(A=a)]

Page 54:

HOW DO WE READ THIS?

P(A|B) = P(B|A) P(A) / [Σa P(B|A=a) P(A=a)]

[An equation that holds for all values A can take on, and all values B can take on]

P(A=a|B=b) = ?

Page 55:

HOW DO WE READ THIS?

P(A|B) = P(B|A) P(A) / [Σa P(B|A=a) P(A=a)]

[An equation that holds for all values A can take on, and all values B can take on]

P(A=a|B=b) = P(B=b|A=a) P(A=a) / [Σa P(B=b|A=a) P(A=a)]

Are these the same a?

Page 56:

HOW DO WE READ THIS?

P(A|B) = P(B|A) P(A) / [Σa P(B|A=a) P(A=a)]

[An equation that holds for all values A can take on, and all values B can take on]

P(A=a|B=b) = P(B=b|A=a) P(A=a) / [Σa P(B=b|A=a) P(A=a)]

Are these the same a?

NO!

Page 57:

HOW DO WE READ THIS?

P(A|B) = P(B|A) P(A) / [Σa P(B|A=a) P(A=a)]

[An equation that holds for all values A can take on, and all values B can take on]

P(A=a|B=b) = P(B=b|A=a) P(A=a) / [Σa′ P(B=b|A=a′) P(A=a′)]

Be careful about indices!

Page 58:

QUERYING THE BN

The BN gives P(T|C). What about P(C|T)?

P(Cavity|T=t) = P(Cavity ∧ T=t) / P(T=t)
             = P(T=t|Cavity) P(Cavity) / P(T=t)    [Bayes’ rule]

Querying a BN is just applying Bayes’ rule on a larger scale… algorithms next time

Cavity → Toothache

P(C) = 0.1

C   P(T|C)
T   0.4
F   0.01111
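As a sketch, the Bayes'-rule query above can be evaluated with the CPT values on this slide (P(C) = 0.1, P(T|C) = 0.4, P(T|¬C) = 0.01111):

```python
# Prior and CPT from the slide.
p_c = 0.1
p_t = {True: 0.4, False: 0.01111}

p_toothache = p_t[True] * p_c + p_t[False] * (1 - p_c)  # marginalization
p_c_given_t = p_t[True] * p_c / p_toothache             # Bayes' rule
print(p_c_given_t)  # ≈ 0.8
```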

Page 59:

MORE COMPLICATED SINGLY-CONNECTED BELIEF NET

[Figure: a singly-connected belief net over Battery, Radio, SparkPlugs, Gas, Starts, and Moves]

Page 60:

SOME APPLICATIONS OF BN

Medical diagnosis

Troubleshooting of hardware/software systems

Fraud/uncollectible debt detection

Data mining

Analysis of genetic sequences

Data interpretation, computer vision, image understanding

Page 61:

Region = {Sky, Tree, Grass, Rock}

[Figure: an image segmented into regions R1–R4, with “Above” relations between adjacent regions]

Page 62:

BN to evaluate insurance risks

Page 63:

PURPOSES OF BAYESIAN NETWORKS

Efficient and intuitive modeling of complex causal interactions

Compact representation of joint distributions: O(n) rather than O(2^n)

Algorithms for efficient inference with given evidence (more on this next time)

Page 64:

HOMEWORK

Read R&N 14.1-3