Probability Theory and Mathematical Statistics
Lecture 05: Probability Distributions and Probability Densities II: Multivariate Probabilities
Chih-Yuan Hung
School of Economics and Management, Dongguan University of Technology
March 27, 2019
Two caplets are selected at random from a bottle containing 3 aspirin, 2 sedative, and 4 laxative caplets. If X and Y are, respectively, the numbers of aspirin and sedative caplets included among the 2 caplets drawn from the bottle, find the probabilities associated with all possible pairs of values of X and Y.
It is preferable to express the probabilities by means of a function with the values f(x, y) = P(X = x, Y = y) for any pair of values (x, y) within the range of X and Y. For the example above, counting the ways to draw x aspirin, y sedative, and 2 − x − y laxative caplets gives

f(x, y) = C(3, x) C(2, y) C(4, 2 − x − y) / C(9, 2)

for x = 0, 1, 2; y = 0, 1, 2; and 0 ≤ x + y ≤ 2, where C(n, k) denotes a binomial coefficient.
Definition (Joint Probability Distribution)

If X and Y are discrete random variables, the function given by f(x, y) = P(X = x, Y = y) for each pair of values (x, y) within the range of X and Y is called the joint probability distribution of X and Y.
A bivariate function can serve as the joint probability distribution of a pair of discrete random variables X and Y if and only if its values, f(x, y), satisfy the conditions

1. f(x, y) ≥ 0 for each pair of values (x, y) within its domain;
2. Σ_x Σ_y f(x, y) = 1, where the double summation extends over all pairs of values (x, y) within its domain.
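As a quick numerical sketch of the caplet example and the two conditions above (assuming draws without replacement, so the counting-based formula applies):

```python
from math import comb

# Joint distribution of the caplet example: X aspirin and Y sedative caplets
# among 2 drawn without replacement from 3 aspirin, 2 sedative, 4 laxative.
def f(x, y):
    """f(x, y) = C(3, x) * C(2, y) * C(4, 2 - x - y) / C(9, 2)."""
    if x < 0 or y < 0 or x + y > 2:
        return 0.0
    return comb(3, x) * comb(2, y) * comb(4, 2 - x - y) / comb(9, 2)

table = {(x, y): f(x, y) for x in range(3) for y in range(3) if f(x, y) > 0}

# Condition 1: every value is nonnegative; Condition 2: the values sum to 1.
assert all(p >= 0 for p in table.values())
assert abs(sum(table.values()) - 1.0) < 1e-12
```

For instance, `f(0, 0)` gives C(4, 2)/C(9, 2) = 6/36 = 1/6, the probability that both caplets drawn are laxatives.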
Often we want to know the probability that the values of two random variables are less than or equal to some real numbers x and y. This leads to the following definition.
Definition (Joint Distribution Function)
If X and Y are discrete random variables, the function given by

F(x, y) = P(X ≤ x, Y ≤ y) = Σ_{s ≤ x} Σ_{t ≤ y} f(s, t)  for −∞ < x < ∞, −∞ < y < ∞,

where f(s, t) is the value of the joint probability distribution of X and Y at (s, t), is called the joint distribution function, or the joint cumulative distribution, of X and Y.
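The double summation in the definition can be sketched directly, reusing the caplet example's joint distribution (a hypergeometric count, as an assumed illustration):

```python
from math import comb

def f(s, t):
    # joint probability distribution of the caplet example
    if s < 0 or t < 0 or s + t > 2:
        return 0.0
    return comb(3, s) * comb(2, t) * comb(4, 2 - s - t) / comb(9, 2)

def F(x, y):
    """Joint distribution function: sum of f(s, t) over s <= x and t <= y."""
    return sum(f(s, t) for s in range(int(x) + 1) for t in range(int(y) + 1))

# F over the whole range of X and Y equals 1
assert abs(F(2, 2) - 1.0) < 1e-12
```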
If F(x, y) is the value of the joint distribution function of two discrete random variables X and Y at (x, y), show that if a < b and c < d, then F(a, c) ≤ F(b, d).
A bivariate function with values f(x, y) defined over the xy-plane is called a joint probability density function of the continuous random variables X and Y if and only if

P((X, Y) ∈ A) = ∫∫_A f(x, y) dx dy

for any region A in the xy-plane.
A bivariate function can serve as a joint probability density function of a pair of continuous random variables X and Y if its values, f(x, y), satisfy the conditions

1. f(x, y) ≥ 0 for −∞ < x < ∞, −∞ < y < ∞;
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
In Example 12 we derived the joint probability distribution of two random variables X and Y, the number of aspirin caplets and the number of sedative caplets included among two caplets drawn at random from a bottle containing three aspirin, two sedative, and four laxative caplets. Find the probability distribution of X alone and that of Y alone.
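The marginal distributions asked for here are obtained by summing the joint distribution over the other variable; a minimal sketch using the caplet example's joint distribution:

```python
from math import comb

def f(x, y):
    # joint probability distribution of the caplet example
    if x < 0 or y < 0 or x + y > 2:
        return 0.0
    return comb(3, x) * comb(2, y) * comb(4, 2 - x - y) / comb(9, 2)

# Marginal distributions: sum the joint distribution over the other variable.
g = {x: sum(f(x, y) for y in range(3)) for x in range(3)}  # g(x), X alone
h = {y: sum(f(x, y) for x in range(3)) for y in range(3)}  # h(y), Y alone

# each marginal is itself a probability distribution
assert abs(sum(g.values()) - 1.0) < 1e-12
assert abs(sum(h.values()) - 1.0) < 1e-12
```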
Definition (Conditional Distribution)

If f(x, y) is the value of the joint probability distribution of the discrete random variables X and Y at (x, y) and h(y) is the value of the marginal distribution of Y at y, the function given by

f(x|y) = f(x, y) / h(y),   h(y) ≠ 0,

for each x within the range of X is called the conditional distribution of X given Y = y. Correspondingly, if g(x) is the value of the marginal distribution of X at x, the function given by

w(y|x) = f(x, y) / g(x),   g(x) ≠ 0,

for each y within the range of Y is called the conditional distribution of Y given X = x.
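Applied to the caplet example, the definition can be sketched as follows (the helper names `h` and `f_given` are illustrative, not from the source):

```python
from math import comb

def f(x, y):
    # joint probability distribution of the caplet example
    if x < 0 or y < 0 or x + y > 2:
        return 0.0
    return comb(3, x) * comb(2, y) * comb(4, 2 - x - y) / comb(9, 2)

def h(y):
    # marginal distribution of Y
    return sum(f(x, y) for x in range(3))

def f_given(x, y):
    """Conditional distribution f(x|y) = f(x, y) / h(y), for h(y) != 0."""
    return f(x, y) / h(y)

# the conditional probabilities of X given Y = 1 sum to 1
assert abs(sum(f_given(x, 1) for x in range(3)) - 1.0) < 1e-12
```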
Definition (Conditional Density)

If f(x, y) is the value of the joint density of the continuous random variables X and Y at (x, y) and h(y) is the value of the marginal density of Y at y, the function given by

f(x|y) = f(x, y) / h(y),   h(y) ≠ 0,

for −∞ < x < ∞, is called the conditional density of X given Y = y. Correspondingly, if g(x) is the value of the marginal density of X at x, the function given by

w(y|x) = f(x, y) / g(x),   g(x) ≠ 0,

for −∞ < y < ∞, is called the conditional density of Y given X = x.
Then, substituting into the formula for a conditional density, we get

f(x|y) = f(x, y) / h(y) = 4xy / (2y) = 2x

for 0 < x < 1, and f(x|y) = 0 elsewhere.
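A numerical sketch of this computation, assuming (as the fraction above suggests) a joint density f(x, y) = 4xy on the unit square with marginal h(y) = 2y:

```python
# Midpoint-rule integration of the assumed joint density f(x, y) = 4xy
# over 0 < x < 1 recovers the marginal h(y) = 2y.
def marginal_h(y, n=1000):
    dx = 1.0 / n
    return sum(4 * ((i + 0.5) * dx) * y * dx for i in range(n))

def f_given(x, y):
    # conditional density f(x|y) = f(x, y) / h(y) = 4xy / (2y) = 2x
    return 4 * x * y / marginal_h(y)

assert abs(marginal_h(0.25) - 0.5) < 1e-9   # h(0.25) = 2 * 0.25
assert abs(f_given(0.3, 0.7) - 0.6) < 1e-9  # f(x|y) = 2x, independent of y
```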
When we are dealing with two or more random variables, questions of independence are usually of great importance. Here, f(x|y) = 2x does not depend on the given value of Y = y, but this is clearly not the case in Example 24, where

f(x|y) = (2x + 4y) / (1 + 4y).
Thus we have the analogous definition of independence for random variables and their probability distributions.
If f(x1, x2, ..., xn) is the value of the joint probability distribution of the discrete random variables X1, X2, ..., Xn at (x1, x2, ..., xn) and fi(xi) is the value of the marginal distribution of Xi at xi for i = 1, 2, ..., n, then the n random variables are independent if and only if

f(x1, x2, ..., xn) = f1(x1) · f2(x2) · · · fn(xn)

for all (x1, x2, ..., xn) within their range.
Considering n independent flips of a balanced coin, let Xi be the number of heads (0 or 1) obtained in the ith flip for i = 1, 2, ..., n. Find the joint probability distribution of these n random variables.
Solution
Since each Xi takes the values 0 and 1 with

fi(xi) = 1/2 for xi = 0, 1,

and the n random variables are independent, the joint probability distribution is given by

f(x1, x2, ..., xn) = f1(x1) · f2(x2) · · · fn(xn) = (1/2)^n

for xi = 0 or 1 and i = 1, 2, ..., n.
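The product form of the joint distribution can be checked directly; a short sketch for fair-coin flips (the choice n = 4 is illustrative):

```python
from itertools import product

# Each flip has f_i(x_i) = 1/2 for x_i = 0, 1; by independence the joint
# distribution is the product f(x_1, ..., x_n) = (1/2)^n.
def joint(xs):
    if any(x not in (0, 1) for x in xs):
        return 0.0
    return 0.5 ** len(xs)

n = 4
# the (1/2)^n probabilities over all 2^n outcomes sum to 1
assert abs(sum(joint(xs) for xs in product((0, 1), repeat=n)) - 1.0) < 1e-12
```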