Computational statistics, lecture 2
Jan 12, 2016

Basic simulation methodology
Simulating multivariate distributions
Simulating random sequences
Importance sampling
Antithetic sampling
Quasi random numbers
Methods for simulating multivariate distributions
i. Transforming pseudo random numbers (PRNs) drawn from a multivariate distribution that is easy to simulate
ii. Using factorization of the multivariate density into univariate density functions
iii. Using envelope-rejection techniques
Illustrations of independent and dependent normal distributions
http://stat.sm.u-tokai.ac.jp/~yama/graphics/bnormE.html
Theory of transforming a normal distribution to another normal distribution
i. Let X be a random vector having a zero mean m-dimensional normal distribution with covariance matrix C
ii. Let Y = B X where B is an arbitrary k x m matrix
iii. Then Y has a k-dimensional zero mean normal distribution with covariance matrix B C B^T
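The covariance identity in (iii) can be checked numerically. A minimal NumPy sketch (the matrices C and B below are illustrative values, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariance of X and an arbitrary transformation matrix B (illustrative values)
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])

# Draw zero mean normal vectors X with covariance C, then transform: Y = B X
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=C, size=200_000)
Y = X @ B.T

# The sample covariance of Y should be close to B C B^T
print(np.cov(Y.T))
print(B @ C @ B.T)
```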
Generating a bivariate normal distribution with a given covariance matrix: method 1
i. Let Y be a random vector having a bivariate normal distribution with covariance matrix C
ii. Then, C can be decomposed into a product C = B B^T
iii. Furthermore, the random vector B X, where X has a standard bivariate normal distribution, has a bivariate normal distribution with covariance matrix C

Example:

C = [ 1  1 ]     B = [ 1  0 ]
    [ 1  2 ]         [ 1  1 ]
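Method 1 is exactly what a Cholesky factorization provides. A minimal NumPy sketch using the example matrices above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Target covariance matrix from the example
C = np.array([[1.0, 1.0],
              [1.0, 2.0]])

# Decompose C = B B^T; np.linalg.cholesky returns the lower-triangular factor
B = np.linalg.cholesky(C)   # here B = [[1, 0], [1, 1]]

# Standard bivariate normal draws X, transformed to Y = B X with covariance C
X = rng.standard_normal(size=(200_000, 2))
Y = X @ B.T

print(B)
print(np.cov(Y.T))
```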
Generating a bivariate normal distribution with a given covariance matrix: method 2

i. Let Y = (Y1, Y2) have a zero mean bivariate normal distribution with density f_(Y1,Y2)(y1, y2)
ii. Decompose the probability density into f_(Y1,Y2)(y1, y2) = f_Y1(y1) f_(Y2|Y1)(y2 | y1)
iii. Note that the conditional distribution is normal with

E(Y2 | Y1) = (σ12 / σ1²) Y1
Var(Y2 | Y1) = σ2² (1 − ρ²) = σ2² − σ12² / σ1²

Example:

C = [ 1  1 ]
    [ 1  2 ]

so E(Y2 | Y1) = Y1 and Var(Y2 | Y1) = 1.
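The factorization in (ii)-(iii) turns into a two-step sampler: draw Y1 from its marginal, then Y2 from the conditional. A minimal NumPy sketch for the example covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# Entries of C from the example: Var(Y1) = 1, Cov(Y1, Y2) = 1, Var(Y2) = 2
s11, s12, s22 = 1.0, 1.0, 2.0
n = 200_000

# Step 1: draw Y1 from its marginal N(0, s11)
y1 = rng.normal(0.0, np.sqrt(s11), size=n)

# Step 2: draw Y2 from the conditional N((s12/s11) * y1, s22 - s12**2/s11)
cond_mean = (s12 / s11) * y1
cond_var = s22 - s12**2 / s11          # here: 2 - 1 = 1
y2 = rng.normal(cond_mean, np.sqrt(cond_var))

print(np.cov(y1, y2))   # should be close to [[1, 1], [1, 2]]
```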
Random number generation: method 3 - the envelope-rejection method

Generate x from a probability density g(x) such that c g(x) ≥ f(x) for all x, where c is a constant
Draw u from a uniform distribution on (0,1)
Accept x if u < f(x)/(c g(x))
Justification:
Let X denote a random number from the probability density g. Then

P(X ≤ t and {X is accepted}) = E[ 1{X ≤ t} h(X) ],  where h(t) = f(t) / (c g(t))

Since h(t) g(t) = f(t)/c, this probability equals (1/c) ∫_(−∞)^t f(u) du, and the overall acceptance probability is 1/c, so the accepted values have density f.

How can we generate normally distributed random numbers?

[Figure: target density f(x) and envelope c g(x) plotted over −6 ≤ x ≤ 6]
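As one possible answer to the question above, the envelope-rejection method can generate standard normal variates from a Laplace (double-exponential) envelope; c = sqrt(2e/π) ≈ 1.32 is the smallest constant with c g(x) ≥ f(x). A minimal NumPy sketch (the envelope choice is an assumption, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(3)

# Target f: standard normal density; envelope g: Laplace(0, 1) density
def f(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def g(x):
    return 0.5 * np.exp(-np.abs(x))

# Smallest valid constant: c = max f/g = sqrt(2e/pi), attained at |x| = 1
c = np.sqrt(2 * np.e / np.pi)

def draw_normal(n):
    out = []
    while len(out) < n:
        # Laplace proposals are themselves easy to simulate
        x = rng.laplace(0.0, 1.0, size=n)
        u = rng.uniform(size=n)
        out.extend(x[u < f(x) / (c * g(x))])
    return np.array(out[:n])

sample = draw_normal(100_000)
print(sample.mean(), sample.var())
```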
Simulation of random sequences

Example 1: Random walk
Y_n = Y_(n−1) + ε_n

Example 2: Autoregressive process
Y_n = α Y_(n−1) + ε_n, where −1 < α < 1

Note: A burn-in period is needed before the autoregressive process reaches its stationary distribution

[Figure: simulated random walk and autoregressive paths, n = 1, ..., 109]
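Both example sequences come from the same recursion. A minimal NumPy sketch (the value α = 0.8 and the burn-in length are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_ar1(alpha, n, burn_in=500):
    """Simulate Y_n = alpha * Y_(n-1) + eps_n; alpha = 1 gives a random walk.

    The first `burn_in` values are discarded so that, for |alpha| < 1,
    the retained values come from the stationary distribution.
    """
    eps = rng.standard_normal(n + burn_in)
    y = np.zeros(n + burn_in)
    for i in range(1, n + burn_in):
        y[i] = alpha * y[i - 1] + eps[i]
    return y[burn_in:]

walk = simulate_ar1(1.0, 110, burn_in=0)   # Example 1: random walk
ar = simulate_ar1(0.8, 100_000)            # Example 2: stationary AR(1)

# Stationary variance of the AR(1) process is 1 / (1 - alpha^2)
print(ar.var())
```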
Simulating rare events by shifting the probability mass to the event region
Assume that we would like to estimate p_t = P(X > t), where X is a random variable with density f(x)
Let f* be an alternative probability density
Then

p_t = P(X > t) = E[ 1{X > t} ] = ∫ 1{x > t} (f(x)/f*(x)) f*(x) dx = E*[ 1{X > t} w(X) ],  where w(x) = f(x)/f*(x)

and we can estimate p_t by computing

p̂_t = (1/K) Σ_(i=1)^K 1{X_i > t} w(X_i)

where X_i has density f*
Simulating rare events by scaling or translation
Scaling:

f*(x) = (1/a) f(x/a),  with weight w(x) = f(x)/f*(x) = a f(x) / f(x/a)

Translation:

f*(x) = f(x − c),  c > 0
Simulating rare events by scaling: a simple example
Assume that we would like to estimate p = P(X > 4) where X has a standard normal distribution.
Let f* be the probability density of 10X
Then

w(x) = f(x)/f*(x) = f(x) / ( (1/10) f(x/10) ) = 10 exp( x²/200 − x²/2 )

and we can estimate p by computing

p̂ = (1/K) Σ_(i=1)^K 1{X_i > 4} w(X_i)

where X_i is normal with mean zero and standard deviation 10
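A minimal NumPy sketch of this estimate (the sample size K is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(5)

t, K = 4.0, 200_000

# Sample from f*: normal with mean 0 and standard deviation 10
x = rng.normal(0.0, 10.0, size=K)

# Weight w(x) = f(x)/f*(x) = 10 * exp(x^2/200 - x^2/2)
w = 10.0 * np.exp(x**2 / 200 - x**2 / 2)

# Importance sampling estimate of p = P(X > 4) for X standard normal
p_hat = np.mean((x > t) * w)
print(p_hat)   # true value is 1 - Phi(4), roughly 3.2e-5
```

Note that a plain Monte Carlo estimate with the same K would typically see only a handful of exceedances of t = 4, whereas under f* the event {X > 4} is common and the weights correct for the change of density.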
Antithetic sampling
Use the same sequence of underlying random variates to generate a second sample in such a way that the estimate of the quantity of interest from the second sample will be negatively correlated with the estimate from the original sample.
Antithetic sampling – a simple example
Use Monte-Carlo simulation to estimate the integral

∫₀¹ x (1 − x)² dx

How can we apply the principle of antithetic sampling?
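One standard answer: evaluate the integrand at both U and 1 − U, which are identically distributed but negatively correlated, and average the two evaluations. A minimal NumPy sketch (assuming the integrand x(1 − x)², whose exact integral is 1/12):

```python
import numpy as np

rng = np.random.default_rng(6)

K = 100_000
g = lambda u: u * (1 - u) ** 2          # integrand; exact integral is 1/12

u = rng.uniform(size=K)

# Plain Monte Carlo evaluations
plain = g(u)

# Antithetic pairs: here g(u) + g(1 - u) = u(1 - u), so the two
# evaluations in each pair are strongly negatively correlated
antithetic = 0.5 * (g(u) + g(1 - u))

print(plain.mean(), antithetic.mean())
print(plain.var(), antithetic.var())    # antithetic variance is smaller
```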
Quasi random numbers
(low-discrepancy sequences)
Quasi-random numbers give up the serial independence of successively generated values in order to cover the domain as uniformly as possible
This avoids clusters and voids in the pattern of a finite set of selected points
http://www.puc-rio.br/marco.ind/quasi_mc.html
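One classical low-discrepancy construction is the van der Corput sequence, the base-b radical inverse. A minimal Python sketch:

```python
def van_der_corput(n, base=2):
    """n-th element of the base-b van der Corput low-discrepancy sequence.

    The integer n is written in base b and its digits are mirrored
    around the radix point, filling (0, 1) ever more uniformly.
    """
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, digit = divmod(n, base)
        q += digit * bk
        bk /= base
    return q

# First elements in base 2: 1/2, 1/4, 3/4, 1/8, 5/8, ...
print([van_der_corput(n) for n in range(1, 6)])
```

Successive elements always land in the largest gap left by the previous ones, which is exactly the cluster-and-void avoidance described above.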
Pseudo and quasi random numbers
[Figure: point patterns of pseudo random numbers (left) vs. quasi random numbers (right)]