BAYESIAN INFERENCE Sampling techniques Andreas Steingötter
Feb 22, 2016
Motivation & Background
Exact inference is intractable, so we have to resort to some form of approximation.
Variational Bayes is a deterministic approximation and is not exact even in principle.
Alternative approximation: perform inference by numerical sampling, also known as Monte Carlo techniques.
Classical Monte Carlo approximation
Most quantities of interest are expectations of some function f under a distribution p(z). Given L independent samples z^(l) ~ p(z), the expectation E[f] = ∫ f(z) p(z) dz is approximated by the sample mean (1/L) Σ_{l=1}^{L} f(z^(l)).
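As a minimal sketch of this estimator (in Python; the choice of p as a standard normal and f(z) = z² is my own illustration, not from the slides), where the true value E[z²] = 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target density p(z): standard normal. Function of interest f(z) = z**2,
# whose true expectation under p is the variance, i.e. 1.0.
L = 100_000
z = rng.standard_normal(L)   # independent samples z^(l) ~ p(z)
estimate = np.mean(z ** 2)   # (1/L) * sum over l of f(z^(l))
```

The estimator is unbiased, and its standard error shrinks as 1/sqrt(L) regardless of the dimensionality of z.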
How to do sampling?
- Basic sampling algorithms: restricted mainly to 1-/2-dimensional problems
- Markov chain Monte Carlo: a very general and powerful framework
Basic sampling
Random sampling
Computers can generate only pseudorandom numbers. Typical shortcomings:
- correlation of successive values
- lack of uniformity of the distribution
- poor dimensional distribution of the output sequence
- distances between occurrences of certain values are distributed differently than in a truly random sequence
Random sampling from the uniform distribution
Assumption: a good pseudo-random generator for uniformly distributed data is implemented.
Alternative: http://www.random.org provides true random numbers, with randomness coming from atmospheric noise.
Random sampling from a standard non-uniform distribution
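The standard route is the transformation (inverse-CDF) method: if u ~ U(0, 1) and F is the target CDF, then z = F⁻¹(u) is distributed according to F. A sketch for the exponential distribution, where F⁻¹(u) = -ln(1 - u)/λ (the specific distribution here is my own example):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_exponential(lam, size):
    """Inverse-CDF (transformation) sampling for Exp(lam).

    F(z) = 1 - exp(-lam * z)  =>  F^{-1}(u) = -ln(1 - u) / lam
    """
    u = rng.uniform(size=size)      # u ~ U(0, 1)
    return -np.log1p(-u) / lam      # z = F^{-1}(u)

z = sample_exponential(lam=2.0, size=100_000)
# Mean of Exp(lam) is 1/lam
```

This works whenever F⁻¹ is available in closed form; when it is not, one of the methods below is needed.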
Rejection sampling
Draw a candidate z from a proposal q(z) scaled so that k q(z) ≥ p̃(z) for all z, and accept it with probability p̃(z) / (k q(z)); accepted values are exact samples from p(z).
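A small sketch of the accept/reject loop (the target, a Beta(2,2) density p̃(z) = z(1-z) on [0,1] with uniform proposal and envelope constant k = 0.25, is my own choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

def rejection_sample(n):
    """Rejection sampling for p(z) proportional to z(1-z) on [0,1].

    Proposal q(z) = U(0,1); envelope constant k = 0.25 = max p_tilde,
    so that k * q(z) >= p_tilde(z) everywhere.
    """
    k = 0.25
    samples = []
    while len(samples) < n:
        z = rng.uniform()           # candidate from the proposal q
        u = rng.uniform(0.0, k)     # uniform height under the envelope
        if u <= z * (1.0 - z):      # accept with prob p_tilde(z) / (k q(z))
            samples.append(z)
    return np.array(samples)

z = rejection_sample(50_000)
# Beta(2,2) has mean 1/2 and variance 1/20
```

The acceptance rate is the ratio of the area under p̃ to the area under k q, so a loose envelope wastes most draws; this is what motivates the adaptive variant below.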
Adaptive rejection sampling
[Figure: envelope construction for ln p(z), with envelope lines annotated by slope and offset k]
Importance sampling
Importance sampling approximates expectations directly (it does not produce samples from p): draw z^(l) from a proposal q(z) and weight each sample by the importance weight r_l = p(z^(l)) / q(z^(l)), so that E[f] ≈ (1/L) Σ_l r_l f(z^(l)).
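A sketch of the weighted estimator (target N(0,1), proposal N(0, 2²), and f(z) = z² are my own choices; the true value is E_p[z²] = 1):

```python
import numpy as np

rng = np.random.default_rng(3)

def normal_pdf(z, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

L = 200_000
z = rng.normal(0.0, 2.0, size=L)                         # z^(l) ~ q = N(0, 2^2)
r = normal_pdf(z, 0.0, 1.0) / normal_pdf(z, 0.0, 2.0)    # weights r_l = p/q
estimate = np.mean(r * z ** 2)                           # E_p[z^2]
```

The estimator is well behaved here because q has heavier tails than p; a proposal narrower than the target would make the weights, and hence the variance, blow up.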
Markov Chain Monte Carlo (MCMC) sampling
Idea: construct a Markov chain whose invariant distribution is the target distribution p(z), and use the (dependent) states visited by the chain as samples.
MCMC - Metropolis algorithm
At each step, propose z* from a symmetric proposal q(z* | z^(τ)) and accept it with probability A(z*, z^(τ)) = min(1, p̃(z*) / p̃(z^(τ))). If accepted, z^(τ+1) = z*; otherwise the current state is retained, z^(τ+1) = z^(τ).
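A random-walk Metropolis sketch in Python (the slides' examples use R; the correlated 2D Gaussian target echoes the elliptical-distribution example below, but the specific covariance, step size, and chain length are my own choices):

```python
import numpy as np

rng = np.random.default_rng(4)

def metropolis(log_p_tilde, z0, step, n):
    """Random-walk Metropolis with a symmetric Gaussian proposal."""
    z = np.asarray(z0, dtype=float)
    log_p = log_p_tilde(z)
    chain = np.empty((n, z.size))
    for t in range(n):
        z_star = z + step * rng.standard_normal(z.size)  # symmetric proposal
        log_p_star = log_p_tilde(z_star)
        # accept with probability min(1, p_tilde(z*) / p_tilde(z))
        if np.log(rng.uniform()) < log_p_star - log_p:
            z, log_p = z_star, log_p_star
        chain[t] = z
    return chain

# Unnormalized log-density of a correlated ("elliptical") 2D Gaussian
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
prec = np.linalg.inv(cov)
log_p = lambda z: -0.5 * z @ prec @ z

chain = metropolis(log_p, z0=[2.0, -2.0], step=0.5, n=20_000)
```

Note that only the unnormalized density p̃ is needed, since the normalization constant cancels in the acceptance ratio.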
Examples: Metropolis algorithm
Implementation in R; target: an elliptical distribution.
[Figures: chains initialized in [-2, 2], shown after n = 1500 and n = 15000 iterations, for step sizes 0.3, 0.5 and 1]

Validation of MCMC
Consider a homogeneous Markov chain z^(1), z^(2), ..., z^(m), z^(m+1), ... (homogeneous: the transition probabilities are the same at every step).
Invariant (stationary) distribution: p*(z) is invariant with respect to the chain if each transition leaves it unchanged, p*(z') = Σ_z T(z, z') p*(z).

Detailed balance: p*(z) T(z, z') = p*(z') T(z', z). Detailed balance is a sufficient condition for p*(z) to be invariant; a chain that satisfies it is called reversible.

Ergodicity: the chain must in addition be ergodic, i.e. converge to the invariant distribution for any choice of initial distribution; the invariant distribution is then called the equilibrium distribution.

Properties and validation of MCMC
In practice the transition kernel is often constructed from a set of base transitions B_k, combined for example as a mixture with mixing coefficients α_k.
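A tiny numerical check of these definitions (my own illustration, not from the slides): for a discrete 3-state chain, a transition matrix built to satisfy detailed balance leaves p* invariant, and an ergodic chain reaches p* from any starting distribution.

```python
import numpy as np

# Target distribution over 3 states, and a transition matrix T (rows sum to 1)
# constructed so that detailed balance holds: p*[i] T[i,j] = p*[j] T[j,i].
p_star = np.array([0.2, 0.3, 0.5])
T = np.array([
    [0.70, 0.15, 0.15],
    [0.10, 0.70, 0.20],
    [0.06, 0.12, 0.82],
])

flows = p_star[:, None] * T          # flow i -> j: p*(i) T(i, j)
balance = np.allclose(flows, flows.T)      # detailed balance (reversibility)
invariant = np.allclose(p_star @ T, p_star)  # ... implies invariance

# Ergodicity: iterating the chain converges to p* from any start
dist = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    dist = dist @ T
```
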
Metropolis-Hastings algorithm
If the proposal is symmetric, the Metropolis-Hastings acceptance criterion reduces to the Metropolis one. A common proposal is a Gaussian centered on the current state; its variance is a trade-off:
- small variance -> high acceptance rate, but a slow walk producing strongly dependent samples
- large variance -> high rejection rate
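To show the Hastings correction for an asymmetric proposal, here is a sketch with a log-normal random walk targeting p(z) = exp(-z) on z > 0 (target, proposal, and tuning are my own example): the ratio q(z | z*) / q(z* | z) works out to z*/z, since the symmetric Gaussian term in log-space cancels.

```python
import numpy as np

rng = np.random.default_rng(5)

def mh_exponential(n, sigma=0.8):
    """Metropolis-Hastings for p(z) = exp(-z), z > 0, with the
    asymmetric proposal z* ~ LogNormal(ln z, sigma^2).

    Acceptance: A = min(1, [p(z*) q(z|z*)] / [p(z) q(z*|z)])
              = min(1, exp(z - z*) * z* / z)
    """
    z = 1.0
    chain = np.empty(n)
    for t in range(n):
        z_star = z * np.exp(sigma * rng.standard_normal())
        accept = min(1.0, np.exp(z - z_star) * (z_star / z))
        if rng.uniform() < accept:
            z = z_star
        chain[t] = z
    return chain

chain = mh_exponential(50_000)
# Exp(1) has mean 1
```

The multiplicative proposal keeps every candidate positive, which a symmetric Gaussian step on z would not.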
Gibbs sampling
Each step replaces one variable z_i by a value drawn from its conditional distribution p(z_i | z_{\i}) given the remaining variables; this is repeated by cycling through the variables in order, or by randomly choosing the variable to be updated.
Obtain m independent samples z^(1), z^(2), ..., z^(m):
- run the MCMC through a burn-in period (discarding those draws) to remove dependence on the initial values
- then sample at set time points (e.g. every M-th sample)
The Gibbs sequence converges to a stationary (equilibrium) distribution that is independent of the starting values; by construction this stationary distribution is the target distribution we are trying to simulate.
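The scheme above can be sketched for a zero-mean bivariate Gaussian with correlation ρ, where both full conditionals are themselves Gaussian (the specific ρ, chain length, and burn-in are my own choices):

```python
import numpy as np

rng = np.random.default_rng(6)

def gibbs_bivariate_normal(n, rho=0.8):
    """Gibbs sampler for a zero-mean bivariate Gaussian with correlation rho.

    Full conditionals: z1 | z2 ~ N(rho*z2, 1 - rho^2)
                       z2 | z1 ~ N(rho*z1, 1 - rho^2)
    Each sweep updates the two variables in turn.
    """
    sd = np.sqrt(1.0 - rho ** 2)
    z1, z2 = 0.0, 0.0
    chain = np.empty((n, 2))
    for t in range(n):
        z1 = rng.normal(rho * z2, sd)   # draw z1 from p(z1 | z2)
        z2 = rng.normal(rho * z1, sd)   # draw z2 from p(z2 | z1)
        chain[t] = z1, z2
    return chain

chain = gibbs_bivariate_normal(30_000)
post = chain[2000:]   # discard the burn-in period, as described above
```

No accept/reject step is needed: every conditional draw is accepted, but successive sweeps are correlated, which is why thinning (keeping every M-th sample) is used when near-independent samples are required.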