(Last adjustments: December 6, 2017)
Workshop on statistical challenges in astronomy–Hierarchical models in Stan
Presenter
Dr. John T. Ormerod
School of Mathematics & Statistics F07
University of Sydney
(w) 02 9351 5883
(e) john.ormerod (at) sydney.edu.au
Why MCMC?
• Do you have data? (x)
• Do you want to build a rich statistical model? (p(x|θ))
• Perhaps you want to incorporate prior information? (p(θ))
• Are the integrals needed to obtain posterior densities intractable?

  p(θ|x) = p(x|θ) p(θ) / p(x) = p(x, θ) / ∫ p(x, θ) dθ

• Are point estimates not adequate? (If not, Variational Bayes might be for you.)
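To make the intractable-integral point concrete, here is a small sketch (in Python, with an entirely made-up 1-D example: a normal likelihood with a Cauchy prior, so the evidence ∫ p(x, θ) dθ has no closed form) of the brute-force grid alternative to MCMC. All names and numbers are illustrative assumptions, not from the slides.

```python
import numpy as np

# Hypothetical 1-D example: N(theta, 1) likelihood, standard Cauchy prior,
# so the normalising integral p(x) has no closed form.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.5, scale=1.0, size=20)  # simulated data

def log_joint(theta):
    # log p(x, theta) = log p(x | theta) + log p(theta), up to constants
    log_lik = -0.5 * np.sum((x - theta) ** 2)   # normal likelihood
    log_prior = -np.log(1.0 + theta ** 2)       # Cauchy prior
    return log_lik + log_prior

# Grid approximation of p(theta | x) = p(x, theta) / integral of p(x, theta).
grid = np.linspace(-5.0, 5.0, 2001)
dg = grid[1] - grid[0]
lj = np.array([log_joint(t) for t in grid])
w = np.exp(lj - lj.max())          # stabilise before exponentiating
post = w / (w.sum() * dg)          # normalise numerically

print("approximate posterior mean:", np.sum(grid * post) * dg)
```

A grid with m points per dimension needs m^d evaluations in d dimensions, which is exactly why the sampling methods on the following slides are needed once θ has more than a few components.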
Review: MCMC
• Markov chain Monte Carlo: the samples form a Markov chain.
• Markov property:

  p(θ_{t+1} | θ_t, . . . , θ_1) = p(θ_{t+1} | θ_t)

• Invariant distribution:

  πP = π

• Detailed balance, a sufficient condition for invariance: writing the transition kernel as

  P(θ_t, A) = ∫_A q(θ_t, θ_{t+1}) dθ_{t+1},

  the chain satisfies detailed balance if

  π(θ_{t+1}) q(θ_{t+1}, θ_t) = π(θ_t) q(θ_t, θ_{t+1}).

• A Markov chain satisfying detailed balance will (under mild regularity conditions) converge in distribution to π(θ).
• We want to design Markov chains whose invariant distribution is the posterior distribution.
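The invariance and detailed balance conditions above can be checked numerically on a toy example. This sketch (in Python rather than Stan; the 3-state target π and the uniform neighbour proposal are made-up illustrations) builds a Metropolis-style transition matrix P that satisfies detailed balance with respect to π, then verifies πP = π.

```python
import numpy as np

# Hypothetical 3-state target distribution.
pi = np.array([0.2, 0.3, 0.5])

# Symmetric proposal: pick one of the other two states uniformly.
prop = np.array([[0.0, 0.5, 0.5],
                 [0.5, 0.0, 0.5],
                 [0.5, 0.5, 0.0]])

# Metropolis construction: accept a proposed move i -> j with
# probability min(1, pi_j / pi_i); unused mass stays at i.
P = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        if i != j:
            P[i, j] = prop[i, j] * min(1.0, pi[j] / pi[i])
    P[i, i] = 1.0 - P[i].sum()

# Detailed balance: pi_i P_ij = pi_j P_ji for all i, j ...
for i in range(3):
    for j in range(3):
        assert np.isclose(pi[i] * P[i, j], pi[j] * P[j, i])

# ... which implies invariance, pi P = pi.
print(np.allclose(pi @ P, pi))  # -> True
```

The same logic is what the Metropolis-Hastings acceptance probability on the next slide implements for continuous θ.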
Review: Random Walk Metropolis Hastings
• We want samples from the posterior distribution: p(θ|x) ∝ p(x, θ).
• Algorithm (suppose θ ∈ R^d):
  ◦ Suppose we have x, p(x, θ) and a starting value θ_0. Choose a proposal covariance B ∈ R^{d×d}.
  ◦ Loop over t = 1, . . . , T:
    ∗ Sample θ_prop ∼ N(θ_{t−1}, B).
    ∗ With probability

      α = min[ 1, p(x, θ_prop) / p(x, θ_{t−1}) ]

      set θ_t = θ_prop; otherwise set θ_t = θ_{t−1}.
• The above Markov chain can be shown (under mild conditions) to satisfy the detailed balance condition and to converge in distribution to p(θ|x).
• In practice, for sufficiently large T, we can treat the samples {θ_t}_{t=1}^T as (correlated) draws from p(θ|x).
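A minimal Python sketch of the loop above, with a standard bivariate normal standing in for p(x, θ) and working on the log scale for numerical stability. The target, the proposal covariance B, and T are illustrative choices, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_joint(theta):
    # Example target: standard bivariate normal (any log p(x, theta) works).
    return -0.5 * theta @ theta

d, T = 2, 5000
B = 0.5 * np.eye(d)              # proposal covariance (a tuning choice)
L = np.linalg.cholesky(B)
theta = np.zeros(d)              # theta_0
samples = np.empty((T, d))

for t in range(T):
    theta_prop = theta + L @ rng.normal(size=d)   # theta_prop ~ N(theta_{t-1}, B)
    # alpha = min(1, p(x, theta_prop) / p(x, theta_{t-1})), on the log scale:
    if np.log(rng.uniform()) < log_joint(theta_prop) - log_joint(theta):
        theta = theta_prop       # accept
    samples[t] = theta           # on rejection, keep the previous state

print(samples.mean(axis=0))      # should be near the target mean (0, 0)
```

Note that only the ratio of joint densities is needed, so the intractable normalising constant p(x) never has to be computed.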
Stan: Hamiltonian Monte Carlo
• In the previous example a multivariate normal distribution was used to generate proposal samples.
• Stan uses Hamiltonian dynamics (Hamiltonian Monte Carlo) to generate good proposal samples and then uses an accept-reject step to ensure that the Markov chain mimics the posterior distribution.
• See the HMC notes for details.
• HMC samples are typically of much higher quality, and HMC requires less tuning, than other samplers.
Stan: Where to get help
In anticipation that we will not get through everything today, I will begin with a list of resources on Stan:
• Homepage: http://mc-stan.org/
• User guide: http://mc-stan.org/users/
• Documentation, tutorials and case studies: http://mc-stan.org/users/documentation/index.html
• Nice book: "Bayesian Models for Astrophysical Data Using R, JAGS, Python, and Stan" by Joseph M. Hilbe, Rafael S. de Souza & Emille E. O. Ishida