Transcript
Page 1: Probabilistic Programming for Evolutionary Biology

Benjamin Redelings

June 24, 2014

Page 2: Probabilistic Programming

Easy to think ⟹ easy to write, easy to run

1. Write the model description; generate the inference method.

2. Modular models.

3. Don't resort to C++/Java to write simple things.

4. Allow graphical models with changing graphs & data structures.

5. Lazy computation for MCMC.
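Point 1 above can be made concrete with a minimal sketch: if the model is supplied purely as a description (here, a log-density function), a single generic sampler serves as the automatically generated inference method. This is an illustrative Python sketch, not BAli-Phy's actual machinery; all names are hypothetical.

```python
import math
import random

def metropolis(log_density, x0, n_steps, step_size=0.5):
    """Generic Metropolis sampler: inference is derived from the
    model description (a log-density function) alone."""
    x, samples = x0, []
    lp = log_density(x)
    for _ in range(n_steps):
        prop = x + random.gauss(0.0, step_size)
        lp_prop = log_density(prop)
        # Accept with probability min(1, p(prop) / p(x))
        if math.log(random.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Model description: a standard normal, written with no reference
# to any particular inference method.
def log_std_normal(x):
    return -0.5 * x * x

random.seed(1)
samples = metropolis(log_std_normal, x0=0.0, n_steps=5000)
mean = sum(samples) / len(samples)  # should be near 0
```

Swapping in a different `log_density` changes the model without touching the sampler, which is the separation the slide is advocating.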


Page 8: Graphical Models

Page 10: Graphical Models

x ∼ normal(if i then y else z, σ²)
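In this expression the parent of x is itself chosen by the random variable i, so the graph's edges change with i's value. A Python sketch of a single draw from this model (the hyperparameter values y, z, σ are illustrative, not from the slides):

```python
import random

random.seed(0)

# Hypothetical hyperparameter values, for illustration only.
y, z, sigma = -5.0, 5.0, 1.0

def sample_x():
    i = random.random() < 0.5         # i ~ bernoulli 0.5
    mean = y if i else z              # if i then y else z
    return random.gauss(mean, sigma)  # x ~ normal(mean, sigma^2)

draws = [sample_x() for _ in range(2000)]
# Marginally, x is a two-component mixture centred near -5 and +5.
near_y = sum(1 for d in draws if d < 0)
```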


Page 13: Graphical Models

x ∼ normal(if i then y else z.1, σ²)

Page 14: Extensions of Graphical Models

1. Control flow

   ▶ x ∼ normal(if i then y else z, σ²)
   ▶ x[i] = z[category[i]]
   ▶ x[i] ∼ normal(x[parent[i]], σ²)

2. Data structures (native): pairs (x, y), lists [x, y, z], ReversibleMarkov α Q π Λ

3. Random numbers of random variables

   ▶ n ∼ geometric 0.5
   ▶ x ∼ iid n (normal 0 1)
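Point 3, a random number of random variables, is the extension a fixed graphical model cannot express: the dimension of x is itself a draw. A minimal Python sketch (the 1-based geometric support is my assumption, not stated on the slide):

```python
import random

random.seed(42)

def geometric(p):
    """Trials until first success; support 1, 2, 3, ... (assumed)."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

def sample():
    n = geometric(0.5)                          # n ~ geometric 0.5
    x = [random.gauss(0, 1) for _ in range(n)]  # x ~ iid n (normal 0 1)
    return n, x

n, x = sample()
# len(x) == n on every draw: the model's dimension is random.
```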


Page 20: Models (and Distributions) are Functions

Consider the M0(κ, ω) codon model for positive selection.

   ▶ Really, HKY(κ) is a nucleotide submodel → M0(HKY(κ), ω)

We really want models to be parameterized by other models

   ▶ ... and distributions parameterized by distributions!
   ▶ logNormal µ σ = expTransform (normal µ σ)
   ▶ dirichlet_process n α (normal 0 1)

Sometimes we want models parameterized by functions on models.

   ▶ M8 = mixture (beta a b) (\w -> m0(k, w))
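Both ideas on this slide fall out naturally once a distribution is a first-class value. In the Python sketch below a distribution is represented as a zero-argument sampling function; `exp_transform` is a distribution built from a distribution, and `mixture` is a model built from a *function* on models. The stand-in for m0(k, w) is hypothetical; only the structure mirrors the slide.

```python
import math
import random

random.seed(7)

# A distribution as a zero-argument sampling function (a thunk).
def normal(mu, sigma):
    return lambda: random.gauss(mu, sigma)

def exp_transform(dist):
    """Distribution parameterized by a distribution:
    logNormal mu sigma = expTransform (normal mu sigma)."""
    return lambda: math.exp(dist())

log_normal = exp_transform(normal(0.0, 1.0))

def mixture(prior, make_model):
    """Model parameterized by a function on models, as in
    M8 = mixture (beta a b) (\\w -> m0(k, w)): draw w from the
    prior, then sample from the model built from w."""
    def draw():
        w = prior()
        return make_model(w)()
    return draw

# Hypothetical stand-in for m0(k, w): a normal whose mean is w.
def beta_prior():
    return random.betavariate(2.0, 2.0)

m8 = mixture(beta_prior, lambda w: normal(w, 0.1))

value = log_normal()  # always positive, by construction
sample = m8()
```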

Page 29: Future Work

1. Dynamic instantiation of random variables:

   ▶ xs = repeat (normal 0 1)
   ▶ n = geometric 0.5
   ▶ y = f (take n xs)
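The point of the sketch above is laziness: `repeat (normal 0 1)` denotes an infinite stream, yet only the prefix that `f` actually consumes is ever instantiated. A Python generator gives the same behaviour (here `f` is an arbitrary illustrative function, taken to be `sum`):

```python
import itertools
import random

random.seed(3)

def normals():
    """Lazy infinite stream of standard-normal draws,
    like xs = repeat (normal 0 1)."""
    while True:
        yield random.gauss(0, 1)

def geometric(p):
    n = 1
    while random.random() >= p:
        n += 1
    return n

xs = normals()                          # nothing sampled yet
n = geometric(0.5)                      # n ~ geometric 0.5
taken = list(itertools.islice(xs, n))   # take n xs: exactly n draws happen
y = sum(taken)                          # f applied to the finite prefix
```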

Page 33: Source

https://github.com/bredelings/BAli-Phy

Other software for Bayesian inference:

   ▶ RevBayes
   ▶ BEAST 1
   ▶ BEAST 2
   ▶ Church
   ▶ Venture
