Monte Carlo and Molecular Dynamics Tools
1. Introduction to Monte Carlo techniques

Torbjörn Sjöstrand
Theoretical High Energy Physics
Department of Astronomy and Theoretical Physics
Lund University
Sölvegatan 14A, SE-223 62 Lund, Sweden

Lund, 5 November 2012
P0: Introduction to Monte Carlo Techniques (TS, A Irbäck)
P1: Write a parton shower (TS)
P2: Markov chain Monte Carlo simulation of protein fibril formation (A Irbäck)
P3: Stellar populations with clusters (M Davies)
P4: Monte Carlo simulation of photon interactions with matter (M Ljungberg)
P5: Molecular simulations (P Linse, M Lund)
Px: longer project continuing on any of the above
My lectures and other course material can be found at http://home.thep.lu.se/~torbjorn/compute2012.html
My involvement: next two+ weeks
Today: introduction, integration of 1-dimensional functions, and random selection according to them
Tomorrow: random number generators, special tricks, more (but still few) dimensions
(Wednesday, Thursday: Anders lectures)
Friday 17.00: deadline for hand-in of two warmup exercises
Next Monday: problems with time evolution/ordering, and the technology of the main project of the week
Next Tuesday: introduction to particle physics, and the context of the main project of the week
Next Wednesday (– Friday): check progress of project
Next Friday 17.00: deadline for hand-in of project report
Following Tuesday: my feedback
Introduction
Buffon's needle (proposed 1733): the probability for a needle to cross a line is related to π (for a needle of length L dropped on lines a distance d ≥ L apart, P = 2L/(πd))
... but gambling & odds are older
Spatial vs. temporal problems
“Spatial” problems: no memory/ordering (this week)
1 Integrate a function
2 Pick a point at random according to a probability distribution
“Temporal” problems: has memory (next week)
1. Radioactive decay: probability for a radioactive nucleus to decay at time t, given that it was created at time 0
In reality combined into multidimensional problems:
1 Random walk (variable step length and direction)
2. Charged particle propagation through matter (stepwise loss of energy by a set of processes)
3 Parton showers (cascade of successive branchings)
Random numbers
For now, assume an algorithm that returns "random numbers" R, uniformly distributed in the range 0 < R < 1 and uncorrelated. More explanation/examples tomorrow (but not my expertise).
Task 1: find and learn how to use a random number generator on your platform of preference.
Fallback: an implementation of the Marsaglia–Zaman–Tsang algorithm is available on my course page. It allows for ~900 000 000 different sequences, numbered seq = 1 upwards.
C++: Rndm rndm(seq); creates and initializes; rndm.flat(); returns the next random number.
Fortran: RNDM(seq) initializes the first time it is called; afterwards always returns the next random number.
Adapt and check that it works (rewritten for standalone use).
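If you do not yet have a generator at hand, here is a minimal sketch using the C++ standard library (std::mt19937 and std::uniform_real_distribution are standard C++11; they stand in for the course's Rndm class and are not part of the course code):

```cpp
#include <iostream>
#include <random>

int main() {
    int seq = 1;                                  // sequence number, like Rndm(seq)
    std::mt19937 rng(seq);                        // Mersenne Twister engine, seeded
    std::uniform_real_distribution<double> flat(0.0, 1.0);

    for (int i = 0; i < 5; ++i)
        std::cout << flat(rng) << "\n";           // uniform R in [0,1)
}
```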
Pick among discrete possibilities
Assume $n$ possible outcomes with (unnormalized) probabilities $P_i$, $1 \leq i \leq n$. Pick one of them according to
1. $i = 0$; $P_R = R \sum_{j=1}^{n} P_j$
2. $i = i + 1$; $P_R = P_R - P_i$
3. if $P_R > 0$ cycle to 2
[Figure: the interval from 0 to $P_1 + P_2 + P_3 + P_4$ split at the cumulative sums $P_1$, $P_1 + P_2$, $P_1 + P_2 + P_3$, with $P_R$ marking the selected outcome]
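As a concrete illustration, a minimal C++ sketch of steps 1–3 (the function name and vector interface are mine; any uniform generator, as above, will do):

```cpp
#include <random>
#include <vector>

// Pick an outcome i (1 <= i <= n) with probability P_i / sum_j P_j.
// probs[0] holds P_1, probs[1] holds P_2, and so on.
int pickDiscrete(const std::vector<double>& probs, std::mt19937& rng) {
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    double sum = 0.0;
    for (double p : probs) sum += p;              // sum_j P_j (unnormalized)
    double pr = flat(rng) * sum;                  // step 1: PR = R * sum
    int i = 0;
    do {
        pr -= probs[i];                           // step 2: PR = PR - P_i
        ++i;
    } while (pr > 0.0 && i < (int)probs.size());  // step 3: cycle while PR > 0
    return i;                                     // 1-based index of the outcome
}
```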
Example 1: Poissonian $P_i = (\langle n\rangle^i / i!)\, e^{-\langle n\rangle}$, $i \geq 0$. Note that $P_i = (\langle n\rangle / i)\, P_{i-1}$:
1. $i = -1$; $P_R = R$; $P_{\text{now}} = e^{-\langle n\rangle}$
2. $i = i + 1$; if $i > 0$: $P_{\text{now}} = P_{\text{now}}\,\langle n\rangle / i$; $P_R = P_R - P_{\text{now}}$
3. if $P_R > 0$ cycle to 2
[Figure: histogram of $P_n$ for a Poissonian with $\langle n\rangle = 2$, $n = 0,\ldots,7$]
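The same loop for the Poissonian, as a C++ sketch (again my own naming; note that no explicit sum is needed, since the $P_i$ are already normalized):

```cpp
#include <cmath>
#include <random>

// Sample i from a Poissonian with mean nbar, using P_i = (nbar/i) P_{i-1}.
int pickPoisson(double nbar, std::mt19937& rng) {
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    double pr   = flat(rng);                      // step 1: PR = R
    double pnow = std::exp(-nbar);                // P_0 = e^{-<n>}
    int i = -1;
    do {
        ++i;                                      // step 2
        if (i > 0) pnow *= nbar / i;              // P_i = (<n>/i) P_{i-1}
        pr -= pnow;
    } while (pr > 0.0);                           // step 3
    return i;
}
```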
Integration and selection
Assume a function $f(x)$, studied in the range $x_{\min} < x < x_{\max}$, where $f(x) \geq 0$ everywhere.
Two connected standard tasks:
1. Calculate (approximately) $\int_{x_{\min}}^{x_{\max}} f(x')\,dx'$
2. Select $x$ at random according to $f(x)$
In task 2, $f(x)$ is viewed as a "probability distribution" with implicit normalization to unit area, and then task 1 provides the overall correct normalization.
Integral as an area/volume
Theorem
An $n$-dimensional integration ≡ an $(n+1)$-dimensional volume:
$$\int f(x_1,\ldots,x_n)\,dx_1\cdots dx_n \equiv \iint_0^{f(x_1,\ldots,x_n)} 1\;dx_1\cdots dx_n\,dx_{n+1}$$
since $\int_0^{f(x)} 1\,dy = f(x)$.
So, for $1+1$ dimensions, selection of $x$ according to $f(x)$ is equivalent to uniform selection of $(x, y)$ in the area $x_{\min} < x < x_{\max}$, $0 < y < f(x)$. Therefore
$$\int_{x_{\min}}^{x} f(x')\,dx' = R \int_{x_{\min}}^{x_{\max}} f(x')\,dx'$$
(the area to the left of the selected $x$ is a uniformly distributed fraction of the whole area)
Basic method 1: analytical solution
If the primitive function $F(x)$ is known, and its inverse $F^{-1}(y)$ is known too, then
$$F(x) - F(x_{\min}) = R\,\left(F(x_{\max}) - F(x_{\min})\right) = R\,A_{\text{tot}} \implies x = F^{-1}(F(x_{\min}) + R\,A_{\text{tot}})$$
Proof: introduce $z = F(x_{\min}) + R\,A_{\text{tot}}$. Then
$$\frac{dP}{dx} = \frac{dP}{dR}\,\frac{dR}{dx} = 1\cdot\frac{1}{\frac{dx}{dz}\,\frac{dz}{dR}} = \frac{1}{\frac{dF^{-1}(z)}{dz}\,A_{\text{tot}}} = \frac{dF(x)/dx}{A_{\text{tot}}} = \frac{f(x)}{A_{\text{tot}}}$$
Example 2: $f(x) = 2x$, $0 < x < 1$ $\implies$ $F(x) = x^2$;
$F(x) - F(0) = R\,(F(1) - F(0)) \implies x^2 = R \implies x = \sqrt{R}$
Example 3: $f(x) = e^{-x}$, $x > 0$, $F(x) = 1 - e^{-x}$;
$1 - e^{-x} = R \implies e^{-x} = 1 - R \equiv R$ (also uniform) $\implies x = -\ln R$
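Examples 2 and 3 translate directly into code; a sketch (helper names are mine, flat(rng) is a uniform number as before):

```cpp
#include <cmath>
#include <random>

std::mt19937 rng(1);
std::uniform_real_distribution<double> flat(0.0, 1.0);

// Example 2: f(x) = 2x on 0 < x < 1, F(x) = x^2, so x = sqrt(R).
double sampleLinear() { return std::sqrt(flat(rng)); }

// Example 3: f(x) = e^{-x} on x > 0, F(x) = 1 - e^{-x}, so x = -ln R
// (1 - flat(rng) lies in (0,1], which protects against log(0)).
double sampleExponential() { return -std::log(1.0 - flat(rng)); }
```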
Basic method 2: hit-and-miss
If $f(x) \leq f_{\max}$ in $x_{\min} < x < x_{\max}$, use the interpretation as an area:
1. select $x = x_{\min} + R\,(x_{\max} - x_{\min})$
2. select $y = R\,f_{\max}$ (new $R$!)
3. while $y > f(x)$ cycle to 1
Integral as a by-product:
$$I = \int_{x_{\min}}^{x_{\max}} f(x)\,dx = f_{\max}\,(x_{\max} - x_{\min})\,\frac{N_{\text{acc}}}{N_{\text{try}}} = A_{\text{tot}}\,\frac{N_{\text{acc}}}{N_{\text{try}}}$$
Binomial distribution with $p = N_{\text{acc}}/N_{\text{try}}$ and $q = N_{\text{fail}}/N_{\text{try}}$, so the error is
$$\frac{\delta I}{I} = \frac{A_{\text{tot}}\,\sqrt{p\,q/N_{\text{try}}}}{A_{\text{tot}}\,p} = \sqrt{\frac{q}{p\,N_{\text{try}}}} = \sqrt{\frac{q}{N_{\text{acc}}}} < \frac{1}{\sqrt{N_{\text{acc}}}}$$
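Put together, a self-contained hit-and-miss sketch for $f(x) = x^2$ on $0 < x < 1$ (structure and names are mine), including the integral and error estimate:

```cpp
#include <cmath>
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(1);
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    auto f = [](double x) { return x * x; };      // f(x) = x^2 as a test case
    const double xmin = 0.0, xmax = 1.0, fmax = 1.0;

    const long ntry = 100000;
    long nacc = 0;
    for (long i = 0; i < ntry; ++i) {
        double x = xmin + flat(rng) * (xmax - xmin);  // step 1
        double y = flat(rng) * fmax;                  // step 2 (new R!)
        if (y <= f(x)) ++nacc;                        // accepted: a "hit"
    }
    double atot = fmax * (xmax - xmin);
    double I = atot * double(nacc) / double(ntry);    // I = Atot Nacc/Ntry
    double q = 1.0 - double(nacc) / double(ntry);
    std::cout << "I = " << I << " +- "
              << I * std::sqrt(q / double(nacc)) << "\n"; // dI/I = sqrt(q/Nacc)
}
```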
Hit-and-miss (2)
Example 4:
$f(x) = x^\alpha$, $\alpha > 0$, $0 \leq x \leq 1$ $\Rightarrow$ $0 \leq f(x) \leq 1$
$F(x) = x^{\alpha+1}/(\alpha+1)$
$p = I = \int_0^1 f(x)\,dx = 1/(\alpha+1)$
$q = 1 - I = \alpha/(\alpha+1)$
[Figure: the curves $x$, $x^2$ and $\sqrt{x}$ on the unit square]

α      I      √Ntry δI   √Ntry (δI/I)   √Nacc (δI/I)
1      0.5    0.5        1              0.707
2      0.333  0.471      1.41           0.816
1/2    0.667  0.471      0.706          0.577
9      0.1    0.3        3.0            0.949
1/9    0.9    0.3        0.333          0.316
Crude Monte Carlo integration
Hit-and-miss is not the most efficient integration method: for each $x$ picked, and $f(x)$ evaluated, only the accept/reject statistics are used.
Better to use the full $f(x)$ information:
$$I = \int_{x_{\min}}^{x_{\max}} f(x)\,dx = (x_{\max} - x_{\min})\,\frac{1}{N_{\text{try}}}\sum_{i=1}^{N_{\text{try}}} f(x_i) = \Delta x\,\langle f(x)\rangle$$
$$\delta I = \frac{\Delta x}{\sqrt{N_{\text{try}}}}\,\sqrt{\langle f^2(x)\rangle - \langle f(x)\rangle^2}\,, \qquad \text{with } \langle f^2(x)\rangle = \frac{1}{N_{\text{try}}}\sum_{i=1}^{N_{\text{try}}} f^2(x_i)$$
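The corresponding crude Monte Carlo sketch for the same test function (structure mine):

```cpp
#include <cmath>
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(1);
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    auto f = [](double x) { return x * x; };      // f(x) = x^2 as a test case
    const double xmin = 0.0, xmax = 1.0, dx = xmax - xmin;

    const long ntry = 100000;
    double sum = 0.0, sum2 = 0.0;
    for (long i = 0; i < ntry; ++i) {
        double fx = f(xmin + flat(rng) * dx);
        sum  += fx;                               // for <f(x)>
        sum2 += fx * fx;                          // for <f^2(x)>
    }
    double favg = sum / ntry, f2avg = sum2 / ntry;
    double I  = dx * favg;                        // I = Dx <f(x)>
    double dI = dx * std::sqrt((f2avg - favg * favg) / ntry);
    std::cout << "I = " << I << " +- " << dI << "\n";
}
```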
Crude Monte Carlo integration (2)
Example 4 (continued):
$f(x) = x^\alpha$, $\alpha > 0$, $0 \leq x \leq 1$
$$\langle f(x)\rangle = \int_0^1 x^\alpha\,dx = \frac{1}{\alpha+1}\,, \qquad \langle f^2(x)\rangle = \int_0^1 x^{2\alpha}\,dx = \frac{1}{2\alpha+1}$$
$$\delta f = \sqrt{\langle f^2(x)\rangle - \langle f(x)\rangle^2} = \frac{\alpha}{(\alpha+1)\sqrt{2\alpha+1}}$$
[Figure: the curves $x$, $x^2$ and $\sqrt{x}$ on the unit square]

α      I      √Ntry δI (hit-and-miss)   √Ntry δI (crude MC)
1      0.5    0.5                       0.289
2      0.333  0.471                     0.298
1/2    0.667  0.471                     0.236
9      0.1    0.3                       0.206
1/9    0.9    0.3                       0.090
Conventional integration
For a comparable number of points $N$, the error then scales like
Monte Carlo   $1/\sqrt{N}$
Trapezoid     $1/N^2$
Simpson       $1/N^4$
Monte Carlo will not win for 1-dimensional integration.
Conventional integration (2)
The game changes for $d$ dimensions:
Monte Carlo   $1/\sqrt{N}$
Trapezoid     $1/N^{2/d}$
Simpson       $1/N^{4/d}$
Also: in 20 dimensions, Simpson ⇒ $3^{20} \approx 3 \cdot 10^9$ points, all except one on the border.
Generally, advantages of simple Monte Carlo integration include
proportionately faster convergence in many dimensions
discontinuous functions no problem
arbitrarily complex integration regions
few points needed to get first estimate
easy error estimate
by-product of Monte Carlo selection of x
Detour: stratified sampling
Split the integration range into subranges (adjoining, non-overlapping). Assume $n$ subranges, $1 \leq i \leq n$, $\Delta x_i = x_{i,\max} - x_{i,\min}$, and $N_i$ points in the respective subrange:
$$I = \sum_{i=1}^{n} I_i = \sum_{i=1}^{n} \Delta x_i\,\langle f(x)\rangle_i$$
$$(\delta I)^2 = \sum_{i=1}^{n} \frac{(\Delta x_i\,\delta f_i)^2}{N_i} = \sum_{i=1}^{n} \frac{(\Delta x_i)^2}{N_i}\left(\langle f^2(x)\rangle_i - \langle f(x)\rangle_i^2\right)$$
Uniform stratification (all $\Delta x_i$ and $N_i$ the same) does reduce $\delta I$. Ultimately, with $N_i = 1$ and $n$ large, this approaches conventional integration. Variance reduction: pick smaller ranges wherever $f'(x)$ is large, rather than where $f(x)$ itself is. Wrong way to go for selection according to a distribution! From now on, only study techniques that allow (unbiased) selection.
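For the integration use alone, a uniformly stratified variant of the crude Monte Carlo sketch could look like this (my own structure; $n$ equal subranges with equal $N_i$):

```cpp
#include <cmath>
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(1);
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    auto f = [](double x) { return x * x; };      // same test function

    const int  nsub = 10;                         // n equal subranges of (0,1)
    const long nper = 10000;                      // N_i points in each
    double I = 0.0, dI2 = 0.0;
    for (int i = 0; i < nsub; ++i) {
        const double dxi = 1.0 / nsub, xlo = i * dxi;
        double sum = 0.0, sum2 = 0.0;
        for (long j = 0; j < nper; ++j) {
            double fx = f(xlo + flat(rng) * dxi);
            sum += fx; sum2 += fx * fx;
        }
        double favg = sum / nper, f2avg = sum2 / nper;
        I   += dxi * favg;                        // I = sum_i Dx_i <f>_i
        dI2 += dxi * dxi * (f2avg - favg * favg) / nper;  // (dI)^2 term
    }
    std::cout << "I = " << I << " +- " << std::sqrt(dI2) << "\n";
}
```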
Importance sampling
Improved version of hit-and-miss: if $f(x) \leq g(x)$ in $x_{\min} < x < x_{\max}$, and $G(x) = \int g(x')\,dx'$ is simple and $G^{-1}(y)$ is simple:
1. select $x$ according to the $g(x)$ distribution
2. select $y = R\,g(x)$ (new $R$!)
3. while $y > f(x)$ cycle to 1
Example 5: $f(x) = x\,e^{-x}$, $x > 0$
Attempt 1: $F(x) = 1 - (1 + x)\,e^{-x}$ is not invertible
Attempt 2: $f(x) \leq f(1) = e^{-1}$, but $0 < x < \infty$ (so hit-and-miss with a constant $f_{\max}$ fails)
Importance sampling (2)
Attempt 3: $g(x) = N\,e^{-x/2}$. Require
$$\frac{f(x)}{g(x)} = \frac{x\,e^{-x}}{N\,e^{-x/2}} = \frac{x\,e^{-x/2}}{N} \leq 1$$
for rejection to work, so find the maximum:
$$\frac{d}{dx}\left(\frac{f(x)}{g(x)}\right) = \frac{1}{N}\left(1 - \frac{x}{2}\right) e^{-x/2} = 0 \implies x = 2$$
Normalize so $g(2) = f(2)$ $\Rightarrow$ $N = 2/e$;
$G(x) \propto 1 - e^{-x/2} = R$ $\Rightarrow$ $x = -2\ln R$
1. select $x = -2\ln R$
2. select $y = R\,g(x) = R\,2\,e^{-(1+x/2)}$ (new $R$!)
3. while $y > f(x) = x\,e^{-x}$ cycle to 1
$$\text{efficiency} = \frac{\int_0^\infty f(x)\,dx}{\int_0^\infty g(x)\,dx} = \frac{e}{4}$$
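Attempt 3 as a C++ sketch (function name mine; $-2\ln R$ implements $G^{-1}$, and the $1 - R$ guard avoids $\log(0)$):

```cpp
#include <cmath>
#include <random>

// Example 5: sample f(x) = x e^{-x} on x > 0, majorized by g(x) = (2/e) e^{-x/2}.
double sampleXExp(std::mt19937& rng) {
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    while (true) {
        double x = -2.0 * std::log(1.0 - flat(rng));             // step 1: x from g
        double y = flat(rng) * 2.0 * std::exp(-(1.0 + x / 2.0)); // step 2: y = R g(x)
        if (y <= x * std::exp(-x)) return x;                     // step 3: accept
    }
}
```

On average, e/4 ≈ 68% of the attempts are accepted, matching the efficiency above.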
Variable transformation
Importance sampling can be reinterpreted as a variable transformation:
$$\int f(x)\,dx = \int \frac{f(x)}{g(x)}\,g(x)\,dx = \int \frac{f(x)}{g(x)}\,dG(x)$$
map to a finite $x$ range
map away singular/peaked regions
Example 6: $f(x) = \exp(-x^2)$, $1 \leq x < \infty$ (or $\exp(-x^\alpha)$ with $\alpha > 1$).
Define $t = \exp(-x) \Rightarrow x = -\ln t$, $0 \leq t \leq 1/e$:
$$\int_1^\infty e^{-x^2}\,dx = \int_1^\infty \frac{e^{-x^2}}{e^{-x}}\,e^{-x}\,dx = \int_0^{1/e} \frac{e^{-\ln^2 t}}{t}\,dt = \int_0^{1/e} t^{-1-\ln t}\,dt$$
Pick $t$ uniformly in $0 < t \leq 1/e$, repeatedly until $t^{-1-\ln t} > R$, and then obtain $x = -\ln t$.
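A sketch of Example 6 (name mine; the weight $t^{-1-\ln t} \leq 1$ on $0 < t \leq 1/e$ serves as the acceptance probability):

```cpp
#include <cmath>
#include <random>

// Example 6: sample f(x) = exp(-x^2) on x >= 1 via the mapping t = exp(-x).
double sampleGaussianTail(std::mt19937& rng) {
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    const double tmax = std::exp(-1.0);                  // t in (0, 1/e]
    while (true) {
        double t = flat(rng) * tmax;                     // pick t uniformly
        if (t <= 0.0) continue;                          // guard against t = 0
        double w = std::pow(t, -1.0 - std::log(t));      // weight t^{-1-ln t}
        if (w > flat(rng)) return -std::log(t);          // accept: x = -ln t
    }
}
```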
Multichannel
If $f(x) \leq g(x) = \sum_i g_i(x)$, where all $g_i$ are "nice" ($G_i(x)$ invertible) but $g(x)$ is not:
1. select $i$ with relative probability $A_i = \int_{x_{\min}}^{x_{\max}} g_i(x')\,dx'$
2. select $x$ according to $g_i(x)$
3. select $y = R\,g(x) = R \sum_i g_i(x)$
4. while $y > f(x)$ cycle to 1
Works since
$$\int f(x)\,dx = \int \frac{f(x)}{g(x)}\,\sum_i g_i(x)\,dx = \sum_i A_i \int \frac{g_i(x)\,dx}{A_i}\,\frac{f(x)}{g(x)}$$
Multichannel (2)
Example 7:
$$f(x) = \frac{1}{\sqrt{x\,(1-x)}}\,, \qquad 0 < x < 1$$
$$g(x) = \frac{1}{\sqrt{x}} + \frac{1}{\sqrt{1-x}} = \frac{\sqrt{x} + \sqrt{1-x}}{\sqrt{x\,(1-x)}}\,, \qquad \frac{1}{\sqrt{2}} \leq \frac{f(x)}{g(x)} \leq 1$$
1. if $R < 1/2$ then $g_1(x)$, else $g_2(x)$ (since $A_1 = A_2 = 2$)
2. $g_1$: $G_1(x) = 2\sqrt{x} = 2R \implies x = R^2$
   $g_2$: $G_2(x) = 2\,(1 - \sqrt{1-x}) = 2R \implies x = 1 - R^2$ (with $1 - R \to R$)
3., 4. as on the previous page
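Example 7 as a C++ sketch (name mine; the channel choice uses $A_1 = A_2 = 2$):

```cpp
#include <cmath>
#include <random>

// Example 7: sample f(x) = 1/sqrt(x(1-x)) on 0 < x < 1 with the two
// channels g_1(x) = 1/sqrt(x) and g_2(x) = 1/sqrt(1-x).
double sampleMultichannel(std::mt19937& rng) {
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    while (true) {
        double r = flat(rng);
        double x = (flat(rng) < 0.5) ? r * r            // steps 1-2: channel g_1
                                     : 1.0 - r * r;     //            or channel g_2
        if (x <= 0.0 || x >= 1.0) continue;             // guard the endpoints
        double fx = 1.0 / std::sqrt(x * (1.0 - x));
        double gx = 1.0 / std::sqrt(x) + 1.0 / std::sqrt(1.0 - x);
        if (flat(rng) * gx <= fx) return x;             // steps 3-4: y = R g(x)
    }
}
```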
Multichannel – alternative approach
Recall the previous formula:
$$\int f(x)\,dx = \int \frac{f(x)}{g(x)}\,\sum_i g_i(x)\,dx = \sum_i A_i \int \frac{g_i(x)\,dx}{A_i}\,\frac{f(x)}{g(x)}$$
Now assume a split $f(x) = \sum_i f_i(x)$, with $f_i(x) < g_i(x)$.
(Brute force: always possible with $f_i(x) = (g_i(x)/g(x))\,f(x)$.)
Then
$$\int f(x)\,dx = \int \sum_i \frac{f_i(x)}{g_i(x)}\,g_i(x)\,dx = \sum_i A_i \int \frac{g_i(x)\,dx}{A_i}\,\frac{f_i(x)}{g_i(x)}$$
1. select $i$ with relative probability $A_i$
2. select $x$ according to $g_i(x)$
3. select $y = R\,g_i(x)$
4. while $y > f_i(x)$ cycle to 1
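The same Example 7 in this per-channel variant, with the brute-force split $f_i = (g_i/g)\,f$ (a sketch, names mine):

```cpp
#include <cmath>
#include <random>

// Alternative multichannel for Example 7: reject within the chosen
// channel only, using f_i = (g_i/g) f instead of the full f and g.
double sampleMultichannelAlt(std::mt19937& rng) {
    std::uniform_real_distribution<double> flat(0.0, 1.0);
    while (true) {
        bool ch1 = flat(rng) < 0.5;                     // step 1: A_1 = A_2 = 2
        double r = flat(rng);
        double x = ch1 ? r * r : 1.0 - r * r;           // step 2: x from g_i
        if (x <= 0.0 || x >= 1.0) continue;             // guard the endpoints
        double g1 = 1.0 / std::sqrt(x), g2 = 1.0 / std::sqrt(1.0 - x);
        double fx = 1.0 / std::sqrt(x * (1.0 - x));
        double gi = ch1 ? g1 : g2;
        double fi = gi / (g1 + g2) * fx;                // f_i = (g_i/g) f
        if (flat(rng) * gi <= fi) return x;             // steps 3-4
    }
}
```

By construction f_i/g_i = f/g, so the acceptance rate is the same as on the previous page; the gain is that only the chosen channel's functions need to be evaluated when the split is not brute force.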
Summary
Discussed today:
Pick among discrete possibilities
Integral as an area/volume
Analytical solution
Hit-and-miss
Crude Monte Carlo integration
Conventional integration
Stratified sampling
Importance sampling
Variable transformations
Multichannel
To come:
Random number generators
Special tricks
Several dimensions
Time evolution/ordering
Combining it: a particle physics example
Material:
Exercises, week 1 and 2
these lecture notes (need more? on what?)