
Semimartingales and stochastic integration
Spring 2011

Sergio Pulido∗

Chris Almost†

Contents

0 Motivation

1 Preliminaries
1.1 Review of stochastic processes
1.2 Review of martingales
1.3 Poisson process and Brownian motion
1.4 Lévy processes
1.5 Lévy measures
1.6 Localization
1.7 Integration with respect to processes of finite variation
1.8 Naïve stochastic integration is impossible

2 Semimartingales and stochastic integration
2.1 Introduction
2.2 Stability properties of semimartingales
2.3 Elementary examples of semimartingales
2.4 The stochastic integral as a process
2.5 Properties of the stochastic integral
2.6 The quadratic variation of a semimartingale
2.7 Itô's formula
2.8 Applications of Itô's formula

3 The Bichteler-Dellacherie Theorem and its connexions to arbitrage
3.1 Introduction
3.2 Proofs of Theorems 3.1.7 and 3.1.8
3.3 A short proof of the Doob-Meyer theorem
3.4 Fundamental theorem of local martingales
3.5 Quasimartingales, compensators, and the fundamental theorem of local martingales
3.6 Special semimartingales and another decomposition theorem for local martingales
3.7 Girsanov's theorem

4 General stochastic integration
4.1 Stochastic integrals with respect to predictable processes

Index

∗ [email protected]
† [email protected]


Chapter 0

Motivation

Why stochastic integration with respect to semimartingales with jumps?

To model "unpredictable" events (e.g. default times in credit risk theory) one needs to consider models with jumps. Moreover, a lot of interesting stochastic processes jump, e.g. the Poisson process and Lévy processes.

This course will closely follow the textbook, Stochastic integration and differential equations by Philip E. Protter, second edition. We will not cover every chapter, and some proofs given in the course will differ from those in the text. The following numbers correspond to sections in the textbook.

I. Preliminaries.
1. Filtrations, stochastic processes, stopping times, path regularity, "functional" monotone class theorem, optional σ-algebra.
2. Martingales.
3. Poisson processes, Brownian motion.
4. Lévy processes.
6. Localization procedure for stochastic processes.
7. Stieltjes integration.
8. Impossibility of naïve stochastic integration (via the Banach-Steinhaus theorem).

II. Semimartingales and stochastic integrals.
1–3. Definition of the stochastic integral with respect to processes in L.
5. Properties of the stochastic integral.
6. Quadratic variation.
7. Itô's formula.
8. Stochastic exponential and Lévy's characterization theorem.

III. Bichteler-Dellacherie theorem.
(NFLVR) implies S is a semimartingale.
(NFLVR) with little investment if and only if S is a semimartingale.

IV. Stochastic integration with respect to predictable processes and martingale representation theorems (i.e. market completeness).

For more information on the history of the development of stochastic integration, see the paper by Protter and Jarrow on that topic.


Chapter 1

Preliminaries

1.1 Review of stochastic processes

The standard setup we will use is that of a complete probability space (Ω, F, P) and a filtration F = (F_t)_{0≤t≤∞} of sub-σ-algebras of F. The filtration can be thought of as the flow of information. Expectation E will always be with respect to P unless stated otherwise.

Notation. We will use the convention that t, s, and u will always be real variables, not including ∞ unless it is explicitly mentioned, e.g. {t | t ≥ 0} = [0, ∞). On the other hand, n and k will always be integers, e.g. {n : n ≥ 0} = {0, 1, 2, 3, . . .} =: N.

1.1.1 Definition. (Ω, F, F, P) satisfies the usual conditions if
(i) F_0 contains all the P-null sets;
(ii) F is right continuous, i.e. F_t = ⋂_{s>t} F_s for all t.

1.1.2 Definition. A stopping time is a random time, i.e. a measurable function T : Ω → [0, ∞], such that {T ≤ t} ∈ F_t for all t.

1.1.3 Theorem. The following are equivalent.
(i) T is a stopping time.
(ii) {T < t} ∈ F_t for all t > 0.

PROOF: Assume T is a stopping time. Then for any t > 0,

{T < t} = ⋃_{n≥1} {T ≤ t − 1/n} ∈ ⋁_{s<t} F_s ⊆ F_t.

Conversely, since the filtration is assumed to be right continuous,

{T ≤ t} = ⋂_{n≥1} {T < t + 1/n} ∈ ⋂_{s>t} F_s = F_{t+} = F_t,

so T is a stopping time.


1.1.4 Theorem.
(i) If (T_n)_{n≥1} is a sequence of stopping times then ⋀_n T_n and ⋁_n T_n are stopping times.
(ii) If T and S are stopping times then T + S is a stopping time.

1.1.5 Exercises.
(i) If T ≥ S then is T − S a stopping time?
(ii) For which constants α is αT a stopping time?

SOLUTION: Clearly αT need not be a stopping time if α < 1. If α ≥ 1 then, for any t ≥ 0, t/α ≤ t so {αT ≤ t} = {T ≤ t/α} ∈ F_{t/α} ⊆ F_t and αT is a stopping time.

Let H be a stopping time for which H/2 is not a stopping time (e.g. the first hitting time of Brownian motion at the level 1). Take T = 3H/2 and S = H, both stopping times, and note that T − S = H/2 is not a stopping time. □

1.1.6 Definition. Let T be a stopping time. The σ-algebras of events before time T and events strictly before time T are
(i) F_T = {A ∈ F : A ∩ {T ≤ t} ∈ F_t for all t};
(ii) F_{T−} = F_0 ∨ σ({A ∩ {t < T} : t ∈ [0, ∞), A ∈ F_t}).

1.1.7 Definition.
(i) A stochastic process X is a collection of R^d-valued r.v.'s, (X_t)_{0≤t<∞}. A stochastic process may also be thought of as a function X : Ω × [0, ∞) → R^d or as a random element of a space of paths.
(ii) X is adapted if X_t ∈ F_t for all t.
(iii) X is càdlàg if (X_t(ω))_{t≥0} has left limits and is right continuous for almost all ω. X is càglàd if instead the paths have right limits and are left continuous.
(iv) X is a modification of Y if P[X_t ≠ Y_t] = 0 for all t. X is indistinguishable from Y if P[X_t ≠ Y_t for some t] = 0.
(v) X_− := (X_{t−})_{t≥0} where X_{0−} := 0 and X_{t−} := lim_{s↑t} X_s.
(vi) For a càdlàg process X, ΔX := (ΔX_t)_{t≥0} is the process of jumps of X, where ΔX_t := X_t − X_{t−}.

1.1.8 Theorem.
(i) If X is a modification of Y and X and Y are left- (right-) continuous then they are indistinguishable.
(ii) If Λ ⊆ R^d is open and X is continuous from the right (càd) and adapted then T := inf{t > 0 : X_t ∈ Λ} is a stopping time.
(iii) If Λ ⊆ R^d is closed and X is càd and adapted then T := inf{t > 0 : X_t ∈ Λ or X_{t−} ∈ Λ} is a stopping time.
(iv) If X is càdlàg and adapted and ΔX_T 1_{T<∞} = 0 for all stopping times T then ΔX is indistinguishable from the zero process.

PROOF: Read the proofs of these facts as an exercise.


1.1.9 Definition. O := σ(X : X is adapted and càdlàg) is the optional σ-algebra. A stochastic process X is an optional process if X is O-measurable.

1.1.10 Theorem (Début theorem). If A ∈ O then T(ω) := inf{t : (ω, t) ∈ A}, the début time of A, is a stopping time.

Remark. This theorem requires that the filtration is right continuous. For example, suppose that T is the hitting time of an open set by a left continuous process. Then you can prove {T < t} ∈ F_t for all t without using right continuity of the filtration, but you cannot necessarily prove that {T ≤ t} ∈ F_t without it. You need to "look into the future" a little bit.

1.1.11 Corollary. If X is optional and B ⊆ R^d is a Borel set then the hitting time T := inf{t > 0 : X_t ∈ B} is a stopping time.

1.1.12 Theorem. If X is an optional process then
(i) X is (F ⊗ B([0, ∞)))-measurable and
(ii) X_T 1_{T<∞} ∈ F_T for any stopping time T.
In particular, (X_{t∧T})_{t≥0} is also an optional process, i.e. optional processes are "stable under stopping".

1.1.13 Theorem (Monotone class theorem). Suppose that H is a collection of bounded R-valued functions such that
(i) H is a vector space.
(ii) 1_Ω ∈ H, i.e. constant functions are in H.
(iii) If (f_n)_{n≥0} ⊆ H is monotone increasing and f := lim_{n→∞} f_n (pointwise) is bounded then f ∈ H.
(In this case H is called a monotone vector space.) Let M be a multiplicative collection of bounded functions (i.e. if f, g ∈ M then fg ∈ M). If M ⊆ H then H contains all the bounded functions that are measurable with respect to σ(M).

PROOF (OF THEOREM 1.1.12): Use the Monotone Class Theorem. Define

M := {X : Ω × [0, ∞) → R | X is càdlàg, adapted, and bounded}
H := {X : Ω × [0, ∞) → R | X is bounded and (i) and (ii) hold}

It can be checked that H is a monotone vector space, M is a multiplicative collection, and σ(M) = O. If we prove that M ⊆ H then we are done. Let X ∈ M and define

X^{(n)} := Σ_{k=1}^∞ X_{k/2^n} 1_{[(k−1)/2^n, k/2^n)}.

Since X is right continuous, X^{(n)} → X pointwise. Let B be a Borel set. Then

{X^{(n)} ∈ B} = ⋃_{k=1}^∞ ({X_{k/2^n} ∈ B} × [(k−1)/2^n, k/2^n)) ∈ F ⊗ B([0, ∞)).


This proves that X satisfies (i). To prove (ii), let T be a stopping time and define

T_n := { k/2^n if (k−1)/2^n ≤ T < k/2^n;  ∞ if T = ∞. }

Then the T_n's are stopping times and T_n ↓ T. Moreover,

{X_{T_n} ∈ B} ∩ {T_n ≤ t} = ⋃_{k≥1, k/2^n ≤ t} ({X_{k/2^n} ∈ B} ∩ {T_n = k/2^n}) ∈ F_t.

This shows that X_{T_n} ∈ F_{T_n}. Since X is right continuous,

X_T 1_{T<∞} = lim_{n→∞} X_{T_n} 1_{T_n<∞}.

It can be shown that, since the filtration is right continuous, ⋂_{n=1}^∞ F_{T_n} = F_T, so X_T 1_{T<∞} ∈ F_T and X satisfies (ii). Therefore X ∈ H.

If X is càdlàg and adapted then an argument similar to that in the previous proof shows that X|_{Ω×[0,t]} ∈ F_t ⊗ B([0, t]) for all t, i.e. X is progressively measurable. By a similar monotone class argument, it can be shown that every optional process is a progressive process.

1.1.14 Definition. V := σ(X : X is progressively measurable) is the progressive σ-algebra.

1.1.15 Corollary. O ⊆ V .

1.2 Review of martingales

1.2.1 Theorem. Let X be a (sub-, super-) martingale and assume that F satisfies the usual conditions. Then X has a right continuous modification if and only if the function t ↦ E[X_t] is right continuous. Furthermore, this modification has left limits everywhere.

PROOF (SKETCH): The process X̃_t := lim_{s↓t, s∈Q} X_s is the correct modification.

1.2.2 Corollary. Every martingale has a càdlàg modification, unique up to indistinguishability.

1.2.3 Theorem. Let X be a right continuous sub-martingale with sup_{t≥0} E[X_t^+] < ∞. Then X_∞ := lim_{t→∞} X_t exists a.s. and X_∞ ∈ L¹.

1.2.4 Definition. A collection of random variables (U_α)_{α∈A} is uniformly integrable or u.i. if

lim_{n→∞} sup_{α∈A} E[|U_α| 1_{|U_α|>n}] = 0.


1.2.5 Theorem. The following are equivalent for a family (U_α)_{α∈A}.
(i) (U_α)_{α∈A} is u.i.
(ii) sup_{α∈A} E[|U_α|] < ∞ and for all ε > 0 there is δ > 0 such that if P[Λ] < δ then sup_{α∈A} E[1_Λ |U_α|] < ε.
(iii) (de la Vallée-Poussin criterion) There is a positive, increasing, convex function G on [0, ∞) such that lim_{x→∞} G(x)/x = ∞ and sup_{α∈A} E[G(|U_α|)] < ∞.

1.2.6 Theorem. The following are equivalent for a martingale X.
(i) (X_t)_{t≥0} is u.i.
(ii) X is closable, i.e. there is an integrable r.v. Z such that X_t = E[Z | F_t].
(iii) X converges in L¹.
(iv) X_∞ = lim_{t→∞} X_t a.s. and in L¹ and X_∞ closes X.

Remark.
(i) If X is u.i. then X is bounded in L¹ (but not vice versa), so Theorem 1.2.3 implies that X_t → X_∞ a.s. Theorem 1.2.6 upgrades this to convergence in L¹, i.e. E[|X_t − X_∞|] → 0.
(ii) If X is closed by Z then E[Z | F_∞] also closes X, where F_∞ := ⋁_{t≥0} F_t. Furthermore, X_∞ = E[Z | F_∞].

1.2.7 Example (Simple random walk). Let (Z_n)_{n≥1} be i.i.d. with

P[Z_n = 1] = P[Z_n = −1] = 1/2.

Take F_t := σ(Z_k : k ≤ t) and X_t := Σ_{k=1}^{⌊t⌋} Z_k. Then X is not a closable martingale.

1.2.8 Example. Let M_t := exp(B_t − t/2), the stochastic exponential of Brownian motion, a martingale. Then M_t → 0 a.s. by Theorem 1.2.3 because it is a positive valued martingale (hence −M is a sub-martingale with no positive part). But E[|M_t|] = 1 for all t so M is not u.i. by Theorem 1.2.6.

1.2.9 Theorem.
(i) If X is a closable (sub-, super-) martingale and S ≤ T are stopping times then E[X_T | F_S] = X_S (≥, ≤).
(ii) If X is a (sub-, super-) martingale and S ≤ T are bounded stopping times then E[X_T | F_S] = X_S (≥, ≤).
(iii) If X is a right continuous sub-martingale and p > 1 then

‖sup_{t≥0} X_t‖_{L^p} ≤ (p/(p−1)) sup_{t≥0} ‖X_t‖_{L^p}.

In particular if p = 2 then E[sup_{t≥0} X_t²] ≤ 4 sup_{t≥0} E[X_t²].
(iv) (Jensen's inequality) If φ is a convex function, Z is an integrable r.v., and G is a sub-σ-algebra of F then φ(E[Z | G]) ≤ E[φ(Z) | G].


1.2.10 Definition. Let X be a process and T be a random time. The stopped process is

X_t^T := X_t 1_{t≤T} + X_T 1_{t>T, T<∞} = X_{T∧t}.

If T is a stopping time and X is càdlàg and adapted then so is X^T.

1.2.11 Theorem.
(i) If X is a u.i. martingale and T is a stopping time then X^T is a u.i. martingale.
(ii) If X is a martingale and T is a stopping time then X^T is a martingale.
That is, martingales are "stable under stopping".

1.2.12 Definition. Let X be a martingale.
(i) If X_t ∈ L² for all t then X is called a square integrable martingale.
(ii) If (X_t)_{t∈[0,∞)} is u.i. and X_∞ ∈ L² then X is called an L²-martingale.

1.2.13 Exercise. Any L²-martingale is a square integrable martingale, but not conversely.

SOLUTION: Let X be an L²-martingale. Then X_∞ ∈ L² and so, by the conditional version of Jensen's inequality,

E[X_t²] = E[(E[X_∞ | F_t])²] ≤ E[E[X_∞² | F_t]] = E[X_∞²] < ∞.

Therefore X_t ∈ L² for all t. We have already seen that the stochastic exponential of Brownian motion is not u.i. (and hence not an L²-martingale) but it is square integrable because the normal distribution has a finite valued moment generating function. □

1.3 Poisson process and Brownian motion

1.3.1 Definition. Suppose that (T_n)_{n≥1} is a strictly increasing sequence of random times with T_1 > 0 a.s. The process N_t := Σ_{n≥1} 1_{T_n≤t} is the counting process associated with (T_n)_{n≥1}. The random time T := sup_{n≥1} T_n is the explosion time. If T = ∞ a.s. then N is a counting process without explosion.

1.3.2 Theorem. A counting process is an adapted process if and only if T_n is a stopping time for all n.

PROOF: If N is adapted then {t < T_n} = {N_t < n} ∈ F_t for all t and all n. Therefore, for all n, {T_n ≤ t} ∈ F_t for all t, so T_n is a stopping time. Conversely, if all the T_n are stopping times then, for all t, {N_t ≥ n} = {T_n ≤ t} ∈ F_t for all n. Since N takes only integer values this implies that N_t ∈ F_t.

1.3.3 Definition. An adapted process N is called a Poisson process if
(0) N is a counting process.
(i) N_t − N_s is independent of F_s for all 0 ≤ s < t < ∞ (independent increments).
(ii) N_t − N_s (d)= N_{t−s} for all 0 ≤ s < t < ∞ (stationary increments).

Remark. Implicit in this definition is that a Poisson process does not explode. The definition can be modified slightly to allow this possibility, and then it can be proved as a theorem that a Poisson process does not explode, but the details are very technical and relatively unenlightening.

1.3.4 Theorem. Suppose that N is a Poisson process.
(i) N is continuous in probability.
(ii) N_t (d)= Poisson(λt) for some λ ≥ 0, called the intensity or arrival rate of N. In particular, N_t has finite moments of all orders for all t.
(iii) (N_t − λt)_{t≥0} and ((N_t − λt)² − λt)_{t≥0} are martingales.
(iv) If F_t^N := σ(N_s : s ≤ t) and F^N = (F_t^N)_{t≥0} then F^N is right continuous.

PROOF: Let α(t) := P[N_t = 0] for t ≥ 0. For all s, t,

α(t + s) = P[N_{t+s} = 0]
= P[{N_s = 0} ∩ {N_{t+s} − N_s = 0}]  (N is non-decreasing and non-negative)
= P[N_s = 0] P[N_{t+s} − N_s = 0]  (independent increments)
= P[N_s = 0] P[N_t = 0]  (stationary increments)
= α(t)α(s).

If t_n ↓ t then {N_{t_n} = 0} ↑ {N_t = 0}, so α is right continuous and non-increasing. It follows that either α ≡ 0 or α(t) = e^{−λt} for some λ ≥ 0. By the definition of counting process N_0 = 0, so α cannot be the zero function.

(i) Observe that, given ε > 0 small, for all s < t,

P[|N_t − N_s| > ε] = P[|N_{t−s}| > ε]  (stationary increments)
= P[N_{t−s} > ε]  (N is non-decreasing)
= 1 − P[N_{t−s} = 0]  (N is integer valued)
= 1 − e^{−λ(t−s)} → 0 as s → t.

Therefore N is left continuous in probability. The proof of continuity from the right is similar.

(ii) First we need to prove that lim_{t→0} (1/t) P[N_t = 1] = λ. Towards this, let β(t) := P[N_t ≥ 2] for t ≥ 0. If we can show that lim_{t→0} β(t)/t = 0 then we would have

lim_{t→0} P[N_t = 1]/t = lim_{t→0} (1 − α(t) − β(t))/t = λ.

It is enough to prove that lim_{n→∞} n β(1/n) = 0. Divide [0, 1] into n equal subintervals and let S_n be the number of subintervals with at least two arrivals. It can be seen that S_n (d)= Binomial(n, β(1/n)) because of the stationary and independent increments of N. In the definition of counting process the sequence of jump times is strictly increasing, so lim_{n→∞} S_n = 0 a.s. Clearly S_n ≤ N_1, so lim_{n→∞} n β(1/n) = lim_{n→∞} E[S_n] = 0 provided that E[N_1] < ∞. We are going to gloss over this point.

Let φ(t) := E[γ^{N_t}] for 0 < γ < 1. Then

φ(t + s) = E[γ^{N_{t+s}}] = E[γ^{N_s} γ^{N_{t+s}−N_s}] = E[γ^{N_s}] E[γ^{N_t}] = φ(t)φ(s)

and φ is right continuous because N has right continuous paths. Therefore φ(t) = e^{tψ(γ)} for some function ψ of γ. By definition,

φ(t) = Σ_{n≥0} γ^n P[N_t = n], i.e.
e^{tψ(γ)} = α(t) + γ P[N_t = 1] + Σ_{n≥2} γ^n P[N_t = n].

Differentiating, ψ(γ) = lim_{t→0} (φ(t) − 1)/t = −λ + γλ by the computations above. Comparing coefficients of γ^n in the power series shows that N_t has a Poisson distribution with rate λt, i.e. P[N_t = n] = e^{−λt}(λt)^n/n!.

(iii) Exercise.

(iv) See the textbook.
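Aside (simulation sketch, not from the lecture; λ = 2, t = 3, the sample size, and the cap of 40 arrivals are illustrative choices). Building N from i.i.d. Exponential(λ) interarrival times, the mean and variance of N_t both match λt, so the compensated process N_t − λt is centered, consistent with (ii) and (iii).

import numpy as np

rng = np.random.default_rng(1)
lam, t, n = 2.0, 3.0, 200_000

# Interarrival times are Exponential(lam); T_n are their partial sums and
# N_t counts how many arrival times fall in [0, t].
gaps = rng.exponential(1 / lam, size=(n, 40))   # 40 arrivals amply cover t = 3
arrivals = np.cumsum(gaps, axis=1)
Nt = (arrivals <= t).sum(axis=1)

print("E[N_t]   ~", Nt.mean(), "  (lambda * t =", lam * t, ")")
print("Var[N_t] ~", Nt.var(), "  (lambda * t =", lam * t, ")")
print("E[N_t - lambda*t] ~", (Nt - lam * t).mean())   # compensated mean ~ 0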

1.3.5 Definition. An adapted process B is called a Brownian motion if
(i) B_t − B_s is independent of F_s for all 0 ≤ s < t < ∞ (independent increments).
(ii) B_t − B_s (d)= Normal(0, t − s) for all 0 ≤ s < t < ∞ (stationary, normally distributed, increments).
If B_0 ≡ 0 then B is a standard Brownian motion.

1.3.6 Theorem. Let B be a Brownian motion.
(i) If E[|B_0|] < ∞ then B is a martingale.
(ii) There is a modification of B with continuous paths.
(iii) (B_t² − t)_{t≥0} is a martingale when B is standard BM.
(iv) Let Π_n be a refining sequence of partitions of the interval [a, a + t] with lim_{n→∞} mesh(Π_n) = 0. Then Σ_{Π_n} (B_{t_{i+1}} − B_{t_i})² → t in L² and a.s. when B is standard BM.
(v) For almost all ω, the function t ↦ B_t(ω) is of unbounded variation.

Remark. To prove (iv) with the additional assumption that Σ_{n≥0} mesh(Π_n) < ∞, but dropping the assumption that the partitions are refining, you can use the Borel-Cantelli lemma. The proof of (iv) as stated uses the backwards martingale convergence theorem.


1.4 Lévy processes

1.4.1 Definition. An adapted, real-valued process X is called a Lévy process if
(0) X_0 = 0.
(i) X_t − X_s is independent of F_s for all 0 ≤ s < t < ∞ (independent increments).
(ii) X_t − X_s (d)= X_{t−s} for all 0 ≤ s < t < ∞ (stationary increments).
(iii) (X_t)_{t≥0} is continuous in probability.
If only (i) and (iii) hold then X is an additive process. If (X_t)_{t≥0} is non-decreasing then X is called a subordinator.

Remark. If the filtration is not specified then we assume that F = F^X is the natural (not necessarily complete) filtration of X. In this case X might then be called an intrinsic Lévy process.

1.4.2 Example. The Poisson process and Brownian motion are both Lévy processes. Let W be a standard BM and define T_b := inf{t > 0 : W_t ≥ b}. Then (T_b)_{b≥0} is an intrinsic Lévy process and subordinator for the filtration (F_{T_b})_{b≥0}. Indeed, if 0 ≤ a < b < ∞ then T_b − T_a is independent of F_{T_a} and distributed as T_{b−a} by the strong Markov property of W. We will see that T is a stable process with parameter α = 1/2.

1.4.3 Theorem. Let X be a Lévy process. Then f_t(z) := E[e^{izX_t}] = e^{−tΨ(z)} for some continuous function Ψ = Ψ_X. Furthermore, M_t^z := e^{izX_t}/f_t(z) is a martingale for all z ∈ R.

PROOF: Fix z ∈ R. By the stationarity and independence of the increments,

f_t(z) = f_{t−s}(z) f_s(z) for all 0 ≤ s < t < ∞. (1.1)

We would like to show that t ↦ f_t(z) is right continuous. By the multiplicative property (1.1), it suffices to show t ↦ f_t(z) is right continuous at zero. Suppose that t_n ↓ 0; we need to show that f_{t_n}(z) → 1. By definition, |f_{t_n}(z)| ≤ 1, so we are done if we can show that every convergent subsequence converges to 1. Suppose without loss of generality that f_{t_n}(z) → a ∈ C. Since X is continuous in probability, e^{izX_{t_n}} → 1 in probability. Along a subsequence we have convergence almost surely, so f_{t_{n_k}}(z) → 1 by dominated convergence. Therefore a = 1 and we are done.

The multiplicative property and right continuity imply that f_t(z) = e^{−tΨ(z)} for some number Ψ(z). Since f_t(z) is a characteristic function, z ↦ f_t(z) is continuous (use dominated convergence). Therefore Ψ must be continuous as well. In particular, f_t(z) ≠ 0 for all t and all z. Let 0 ≤ s < t < ∞.

E[e^{izX_t}/f_t(z) | F_s] = (e^{izX_s}/f_t(z)) E[e^{iz(X_t−X_s)} | F_s]  (independent increments)
= (e^{izX_s}/f_t(z)) f_{t−s}(z)  (stationary increments)
= e^{izX_s}/f_s(z)  (f is multiplicative).

Therefore M_t^z is a martingale.

1.4.4 Theorem. If X is an additive process then X is Markov with transition function P_{s,t}(x, B) = P[X_t − X_s ∈ B − x]. In particular, X is spatially homogeneous, i.e. P_{s,t}(x, B) = P_{s,t}(0, B − x).

1.4.5 Corollary. If X is additive, ε > 0, and T < ∞ then lim_{u↓0} α_{ε,T}(u) = 0, where α_{ε,T} is defined in the next theorem.

PROOF: P_{s,t}(x, B_ε(x)^C) = P[|X_t − X_s| ≥ ε] → 0 as t − s → 0, uniformly for s, t ∈ [0, T]. It is an exercise to show that continuity in probability of X implies uniform continuity in probability on compact intervals.

1.4.6 Theorem. Let X be a Markov process on R^d with transition function P_{s,t}. If lim_{u↓0} α_{ε,T}(u) = 0 for all ε > 0 and T > 0, where

α_{ε,T}(u) = sup{P_{s,t}(x, B_ε(x)^C) : x ∈ R^d, s, t ∈ [0, T], 0 ≤ t − s ≤ u},

then X has a càdlàg modification. If furthermore lim_{u↓0} α_{ε,T}(u)/u = 0 then X has a continuous modification.

PROOF: See Lévy processes and infinitely divisible distributions by Ken-iti Sato. The important steps are as follows.

Fix ε > 0 and ω ∈ Ω. Say that X(ω) has an ε-oscillation n times in M ⊆ [0, ∞) if there are t_0 < t_1 < · · · < t_n all in M such that |X_{t_j}(ω) − X_{t_{j−1}}(ω)| ≥ ε for all j. X has ε-oscillation infinitely often in M if this holds for all n. Define

Ω′_2 := ⋂_{N=1}^∞ ⋂_{k=1}^∞ {ω : X(ω) does not have 1/k-oscillation infinitely often in [0, N] ∩ Q}.

It can be shown that Ω′_2 ∈ F, and also that

Ω′_2 ⊆ {ω : lim_{s↓t, s∈Q} X_s exists for all t and lim_{s↑t, s∈Q} X_s exists for all t > 0}.

The hard part is to show that P[Ω′_2] = 1.

Remark.
(i) If X is a Feller process then X has a càdlàg modification (see Revuz-Yor).
(ii) From now on we assume that we are working with a càdlàg version of any Lévy process that appears.
(iii) P[X_t ≠ X_{t−}] = 0 for any process X that is continuous in probability. Of course, this does not mean that X doesn't jump; it means that X has no fixed times of discontinuity.


1.4.7 Theorem. Let X be a Lévy process and N be the collection of P-null sets. If G_t := F_t^X ∨ N for all t then G = (G_t)_{t≥0} is right continuous.

PROOF: Let (s_1, . . . , s_n) and (u_1, . . . , u_n) be arbitrary vectors in (R_+)^n and R^n respectively. We first want to show that

E[e^{i(u_1 X_{s_1} + ··· + u_n X_{s_n})} | G_t] = E[e^{i(u_1 X_{s_1} + ··· + u_n X_{s_n})} | G_{t+}].

It is clear that it suffices to show this with s_1, . . . , s_n > t. We take n = 2 for notational simplicity, and assume that s_2 ≥ s_1 > t. Then

E[e^{i(u_1 X_{s_1} + u_2 X_{s_2})} | G_{t+}]
= lim_{w↓t} E[e^{i(u_1 X_{s_1} + u_2 X_{s_2})} | G_w]  (exercise)
= lim_{w↓t} E[e^{iu_1 X_{s_1}} M_{s_2}^{u_2} | G_w] f_{s_2}(u_2)
= lim_{w↓t} E[e^{iu_1 X_{s_1}} M_{s_1}^{u_2} | G_w] f_{s_2}(u_2)  (tower property)
= lim_{w↓t} E[e^{i(u_1+u_2)X_{s_1}} | G_w] f_{s_2−s_1}(u_2)  (f is multiplicative)
= lim_{w↓t} (e^{i(u_1+u_2)X_w}/f_w(u_1+u_2)) f_{s_1}(u_1+u_2) f_{s_2−s_1}(u_2)  (M is a martingale)
= e^{i(u_1+u_2)X_t} f_{s_1−t}(u_1+u_2) f_{s_2−s_1}(u_2)  (X is càdlàg)

By the same steps, we obtain

E[e^{i(u_1 X_{s_1} + u_2 X_{s_2})} | G_t] = e^{i(u_1+u_2)X_t} f_{s_1−t}(u_1+u_2) f_{s_2−s_1}(u_2).

By a monotone class argument, E[Z | G_t] = E[Z | G_{t+}] for any bounded Z ∈ F_∞^X. It follows that G_{t+} \ G_t consists only of sets from N. Since N ⊆ G_t, they must be equal.

1.4.8 Corollary (Blumenthal 0-1 Law). Let X be a Lévy process. If A ∈ ⋂_{t>0} F_t^X then P[A] = 0 or 1.

1.4.9 Theorem. If X is a Lévy process and T is a stopping time then, on {T < ∞}, Y_t := X_{T+t} − X_T is a Lévy process with respect to (F_{t+T})_{t≥0} and Y has the same finite dimensional distributions as X.

1.4.10 Corollary. If X is a Lévy process then X is a strong Markov process.

1.4.11 Theorem. If X is a Lévy process with bounded jumps then E[|X_t|^n] < ∞ for all t and all n.

PROOF: Suppose that sup_t |ΔX_t| ≤ C. Define a sequence of random times T_n recursively as follows: T_0 ≡ 0 and

T_{n+1} := { inf{t > T_n : |X_t − X_{T_n}| ≥ C} if T_n < ∞;  ∞ if T_n = ∞. }


By definition of the T_n, sup_s |X_s^{T_n}| ≤ 2nC. The 2 comes from the possibility that X could jump by C just before hitting the stopping level. Since X is càdlàg, the T_n are all stopping times. Further, since X is a strong Markov process, on {T_n < ∞},
(i) T_n − T_{n−1} is independent of F_{T_{n−1}} and
(ii) T_n − T_{n−1} (d)= T_1.
Therefore

E[e^{−T_n}] = E[∏_{k=1}^n e^{−(T_k − T_{k−1})}] ≤ (E[e^{−T_1}])^n =: α^n,

where 0 ≤ α < 1. The ≤ comes from the fact that e^{−∞} = 0 and some of the T_n may be ∞. (We interpret T_k − T_{k−1} to be ∞ if both of them are ∞.) By Chebyshev's inequality,

P[|X_t| > 2nC] ≤ P[T_n < t] ≤ E[e^{−T_n}]/e^{−t} ≤ α^n e^t.

Finally,

E[e^{β|X_t|}] ≤ 1 + Σ_{n=0}^∞ E[e^{β|X_t|} 1_{2nC < |X_t| ≤ 2(n+1)C}]
≤ 1 + Σ_{n=0}^∞ e^{2β(n+1)C} P[|X_t| > 2nC]
≤ 1 + e^{2βC} e^t Σ_{n=0}^∞ (α e^{2βC})^n.

Choosing an appropriately small, positive β shows that |X_t| has an exponential moment, so it has polynomial moments of all orders.

1.5 Lévy measures

Suppose that Λ ∈ B(R) and 0 ∉ Λ̄. Let X be a Lévy process (with càdlàg paths, as always) and define inductively a sequence of random times T_Λ^0 ≡ 0 and

T_Λ^{n+1} := inf{t > T_Λ^n : ΔX_t ∈ Λ}.

These times have the following properties.
(i) {T_Λ^n ≥ t} ∈ F_{t+} = F_t by the usual conditions, and (T_Λ^n)_{n≥1} is an increasing sequence of (possibly ∞-valued) stopping times.
(ii) T_Λ^1 > 0 a.s. since X_0 ≡ 0, X has càdlàg paths, and 0 ∉ Λ̄.
(iii) lim_{n→∞} T_Λ^n = ∞ since X has càdlàg paths and 0 ∉ Λ̄ (see Homework 1, problem 10).

Let N_t^Λ be the number of jumps with size in Λ before time t:

N_t^Λ := Σ_{0<s≤t} 1_Λ(ΔX_s) = Σ_{n=1}^∞ 1_{T_Λ^n ≤ t}.

Since X is a strong Markov process, N^Λ has stationary and independent increments. It is also a counting process with no explosions, so it is a Poisson process. By Theorem 1.1.12(i) we can define ν(Λ) := E[N_1^Λ], the intensity of N^Λ. We can extend this definition to the case when 0 ∈ Λ̄, so long as 0 ∉ Λ, by taking ν(Λ) = ∞ whenever Σ_{0<s≤t} 1_Λ(ΔX_s) fails to converge. Define ν({0}) = 0.

1.5.1 Theorem. For each t and ω the map Λ ↦ N_t^Λ(ω) is a Borel counting measure on R \ {0}, and ν(Λ) := E[N_1^Λ] is also a Borel measure on R \ {0}.

1.5.2 Definition. ν is the Lévy measure associated with X .

1.5.3 Example. If X is a Poisson process of rate λ then ν = λδ1.

1.5.4 Theorem. If X is a Lévy process then it can be decomposed as X = Y + Z where Y is a Lévy process and martingale with bounded jumps (so Y_t ∈ ⋂_{p≥1} L^p for all t) and Z is a Lévy process with paths of finite variation on compact subsets of [0, ∞).

1.5.5 Theorem. Let X be a Lévy process with jumps bounded by C. The process Z_t := X_t − E[X_t] is a martingale that can be decomposed as Z = Z^c + Z^d, where Z^c and Z^d are independent Lévy processes. Moreover,

Z_t^d := ∫_{|x|≤C} x (N(t, dx) − tν(dx)),

where N(t, dx) is the measure of Theorem 1.5.1, and the remainder Z^c has continuous paths.

1.5.6 Lemma. If X is a subordinator with continuous paths then X_t = ct for some constant c.

PROOF: X′(0) := lim_{ε↓0} X_ε/ε exists a.s., and since X′(0) ∈ F_0, it is a constant c a.s. Let δ > 0 and inductively define a sequence of stopping times by T_0 = 0 and

T_{k+1} := inf{t > T_k : X_t − X_{T_k} ≥ (c + δ)(t − T_k)}.

Then, since X is continuous, X_{T_1} = (c + δ)T_1 and X_t ≤ (c + δ)t for t ≤ T_1. Inductively, X_t ≤ (c + δ)t for t ≤ T_k. Because X is a Lévy process, (T_{k+1} − T_k)_{k≥0} are strictly positive i.i.d. random variables. By the strong law of large numbers T_k → ∞ as k → ∞, so X_t ≤ (c + δ)t for all t. Letting δ ↓ 0, X_t ≤ ct for all t, and the proof that X_t ≥ ct for all t is similar.

1.5.7 Theorem (Lévy-Itô decomposition). If X is a Lévy process then there are constants σ and α such that

X_t (d)= σB_t + αt + ∫_{|x|<1} x (N(t, dx) − tν(dx)) + Σ_{0<s≤t} ΔX_s 1_{|ΔX_s|≥1}

where B is a standard Brownian motion and α = E[X_1 − ∫_{|x|≥1} x N(1, dx)].


Remark. X has finite variation on compacts if either
(i) σ = 0 and ν(R) < ∞, or
(ii) σ = 0 and ν(R) = ∞ but ∫_{|x|<1} |x| ν(dx) < ∞.
If σ ≠ 0 or ∫_{|x|<1} |x| ν(dx) = ∞ then X has infinite variation on compacts. To prove that the continuous part of X is σB we will need Lévy's characterization of BM, proven later.

1.5.8 Lemma. Let X be a Lévy process, Λ ⊆ R \ {0} be Borel with 0 ∉ Λ̄, and let f be Borel measurable and finite valued on Λ.
(i) The process Y defined by

Y_t := ∫_Λ f(x) N(t, dx) = Σ_{0<s≤t} f(ΔX_s) 1_{ΔX_s∈Λ}

is a Lévy process. (Show as an exercise that Y is continuous in probability.)
(ii) Let M_t := X_t − J_t^Λ where J^Λ is defined by

J_t^Λ := ∫_Λ x N(t, dx) = Σ_{0<s≤t} ΔX_s 1_{ΔX_s∈Λ}.

Then M is a Lévy process with jumps outside of Λ. In particular, if we take Λ = {x : |x| ≥ 1} then M is a Lévy process with bounded jumps.
(iii) If f 1_Λ ∈ L¹(ν) then E[∫_Λ f(x) N(t, dx)] = t ∫_Λ f(x) ν(dx). If f 1_Λ ∈ L²(ν) then E[(∫_Λ f(x) N(t, dx) − t ∫_Λ f(x) ν(dx))²] = t ∫_Λ f(x)² ν(dx).
(iv) If Λ_1 ∩ Λ_2 = ∅ then the process (Σ_{0<s≤t} ΔX_s 1_{Λ_1}(ΔX_s))_{t≥0} is independent of the process (Σ_{0<s≤t} ΔX_s 1_{Λ_2}(ΔX_s))_{t≥0}.

PROOF (OF THEOREM 1.5.4): Let J_t := Σ_{0<s≤t} ΔX_s 1_{|ΔX_s|≥1}. J has finite variation on compacts since it has only finitely many jumps in any bounded interval. By Lemma 1.5.8(ii), X − J is a Lévy process with bounded jumps. It can be shown (as an exercise) that E[X_t − J_t] = αt, where α := E[X_1 − J_1]. Then Y_t := X_t − J_t − αt is a martingale and Z_t := J_t + αt is a Lévy process with paths of finite variation on compacts:

X_t = (X_t − J_t − αt) + (J_t + αt) = Y_t + Z_t.

PROOF (OF THEOREM 1.5.5): That Z_t := X_t − E[X_t] is a martingale is an exercise. Suppose without loss of generality that the jumps of X are bounded by 1. Define Λ_k := {1/(k+1) < |x| ≤ 1/k} and M_t^{Λ_k} := ∫_{Λ_k} x N(t, dx) − t ∫_{Λ_k} x ν(dx). Then the M^{Λ_k} are martingales by Lemma 1.5.8(ii), are in L² by Lemma 1.5.8(iii), and are independent by Lemma 1.5.8(iv). Let M^n := Σ_{k=1}^n M^{Λ_k}. Then

Var(M_t^n) = Σ_{k=1}^n Var(M_t^{Λ_k}) = Σ_{k=1}^n t ∫_{Λ_k} x² ν(dx) = t ∫_{1/(n+1)<|x|≤1} x² ν(dx).

Prove as an exercise that Z − M^n is independent of M^n (it is not an immediate consequence of the lemma). Therefore Var(M_t^n) = Var(Z_t) − Var(Z_t − M_t^n) < ∞ since Var(Z_t) < ∞ because Z has bounded jumps.

Since M^n is a sum of independent random variables we can take the limit as n → ∞ in L² (fill in the details as an exercise). Let Z^c := lim_{n→∞}(Z − M^n) and Z^d := lim_{n→∞} M^n. Then Z^c is independent of Z^d since Z − M^n is independent of M^n for all n. The formula for Z^d is immediate.

To show Z^c is continuous, note first that Z − M^n has jumps bounded by 1/(n+1). (Something is missing from the last part of this proof.) By Doob's maximal inequality sup_{0<s≤t} |Z_s − M_s^n − Z_s^c| converges to 0 in L². Along a subsequence Z − M^n → Z^c uniformly on compacts, so Z^c is continuous.

1.5.9 Theorem (Lévy-Khintchine formula). Let X be a Lévy process with characteristic function E[e^{izX_t}] = e^{−tΨ(z)}. Then the Lévy-Khintchine exponent is

Ψ(z) = σ²z²/2 − iαz + ∫_R (1 − e^{izx} + izx 1_{|x|<1}) ν(dx).

PROOF (SKETCH): In the Lévy-Itô decomposition it can be shown that the parts are all independent, so the characteristic functions multiply. Show as an exercise that the characteristic function of Σ_{0<s≤t} ΔX_s 1_{|ΔX_s|≥1} is

e^{t ∫_{|x|≥1} (e^{izx} − 1) ν(dx)}

and the characteristic function of ∫_{|x|<1} x (N(t, dx) − tν(dx)) is

e^{−t ∫_{|x|<1} (1 − e^{izx} + izx) ν(dx)}.

1.5.10 Definition. A probability distribution F on R is an infinitely divisible distribution if for all n there are X_1, . . . , X_n i.i.d. such that X_1 + · · · + X_n (d)= F.

1.5.11 Theorem.
(i) If X is a Lévy process then, for all t, X_t has an infinitely divisible distribution.
(ii) Conversely, if F is an infinitely divisible distribution then there is a Lévy process X such that X_1 (d)= F.
(iii) If F is an infinitely divisible distribution then its characteristic function has the form of the Lévy-Khintchine exponent with t = 1, where ν is a measure on R such that ν({0}) = 0 and ∫_R (|x|² ∧ 1) ν(dx) < ∞, and the representation of F in terms of (σ, α, ν) is unique.

1.5.12 Examples.
(i) If X is a Poisson process with rate λ then the characteristic function is e^{λt(e^{iz}−1)}, so σ = α = 0 and ν = λδ_1.


(ii) The Γ-process is the Lévy process such that X_1 has a Γ-distribution, where P[X_1 ∈ dx] = (b^c/Γ(c)) x^{c−1} e^{−bx} 1_{x>0} dx for some constants b, c > 0. The Lévy-Khintchine exponent satisfies

−Ψ(z) = c ∫_R (e^{izx} − 1) (e^{−bx}/x) 1_{x>0} dx.

In this case ν(dx) = c (e^{−bx}/x) 1_{x>0} dx, σ = 0, and α = (c/b)(1 − e^{−b}) > 0. The paths of X are non-decreasing since it has positive drift, no volatility, and positive jumps. Because ν((0, 1)) = ∞, X has infinite activity.

(iii) Stable processes are those with Lévy-Khintchine exponent of the form

izδ + m_1 ∫_0^∞ (e^{izx} − 1 − izx/(1 + x²)) x^{−(1+γ)} dx + m_2 ∫_{−∞}^0 (e^{izx} − 1 − izx/(1 + x²)) |x|^{−(1+γ)} dx

where 0 < γ < 2. When m_1 = m_2 one can take ν(dx) = |x|^{−(1+γ)} dx. In this case there are Y_1, Y_2, . . . i.i.d. and constants a_n and b_n such that

(1/a_n) Σ_{i=1}^n Y_i − b_n →(w) X_1

and (β^{−1/γ} X_{βt})_{t≥0} (d)= (X_t)_{t≥0} for all β > 0. If 1 ≤ γ < 2 then ∫_{|x|<1} |x| ν(dx) = ∞, so the process has infinite variation on compacts.

(iv) For the hitting time process (T_b)_{b≥0} of standard Brownian motion B, we have

P[T_b ∈ dt] = (b/√(2πt³)) e^{−b²/(2t)} 1_{t>0} dt,

so that

E[e^{izT_b}] = exp((b/√(2π)) ∫_0^∞ (e^{izx} − 1) x^{−3/2} dx).

It follows that T is a stable process with γ = 1/2. Indeed,

(1/β²) T_{βb} = (1/β²) inf{t : B_t = βb} = inf{t/β² : B_t = βb} = inf{t : B_{β²t}/β = b} (d)= T_b

since (B_{β²t}/β)_{t≥0} (d)= B. Also note that T is non-decreasing.
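Aside (simulation sketch, not from the lecture; the sample size and seed are arbitrary). By the reflection principle, T_b (d)= b²/Z² with Z ~ Normal(0, 1), which makes the distribution and the stable scaling T_{βb} (d)= β²T_b easy to check numerically.

import math
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
T1 = 1.0 / rng.normal(size=n) ** 2      # T_b for b = 1
T2 = 4.0 / rng.normal(size=n) ** 2      # T_b for b = 2

# P[T_1 <= t] should equal 2*(1 - Phi(1/sqrt(t))); check at t = 2.
t = 2.0
Phi = 0.5 * (1 + math.erf((1 / math.sqrt(t)) / math.sqrt(2)))
print("P[T_1 <= 2] ~", (T1 <= t).mean(), "  exact:", 2 * (1 - Phi))
# Scaling: T_2 (d)= 4 * T_1, so the medians should differ by a factor of 4.
print("median(T_2)/median(T_1) ~", np.median(T2) / np.median(T1))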


1.6 Localization

1.6.1 Definition.
(i) If C is a collection of processes then the localized class, C_loc, is the collection of processes X such that there exists a sequence (T_n)_{n≥1} of stopping times such that T_n ↑ ∞ a.s. and X^{T_n} ∈ C for all n. The sequence (T_n)_{n≥1} is called a localizing sequence for X relative to C.
(ii) If C is the collection of adapted, càdlàg, uniformly integrable martingales then an element of the localized class is called a local martingale.
(iii) A stopping time T reduces X relative to C if X^T ∈ C.
(iv) C is stable under stopping if X^T ∈ C for all X ∈ C and all stopping times T.

1.6.2 Theorem. Suppose that C is a collection of processes that is stable under stopping.
(i) C_loc is stable under stopping and (C_loc)_loc = C_loc. In particular, a local local martingale is a local martingale.
(ii) If T reduces X relative to C and S ≤ T a.s. then S reduces X.
(iii) If M and N are local martingales then M + N is a local martingale. If S and T reduce M then S ∨ T reduces M.
(iv) (C ∩ C′)_loc = C_loc ∩ C′_loc.

PROOF (OF (i)): Let X ∈ C_loc and T be a stopping time. If (T_n)_{n≥1} is a localizing sequence for X then (X^T)^{T_n} = (X^{T_n})^T ∈ C since C is stable under stopping. Therefore (T_n)_{n≥1} is a localizing sequence for X^T and it is seen that C_loc is stable under stopping.

Now let X ∈ (C_loc)_loc and (T_n) be a localizing sequence for X relative to C_loc, so that X^{T_n} ∈ C_loc for all n. Then for each n there are (T_{n,p})_{p≥1} such that T_{n,p} ↑ ∞ a.s. as p → ∞ and (X^{T_n})^{T_{n,p}} ∈ C for all p. For all n there is p_n such that

P[T_{n,p_n} < T_n ∧ n] ≤ 1/2^n

since a.s. convergence implies convergence in probability. Define a new sequence of stopping times S_n := T_n ∧ ⋀_{m≥n} T_{m,p_m}. Then, for all n, S_n ≤ S_{n+1} and

P[S_n < T_n ∧ n] ≤ Σ_{m=n}^∞ 1/2^m = 1/2^{n−1}.

By the Borel-Cantelli lemma, P[S_n < T_n ∧ n i.o.] = 0, which implies that S_n ↑ ∞ a.s. Finally, X^{S_n} = ((X^{T_n})^{T_{n,p_n}})^{S_n} ∈ C since C is stable under stopping, so (S_n)_{n≥1} is a localizing sequence for X relative to C.

1.6.3 Corollary.
(i) If X is a locally square integrable martingale then X is a local martingale (but the converse does not hold).
(ii) If M is a process and there are stopping times (T_n)_{n≥1} such that T_n ↑ ∞ a.s. and M^{T_n} is a martingale for all n then M is a local martingale.


We will see that it is important to be able to decide when a local martingale is a "true" martingale. To this end, let X_t^* := sup_{s≤t} |X_s| and X^* := sup_{s≥0} |X_s|.

1.6.4 Theorem. Let X be a local martingale.
(i) If E[X_t^*] < ∞ for all t then X is a martingale.
(ii) If E[X^*] < ∞ then X is a u.i. martingale.

PROOF: Let (T_n)_{n≥1} be a localizing sequence for X. For any s ≤ t,

E[X_{T_n∧t} | F_s] = X_{T_n∧s}

by the optional stopping theorem. Since X_{T_n∧t} is dominated by the integrable random variable X_t^*, we may apply the (conditional) dominated convergence theorem to both sides. Therefore X is a martingale. If E[X^*] < ∞ then the family (X_t)_{t≥0} is dominated by the integrable random variable X^*, so it is u.i.

1.6.5 Corollary.
(i) If X is a bounded local martingale then it is a u.i. martingale.
(ii) If X is a local martingale and a Lévy process then X is a martingale.
(iii) If (X_n)_{n≥1} is a discrete time local martingale and E[|X_n|] < ∞ for all n then X is a martingale.

1.6.6 Example. Suppose (A_n)_{n≥1} is a measurable partition of Ω with P[A_n] = 2^{−n} for all n. Further suppose (Z_n)_{n≥1} is a sequence of random variables independent of the A_n and such that P[Z_n = 2^n] = P[Z_n = −2^n] = 1/2. Define

F_t := { σ(A_n : n ≥ 1) for 0 ≤ t < 1;  σ(A_n, Z_n : n ≥ 1) for t ≥ 1 }

completed with respect to P. Define Y_n := Σ_{1≤k≤n} Z_k 1_{A_k}, Y_∞ := Σ_{k≥1} Z_k 1_{A_k}, and T_n := ∞ · 1_{⋃_{1≤k≤n} A_k}. Let X_t be zero for t < 1 and Y_∞ for t ≥ 1. Then (T_n)_{n≥1} is a localizing sequence for X and X_t^{T_n} is zero for t < 1 and Y_n for t ≥ 1. Since Y_n is bounded for each n, X^{T_n} is a u.i. martingale, but X is not a martingale because X_1 = Y_∞ ∉ L¹.

1.7 Integration with respect to processes of finite variation

1.7.1 Definition. We say that a process A is increasing or of finite variation if it has that property path-by-path for almost all ω.

1.7.2 Definition. Let A be of finite variation. The total variation process is

|A|_t := sup_{n≥1} Σ_{k=1}^{2^n} |A_{tk2^{−n}} − A_{t(k−1)2^{−n}}|.

Note that |A|_t < ∞ a.s., |A| is increasing, and if A is adapted then so is |A|.


1.7.3 Definition. Let A be of finite variation and F(ω, s) be (F ⊗ B)-measurable and bounded. Then

(F · A)_t(ω) := ∫_0^t F(s, ω) dA_s(ω),

the path-by-path Lebesgue-Stieltjes integral (which is well-defined and finite for almost all ω).

The integral process is also of finite variation and if A is (right) continuous then the integral process is (right) continuous. If F is continuous path-by-path then the integral may be taken to be the Riemann-Stieltjes integral.

1.7.4 Theorem. Let A and C be adapted, increasing processes such that C − A is increasing. There exists H jointly measurable and adapted such that A = H · C.

PROOF (SKETCH): The correct process is

H(t, ω) := lim_{r↑1, r∈Q_+} (A(ω, t) − A(ω, rt))/(C(ω, t) − C(ω, rt)).

1.7.5 Corollary. If A is of finite variation then there is a jointly measurable process H with −1 ≤ H ≤ 1 such that A = H · |A| and |A| = H · A.

PROOF (SKETCH): Let A^+ := (|A| + A)/2 and A^− := (|A| − A)/2. These are both increasing processes, so by Theorem 1.7.4 there are H^+ and H^− such that A^+ = H^+ · |A| and A^− = H^− · |A|. Take H := H^+ − H^−.

1.7.6 Theorem.
(i) Let A be of finite variation, H have continuous paths, and (Π_n)_{n≥1} be a sequence of partitions of [0, t] with mesh size converging to zero. For each t_i ∈ Π_n let s_i denote any number in the interval [t_i, t_{i+1}]. Then

lim_{n→∞} Σ_{Π_n} H_{s_i}(A_{t_{i+1}} − A_{t_i}) = ∫_0^t H_s dA_s.

(ii) (Change of variables formula.) Let A be of finite variation with continuous paths and f ∈ C¹(R). Then (f(A_t))_{t≥0} is of finite variation and

f(A_t) = f(A_0) + ∫_0^t f′(A_s) dA_s.

(iii) Let A be of finite variation with continuous paths and let g be a continuous function. Then

∫_0^t g(A_s) dA_s = ∫_{A_0}^{A_t} g(s) ds.


PROOF (OF (ii)): Let (Π_n)_{n≥1} be a sequence of partitions of [0, t] as in (i) and consider

f(A_t) − f(A_0) = Σ_{Π_n} (f(A_{t_{i+1}}) − f(A_{t_i}))
= Σ_{Π_n} f′(A_{s_i})(A_{t_{i+1}} − A_{t_i})  (for some s_i, by the MVT)
→ ∫_0^t f′(A_s) dA_s  (by part (i)).

1.8 Naïve stochastic integration is impossible

1.8.1 Theorem (Banach-Steinhaus). Let X be a Banach space and let Y be a normed linear space. Suppose (T_α)_{α∈I} is a collection of continuous linear operators from X to Y such that sup_{α∈I} ‖T_α x‖_Y < ∞ for all x ∈ X. Then sup_{α∈I} ‖T_α‖_{L(X,Y)} < ∞.

1.8.2 Theorem. Let x : [0, 1] → R be right continuous and let (Π_n)_{n≥1} be a sequence of partitions of [0, 1] with mesh size converging to zero. If

S_n(h) := Σ_{Π_n} h(t_i)(x_{t_{i+1}} − x_{t_i})

converges for all h ∈ C([0, 1]) then x is of finite variation.

PROOF: Note that S_n is a linear operator from C([0, 1]) to R, and

‖S_n‖ ≤ Σ_{Π_n} |x_{t_{i+1}} − x_{t_i}|.

This upper bound is achieved for each n by the piecewise linear function h_n with the property that h_n(t_i) = sign(x_{t_{i+1}} − x_{t_i}). If S_n(h) converges for every h then the Banach-Steinhaus theorem implies that sup_{n≥1} ‖S_n‖ < ∞, i.e. that the total variation of x over [0, 1] is finite.
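Aside (illustration of the mechanism, not from the lecture; the grid sizes and the seed are arbitrary). For a Brownian path the norms ‖S_n‖ grow without bound along refining partitions, so by Banach-Steinhaus S_n(h) cannot converge for every continuous h; this is the pathology behind Theorem 1.8.2.

import numpy as np

rng = np.random.default_rng(6)
n = 2 ** 18
dB = rng.normal(0.0, np.sqrt(1.0 / n), size=n)   # finest increments on [0, 1]

for k in (2 ** 4, 2 ** 8, 2 ** 12, 2 ** 18):
    inc = dB.reshape(k, n // k).sum(axis=1)      # increments on a coarser grid
    print(k, "intervals: ||S_n|| ~", np.abs(inc).sum())   # grows like sqrt(k)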

Remark. It should be possible to prove that if X is a right continuous process and (Π_n)_{n≥1} is a sequence of partitions of [0, t] with mesh size converging to zero, then if Σ_{Π_n} H_{t_i}(X_{t_{i+1}} − X_{t_i}) converges in probability for all H ∈ F ⊗ B with continuous paths then X is of finite variation a.s.


Chapter 2

Semimartingales and stochastic integration

2.1 Introduction

2.1.1 Definition. A process H is a simple predictable process if it has the form

H_t = H_0 1_{{0}}(t) + Σ_{i=1}^{n−1} H_i 1_{(T_i, T_{i+1}]}(t)

where 0 = T_0 ≤ T_1 ≤ · · · ≤ T_n < ∞ are stopping times and H_i ∈ L^∞(F_{T_i}) for all i = 0, . . . , n − 1. The collection of all simple predictable processes will be denoted S. A norm on S is ‖H‖_u := sup_{s≥0} ‖H_s‖_∞. When we endow S with the topology induced by ‖·‖_u we write S_u.

Remark. S ⊆ P.

2.1.2 Definition. L⁰ := L⁰(Ω, F, P) := {X ∈ F : X is finite-valued a.s.} with the topology induced by convergence in probability under P.

Remark. L⁰ is not locally convex if P is non-atomic. The topological dual of L⁰ in this case is {0}.

2.1.3 Definition. Let H ∈ S have its canonical representation. For an arbitrary stochastic process X, the stochastic integral of H with respect to X is

I_X(H) := H_0 X_0 + Σ_{i=1}^{n−1} H_i(X_{T_{i+1}} − X_{T_i}).

A càdlàg adapted process X is called a total semimartingale if the map I_X : S_u → L⁰ is continuous. X is a semimartingale if X^t, the process X stopped at t, is a total semimartingale for each t ≥ 0.


2.2 Stability properties of semimartingales

2.2.1 Theorem.
(i) Semimartingales and total semimartingales form vector spaces.
(ii) If Q ≪ P then any (total) semimartingale under P is also a (total) semimartingale under Q.

PROOF:
(i) Sums and scalar multiples of continuous operators are continuous.
(ii) Convergence in P-probability implies convergence in Q-probability.

2.2.2 Theorem (Stricker). Let X be a semimartingale with respect to the filtration F and let G be a sub-filtration of F (i.e. G_t ⊆ F_t for all t). If X is G-adapted then X is a semimartingale with respect to G.

PROOF: S(G) ⊆ S(F) and the restriction of a continuous operator is continuous.

2.2.3 Theorem. Let A = (A_α)_{α∈I} be a collection of pairwise disjoint events from F and let H_t := σ(F_t ∨ A). If X is a semimartingale with respect to F then X is a semimartingale with respect to H.

Remark. This theorem is called Jacod's countable expansion theorem in the textbook.

PROOF: Note first that H_t = σ(F_t ∨ (A \ {A ∈ A : P[A] = 0})) since F is complete, and second that {A ∈ A : P[A] ≥ 1/n} is finite because the events are pairwise disjoint. It follows that we may assume without loss of generality that A is a countable collection (A_n)_{n≥1}. If C := ⋃_{n=1}^∞ A_n then we may assume without loss of generality that C^C ∈ A, i.e. that A is a (countable) partition of Ω into events with positive probability.

For each n define Q_n := P[· | A_n], so that Q_n ≪ P. By Theorem 2.2.1, X is an (F, Q_n)-semimartingale. Let J^n be the filtration F completed with respect to Q_n.

Claim. X is a (J^n, Q_n)-semimartingale.

Indeed, if H ∈ S(J^n) has its canonical representation then the T_i are J^n-stopping times and the H_i are in L^∞(Ω, J_{T_i}^n, Q_n).

Fact. Any J^n-stopping time is Q_n-a.s. equal to an F-stopping time.

Denote the corresponding F-stopping times by T_i′. Then any H_i ∈ J_{T_i}^n is Q_n-a.s. equal to an r.v. in F_{T_i′} (technical exercise). This shows that H is Q_n-a.s. equal to a process in S(F), which proves the claim.

Note finally that Q_n[A_m] ∈ {0, 1} for all A_m ∈ A, so F ⊆ H ⊆ J^n for all n. By Stricker's theorem, X is an (H, Q_n)-semimartingale for all n. It can be shown that, since dP = Σ_{n=1}^∞ P[A_n] dQ_n, X is an (H, P)-semimartingale.


2.2.4 Theorem. Let X be a process and (T_n)_{n≥1} be a sequence of random times such that X^{T_n} is a semimartingale for all n and T_n ↑ ∞ a.s. Then X is a semimartingale.

PROOF: Fix t > 0 and ε > 0. There is n_0 such that P[T_{n_0} ≤ t] < ε. Suppose that H^k ∈ S are such that ‖H^k‖_u → 0, and let α > 0. Let k_0 be such that P[|I_{(X^{T_{n_0}})^t}(H^k)| > α] < ε for all k ≥ k_0. Then

P[|I_{X^t}(H^k)| > α] = P[|I_{X^t}(H^k)| > α; T_{n_0} ≤ t] + P[|I_{(X^{T_{n_0}})^t}(H^k)| > α; T_{n_0} > t] < 2ε

for all k ≥ k_0. Therefore I_{X^t}(H^k) → 0 in probability, so X is a semimartingale.

2.3 Elementary examples of semimartingales

2.3.1 Theorem. Every adapted càdlàg process with paths of finite (total) variation is a (total) semimartingale.

PROOF: Suppose that the total variation of X is finite. It is easy to see that, for all H ∈ S, |I_X(H)| ≤ ‖H‖_u ∫_0^∞ d|X|_s.

2.3.2 Theorem. Every càdlàg L²-martingale is a total semimartingale.

PROOF: Without loss of generality X_0 = 0. Let H ∈ S have the canonical representation. Then

E[(I_X(H))²] = E[(Σ_{i=1}^{n−1} H_i(X_{T_{i+1}} − X_{T_i}))²]
= E[Σ_{i=1}^{n−1} H_i²(X_{T_{i+1}} − X_{T_i})²]  (X is a martingale)
≤ ‖H‖_u² E[Σ_{i=1}^{n−1} (X_{T_{i+1}} − X_{T_i})²]
= ‖H‖_u² E[X_{T_n}²]  (X is a martingale)
≤ ‖H‖_u² E[X_∞²]  (by Jensen's inequality)

Therefore if H^k → H in S_u then I_X(H^k) → I_X(H) in L², so also in probability.

2.3.3 Corollary.
(i) A càdlàg locally square integrable martingale is a semimartingale.
(ii) A càdlàg local martingale with bounded jumps is a semimartingale.
(iii) A local martingale with continuous paths is a semimartingale.
(iv) Brownian motion is a semimartingale.
(v) If X = M + A is càdlàg, where M is a locally square integrable martingale and A is locally of finite variation, then X is a semimartingale. Such an X is called a decomposable process.
(vi) Any Lévy process is a semimartingale by the Lévy-Itô decomposition.


2.4 The stochastic integral as a process

2.4.1 Definition.
(i) D := {adapted, càdlàg processes}.
(ii) L := {adapted, càglàd processes}.
(iii) If C is a collection of processes then bC is the collection of bounded processes from C.

2.4.2 Definition. For processes H^n and H we say that H^n → H uniformly on compacts in probability (or u.c.p.) if

lim_{n→∞} P[sup_{0≤s≤t} |H_s^n − H_s| ≥ ε] = 0

for all ε > 0 and all t. Equivalently, if (H^n − H)_t^* → 0 in probability for all t.

Remark. If we define a metric

d(X, Y) := Σ_{n=1}^∞ (1/2^n) E[(X − Y)_n^* ∧ 1]

then u.c.p. convergence is compatible with this metric. We will denote by D_ucp, L_ucp, and S_ucp the topological spaces D, L, and S endowed with the u.c.p. topology. The u.c.p. topology is weaker than the uniform topology. It is important to note that D_ucp and L_ucp are complete metric spaces.

2.4.3 Theorem. S is dense in L_ucp.

PROOF: Let Y ∈ L and define R_n := inf{t : |Y_t| > n}. Then Y^{R_n} → Y u.c.p. (exercise) and Y^{R_n} is bounded by n because Y is left continuous. Thus bL is dense in L_ucp, so it suffices to show that S is dense in bL_ucp.

Assume that Y ∈ bL and define Z_t := lim_{u↓t} Y_u for all t ≥ 0, so that Z ∈ D. For each ε > 0 define a sequence of stopping times T_0^ε := 0 and

T_{n+1}^ε := inf{t > T_n^ε : |Z_t − Z_{T_n^ε}| > ε}.

We have seen that the T_n^ε are stopping times because they are hitting times for the càdlàg process Z. Also because Z ∈ D, T_n^ε ↑ ∞ a.s. (this would not necessarily happen if T^ε were defined in the same way but with Y). Let

Z^ε := Σ_{n≥0} Z_{T_n^ε} 1_{[T_n^ε, T_{n+1}^ε)}.

Then Z^ε is bounded and Z^ε → Z uniformly on compacts as ε → 0. Let

U^ε := Y_0 1_{{0}} + Σ_{n≥0} Z_{T_n^ε} 1_{(T_n^ε, T_{n+1}^ε]}.

Then U^ε → Y_0 1_{{0}} + Z_− uniformly on compacts as ε → 0. If we define

Y^{n,ε} := Y_0 1_{{0}} + Σ_{k=0}^n Z_{T_k^ε} 1_{(T_k^ε, T_{k+1}^ε]}

then Y^{n,ε} ∈ S and Y^{n,ε} → Y u.c.p. as ε → 0 and n → ∞.

2.4.4 Definition. Let H ∈ S and X be a càdlàg process. The stochastic integral process of H with respect to X is

J_X(H) := H_0 X_0 + Σ_{i=1}^{n−1} H_i(X^{T_{i+1}} − X^{T_i})

when H = H_0 1_{{0}} + Σ_{i=1}^{n−1} H_i 1_{(T_i, T_{i+1}]}.

Notation. We write ∫ H_s dX_s := H · X := J_X(H) and ∫_0^t H_s dX_s := I_{X^t}(H) = (J_X(H))_t and ∫_0^∞ H_s dX_s := I_X(H).

2.4.5 Theorem. If X is a semimartingale then J_X : S_ucp → D_ucp is continuous.

PROOF: We begin by proving the weaker statement that J_X : S_u → D_ucp is continuous. Suppose that ‖H^k‖_u → 0. Let δ > 0 and define a sequence of stopping times T^k := inf{t : |(H^k · X)_t| ≥ δ}. Then H^k 1_{[0,T^k]} ∈ S and ‖H^k 1_{[0,T^k]}‖_u → 0 because this already happens for (H^k)_{k≥1}. Let t ≥ 0 be given. Then

P[(H^k · X)_t^* ≥ δ] ≤ P[(H^k · X)_{t∧T^k}^* ≥ δ] = P[|I_{X^t}(H^k 1_{[0,T^k]})| ≥ δ] → 0

since X is a semimartingale. Therefore J_X(H^k) → 0 u.c.p.

Now assume H^k → 0 u.c.p. Let ε > 0, t > 0, and δ > 0 be given. There is η such that ‖H‖_u < η implies P[J_X(H)_t^* ≥ δ] < ε. Let R^k := inf{s : |H_s^k| > η}. Then P[R^k < t] → 0 since H^k → 0 u.c.p., H̃^k := H^k 1_{[0,R^k]} → 0 u.c.p., and ‖H̃^k‖_u ≤ η. Therefore

P[(H^k · X)_t^* ≥ δ] = P[(H^k · X)_t^* ≥ δ; R^k < t] + P[(H^k · X)_t^* ≥ δ; R^k ≥ t]
≤ P[R^k < t] + P[(H̃^k · X)_t^* ≥ δ] < 2ε

for k large enough.

If H ∈ L and (H^n)_{n≥1} ⊆ S are such that H^n → H u.c.p. then (H^n)_{n≥1} has the Cauchy property for the u.c.p. metric. Since J_X is continuous, (J_X(H^n))_{n≥1} also has the Cauchy property. Since D_ucp is complete there is J_X(H) ∈ D such that J_X(H^n) → J_X(H) u.c.p. We say that J_X(H) is the stochastic integral of H with respect to X.

2.4.6 Example. Let B be standard Brownian motion and let (Π_n)_{n≥1} be a sequence of partitions of [0, t] with mesh size converging to zero. Define

B^n := Σ_{Π_n} B_{t_i} 1_{(t_i, t_{i+1}]},

so that B^n → B u.c.p. (check this as an exercise). Then

(J_B(B^n))_t = Σ_{Π_n} B_{t_i}(B_{t_{i+1}} − B_{t_i})
= (1/2) Σ_{Π_n} (B_{t_{i+1}}² − B_{t_i}²) − (1/2) Σ_{Π_n} (B_{t_{i+1}} − B_{t_i})²
= (1/2) B_t² − (1/2) Σ_{Π_n} (B_{t_{i+1}} − B_{t_i})²
→ (1/2) B_t² − (1/2) t in probability.

Hence ∫_0^t B_s dB_s = (1/2) B_t² − (1/2) t.

2.5 Properties of the stochastic integral

2.5.1 Theorem. Let H ∈ L and X be a semimartingale.(i) If T is a stopping time then (H · X )T = (H1[0,T] · X ) = H · (X T ).

(ii) ∆(H · X ) is indistinguishable from H · (∆X ).(iii) If Q P then H ·Q X is indistinguishable from H ·P X .(iv) If Q=

k≥1λk Pk with λk ≥ 0 and∑

k≥1λk = 1 then H ·Q X is indistinguish-able from H ·Pk

X for any k for which λk > 0.(v) If X is both a P and Q semimartingale then there is H · X that is a version of

both H ·P X and H ·Q X .(vi) If G is another filtration and H ∈ L(G)∩L(F) then H ·G X = H ·F X .

(vii) If X has paths of finite variation on compacts then H · X is indistinguishablefrom the path-by-path Lebesgue-Stieltjes integral.

(viii) Y := H · X is a semimartingale and K · Y = (KH) · X for all K ∈ L.

PROOF (OF (VIII)): It can be shown using limiting arguments that K · (H · X ) =(KH) · X when H, K ∈ S. To prove that Y = H · X is a semimartingale when H ∈ Lthere are two steps. First we show that if K ∈ S then K · Y = (KH) · X (and thismakes sense even without knowing that Y is a semimartingale). Fix t > 0 andchoose Hn ∈ S with Hn → H u.c.p. We know Hn · X → H · X u.c.p. because X is asemimartingale. Therefore there is a subsequence such that (Y nk − Y )∗t → 0 a.s.,where Y nk := Hnk ·X . Because of the a.s. convergence, (K ·Y )t = limk→∞(K ·Y nk)ta.s. Finally, K · Y nk = (KHnk) · X since Hn, K ∈ S, so

K · Y = u.c.p.-limk→∞(KHnk) · X = (KH) · X

since KHnk → KH u.c.p. and X is a semimartingale.For the second step, proving that Y is a semimartingale, assume that Gn → G

in Su. We must prove that (Gn · Y )t → (G · Y )t in probability for all t. We haveGnH → GH in Lucp, so

limn→∞

Gn · Y = limn→∞(GnH) · X = (GH) · X = G · Y.

Since this convergence is u.c.p. this proves the result.

Page 30: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

30 Semimartingales and stochastic integration

2.5.2 Theorem. If X is a locally square integrable martingale and H ∈ L thenH · X is a locally square integrable martingale.

PROOF: It suffices to prove the result assuming X is a square integrable martingale,since if (Tn)n≥1 is a localizing sequence for X then (H ·X )Tn = H ·X Tn , so we wouldhave that H · X is a local locally square integrable martingale. Conclude withTheorem 1.6.2, since the collection of locally square integrable martingale is stableunder stopping. Furthermore, we may assume that H is bounded because leftcontinuous processes are locally bounded (a localizing sequence is Rn := inft :|Ht |> n) and that X0 = 0.

Construct, as in Theorem 2.4.3, Hn ∈ S such that Hn → H u.c.p. By construc-tion the Hn are bounded by ‖H‖∞. Let t > 0 be given. It is easy to see that Hn · Xis a martingale and

E[(Hn · X )2] = E n∑

i=1

Hni

X Ti+1t − X Ti

t

2

≤ ‖H‖2∞E[X

2t ]<∞.

(The detail are as in the proof of Theorem 2.3.2.) It follows that ((Hn · X )t)n≥1 isu.i. Since we already know (Hn · X )t → (H · X )t in probability, the convergence isactually in L1. This allows us to prove that H · X is a martingale, and by Fatou’slemma

E[(H · X )2t ]≤ lim infnE[(Hn · X )2t ]≤ ‖H‖

2∞E[X

2t ]<∞

so H · X is a square integrable martingale.

2.5.3 Theorem. Let X be a semimartingale. Suppose that (Σn)n≥1 is a sequenceof random partitions, Σn : 0 = T n

0 ≤ T n1 ≤ · · · ≤ T n

knwhere the T n

k are stoppingtimes such that

(i) limn→∞ T nkn=∞ a.s. and

(ii) supk |T nk+1 − T n

k | → 0 a.s.

Then if Y ∈ L (or D) then∑

k YT nk(X T n

k+1 − X T nk )→

Y−dX .

Remark. The hard part of the proof of this theorem is that the approximations toY do not necessarily converge u.c.p. to Y (if they did then the theorem would bea trivial consequence of the definition of semimartingale).

2.5.4 Example. Let Mt = Nt − λt be a compensated Poisson process (a martin-gale) and let H = 1[0,T1) (a bounded process in D), where T1 is the first jump time

of the Poisson process. Then∫ t

0HsdMs = −λ(t ∧ T1), which is not a local mar-

tingale. Examples of this kind are part of the reason that we want our integrandsfrom L.

Page 31: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

2.6. The quadratic variation of a semimartingale 31

2.6 The quadratic variation of a semimartingale

2.6.1 Definition. Let X and Y be semimartingales. The quadratic variation of Xand Y is [X , Y ] = ([X , Y ]t)t≥0, defined by

[X , Y ] := X Y −∫

X−dY −∫

Y−dX .

Write [X ] := [X , X ]. The polarization identity is sometimes useful.

[X , Y ] =1

2([X + Y ]− [X ]− [Y ])

2.6.2 Theorem.(i) [X ]0 = X 2

0 and ∆[X ] = (∆X )2.(ii) If (Σn)n≥1 are as in Theorem 2.5.3 then

X 20 +∑

k

(X T nk+1 − X T n

k )2u.c.p.−→ [X ].

(iii) [X ] is càdlàg, adapted, and increasing.(iv) [X , Y ] is of finite variation, [X , Y ]0 = X0Y0, ∆[X , Y ] = ∆X∆Y , and

X0Y0 +∑

k

(X T nk+1 − X T n

k )(Y T nk+1 − Y T n

k )u.c.p.−→ [X , Y ].

(v) If T is any stopping time then

[X T , Y ] = [X , Y T ] = [X T , Y T ] = [X , Y ]T .

PROOF:(i) Recall that X0− := 0, so

∫ 0

0Xs−dXs = 0 and [X ]0 = (X0)2. For any t > 0,

(∆X t)2 = (X t − X t−)

2 = X 2t + X 2

t− − 2X t X t−

= X 2t − X 2

t− − 2X t−∆X t

= (∆X 2)t − 2∆(X− · X )t=∆(X 2 − 2(X− · X ))t =∆[X ]t

(ii) Fix n and let Rn := supk T nk . Then (X 2)Rn → X 2 u.c.p., so apply Theo-

rem 2.5.3 and the summation-by-parts trick.(iii) Let’s see why [X ] is increasing. Fix s < t rational. If we can prove that

[X ]s ≤ [X ]t a.s. then we are done since [X ] is càdlàg. Use partitions (Σn)n≥1in Theorem 2.5.3 that include s and t. Excluding terms from the sum makesit smaller since all of the summands are squares (hence non-negative), so[X ]s ≤ [X ]t a.s.

Page 32: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

32 Semimartingales and stochastic integration

2.6.3 Corollary.(i) If X is a continuous semimartingale of finite variation then [X ] is constant,

i.e. [X ]t = X 20 for all t.

(ii) [X , Y ] and X Y are semimartingales, so semimartingales form an algebra.

2.6.4 Theorem (Kunita-Watanabe identity). Let X and Y be semimartingales andH, K : Ω× [0,∞)→ R be F ⊗B-measurable.

∫ ∞

0

|Hs||Ks|d|[X , Y ]|s

2

≤∫ ∞

0

H2s d[X ]s

∫ ∞

0

K2s d[Y ]s.

PROOF: Use the following lemma.

2.6.5 Lemma. Let α,β ,γ : [0,∞)→ R be such that α(0) = β(0) = γ(0) = 0, α isof finite variation, β and γ are increasing, and for all s ≤ t,

(α(t)−α(s))2 ≤ (β(t)− β(s))(γ(t)− γ(s)).

Then for all measurable functions f and g and all s ≤ t,

∫ t

s

| fu||gu|d|α|u

2

≤∫ t

s

f 2u dβu

∫ t

s

g2u dγu.

The lemma can be proved with the following version of the monotone class theo-rem.

2.6.6 Theorem (Monotone class theorem II). Let C be an algebra of boundedreal valued functions. Let H be a collection of bounded real valued functionsclosed under monotone and uniform convergence. If C⊆H then bσ(C)⊆H.

Both are left as exercises.

2.6.7 Definition. Let X be a semimartingale. The continuous part of quadraticvariation, [X ]c , is defined by

[X ]t = [X ]ct + X 2

0 +∑

0<s≤t

(∆Xs)2.

X is said to be a quadratic pure jump semimartingale if [X ]c ≡ 0.

2.6.8 Examples.(i) If N is a Poisson process then [N] = N , so it is a quadratic pure jump

semimartingale.(ii) Any Lévy process with no Gaussian part is a quadratic pure jump semi-

martingale. Note that [t]≡ 0.

2.6.9 Theorem. If X is an adapted, càdlàg process of finite variation then X is aquadratic pure jump semimartingale.

Page 33: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

2.6. The quadratic variation of a semimartingale 33

PROOF: The Lebesgue-Stieltjes integration-by-parts formula gives us

X 2 =

X−dX +

X dX ,

and by definition of [X ] we have

X 2 = 2

X−dX + [X ].

Noting that∫

X dX =

(X− +∆X )dX =

X−dX +∑

(∆X )2

we obtain [X ] =∑

(∆X )2, so X is a quadratic pure jump semimartingale.

2.6.10 Theorem. If X is a locally square integrable martingale that is not constanteverywhere then [X ] is not constant everywhere. Moreover, X 2 − [X ] is a localmartingale and if [X ]≡ 0 then X ≡ 0.

Remark. This theorem holds when X is a local martingale, but it is harder toprove.

PROOF: We have that X 2 − [X ] = 2∫

X−dX is a locally square integrable martin-gale by Theorem 2.5.2. Assume that [X ]t = 0 for all t ≥ 0. Then X 2 is a locallysquare integrable martingale, so suppose that (Tn)n≥1 is a localizing sequence.Then E[X 2

t∧Tn] = X 2

0 = 0 for all n and all t. Thus X 2t∧Tn= 0 a.s. for all n and all t,

so X ≡ 0. If X0 is a constant other than zero then the proof applies to the locallysquare integrable martingale X − X0.

2.6.11 Corollary. Let X be a locally square integrable martingale and S ≤ T bestopping times.

(i) If [X ] is constant on [S, T]∩ [0,∞) then X is constant there too.(ii) If X has continuous paths of finite variation on (S, T ) then X is constant on[S, T]∩ [0,∞).

PROOF: Consider M := X T − X S and apply Theorems 2.6.2 and 2.6.10.

2.6.12 Corollary.(i) If X and Y are locally square integrable martingales then [X , Y ] is the

unique adapted, càdlàg process of finite variation such thata) X Y − [X , Y ] is a local martingale, andb) ∆[X , Y ] = ∆X∆Y and [X , Y ]0 = X0Y0.

(ii) If X and Y are locally square integrable martingales with no common jumpsand such that X Y is a local martingale then [X , Y ]≡ X0Y0.

Page 34: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

34 Semimartingales and stochastic integration

(iii) If X is a continuous square integrable martingale and Y is a square inte-grable martingale of finite variation then [X , Y ] ≡ X0Y0 and X Y is martin-gale.

PROOF:(i) By definition, X Y −[X , Y ] =

X−dY +∫

Y−dX , which is a local martingale.The second condition is Theorem 2.6.2(vi). Conversely, if A satisfied thetwo conditions then, by Theorem 2.6.2(vi), A− [X , Y ] is a continuous semi-martingale of finite variation null at zero. By Corollary 2.2.3, A= [X , Y ]. Fill in more of this proof.

2.6.13 Assumption. [M] always exists for a local martingale M .

2.6.14 Corollary. Let M be a local martingale. Then M is a square integrablemartingale if and only if E[[M]t] <∞ for all t ≥ 0. We have E[M2

t ] = E[[M]t]when E[[M]t]<∞.

PROOF: Suppose M is a square integrable martingale. M2− [M] is a local martin-gale null at zero by Corollary 2.6.12. Let (Tn)n≥1 be a localizing sequence. For allt ≥ 0,

E[[M]t] = limn→∞E[[M]t∧Tn

] = limn→∞E[M2

t∧Tn]≤ E[M2

t ]<∞

The first equality is the monotone convergence theorem, the second is because(M2−[M])Tn∧t is a martingale, and the inequality is the “limsup” version of Fatou’slemma. For the converse, define stopping times Tn := inft : |Mt |> n ∧ n.

sup0≤s≤t

|M Tns | ≤ n+ |∆MTn

| ≤ n+p

[M]n

Hence sups≤t |M Tns | ∈ L2 ⊆ L1 for all n and all t. By Doob’s maximal inequality. . .

2.6.15 Example (Inverse Bessel process). Let B be a three dimensional Brown-ian motion that starts at x 6= 0. It can be shown that Mt := 1/‖Bt‖ is a localmartingale. Furthermore, E[M2

t ] < ∞ for all t and limt→∞E[M2t ] = 0. This

implies that M is not a martingale (because if it were then M2 would be a sub-martingale, which contradicts that limt→∞E[M2

t ] = 0). Moreover, E[[M]t] =∞for all t > 0. It can be shown that M satisfies the SDE dMt =−M2

t dBt .

2.6.16 Corollary. Let X be a continuous local martingale. Then X and [X ] havethe same intervals of constancy a.s. (i.e. pathwise, cf. Corollary 2.6.11).

2.6.17 Theorem. Let X be a quadratic pure jump semimartingale. For any semi-martingale Y ,

[X , Y ] = X0Y0 +∑

0<s≤t

∆Xs∆Ys.

Page 35: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

2.7. Itô’s formula 35

PROOF (SKETCH): We know by the Kunita-Watanabe inequality that d[X , Y ]s d[X , X ]s a.s. Since [X , X ]c ≡ 0, this implies that [X , Y ]c ≡ 0, so [X , Y ] is equal tothe sum of its jumps.

2.6.18 Theorem. Let H, K ∈ L and X and Y be semimartingales.(i) [H · X , K · Y ] = (HK) · [X , Y ].

(ii) If H is càdlàg and (σn)n≥1 is a sequence of random partitions tending to theidentity then

HT ni(X T n

i+1 − X T ni )(Y T n

i+1 − Y T ni )

u.c.p.−→

Hs−d[X , Y ]s.

Remark. Part (i) is important for computations, but part (ii) is important theoret-ically. We will prove Itô’s formula using part (ii).

PROOF (OF (I)): It is enough to prove that [H · X , Y ] = H · [X , Y ] since we alreadyhave associativity. Suppose that Hn ∈ S are such that Hn → H u.c.p. DefineZn := Hn · X and Z := H · X , so that Zn→ Z u.c.p. Then

[Zn, Y ] = Y Zn −∫

Y−dZn −∫

Zn−dY

= Y Zn −∫

(Y Hn)−dX −∫

Zn−dY

→ Y Z −∫

(Y H)−dX −∫

Z−dY

= [Z , Y ]

For all n, since Hn is simple predictable,

[Zn, Y ] = [Hn · X , Y ] = Hn[X , Y ]

(check this). Finally, [X , Y ] is a semimartingale so Hn[X , Y ]→ H · [X , Y ].

2.7 Itô’s formula

We have seen that if A is continuous and of finite variation and if f ∈ C1 then

f (At) = f (A0) +

∫ t

0

f ′(As)dAs.

We also saw that if B is a standard Brownian motion then

B2t = 2

∫ t

0

BsdBs + t.

Page 36: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

36 Semimartingales and stochastic integration

Take f (x) = x2 so that we can write

f (Bt)− f (B0) =

∫ t

0

f ′(Bs)dBs +1

2

∫ t

0

f ′′(Bs)d[B]s.

This expression does not coincide with the one for continuous processes of finitevariation.

Notation. We write∫

(0,t]

HsdXs :=

∫ t

0+

HsdXs :=

∫ t

0

HsdXs −H0X0 =

∫ t

0

Hs1(0,∞)(s)dXs.

2.7.1 Theorem (Itô’s formula). If X is a semimartingale and f ∈ C2(R) then( f (X t))t≥0 is a semimartingale and

f (X t)− f (X0) =

∫ t

0+

f ′(Xs−)dXs +1

2

∫ t

0+

f ′′(Xs−)d[X ]cs

+∑

0<s≤t

( f (Xs)− f (Xs−)− f ′(Xs−)∆Xs).

In particular, if X has continuous paths then

f (X t)− f (X0) =

∫ t

0+

f ′(Xs)dXs +1

2

∫ t

0

f ′′(Xs)d[X ]s.

Additionally, if X is of finite variation then

f (X t)− f (X0) =

∫ t

0+

f ′(Xs−)dXs +∑

0<s≤t

( f (Xs)− f (Xs−)− f ′(Xs−)∆Xs),

and this last formula holds even if f is only C1.

PROOF: First assume that X is bounded by a constant K . By definition the quadraticvariation of a semimartingale is finite valued, so

0<s≤t(∆Xs)2 ≤ [X ]t <∞ a.s.Since f ′′ is continuous on the interval [−K , K] it is bounded on that interval, so∑

0<s≤t f ′′(Xs−)(∆Xs)2 <∞ a.s. and we may write Itô’s formula in the followingequivalent form.

f (X t)− f (X0) =

∫ t

0+

f ′(Xs−)dXs +1

2

∫ t

0+

f ′′(Xs−)d[X ]s

+∑

0<s≤t

( f (Xs)− f (Xs−)− f ′(Xs−)∆Xs − f ′′(Xs−)(∆Xs)2)

Fix t > 0. Taylor’s theorem states that, for x , y ∈ [−K , K],

f (y)− f (x) = f ′(x)(y − x) +1

2f ′′(x)(y − x)2 + R(x , y),

Page 37: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

2.7. Itô’s formula 37

where the remainder satisfies |R(x , y)| ≤ r(|x− y|)(x− y)2, where r is an increas-ing function such that limε↓0 r(ε) = 0.

Since∑

0<s≤t(∆Xs)2 is a.s. a convergent series, we can partition the jumps ofX in [0, t] into a finite set A := A(ε, t) and the remainder B := B(ε, t), with theproperty that

s∈B(∆Xs)2 ≤ ε2. A are the times of “large” jumps and B are thetimes of “small” jumps.

Suppose that (Σn)n≥1 is a sequence of random dyadic partitions of [0, t] withmesh size converging to zero, say Σn : (0 = T n

0 ≤ T n1 ≤ · · · ≤ T n

kn= t). For each n,

partition the indices 0,1, . . . , kn − 1 into the follow sets

Ln := i : (T ni , T n

i+1]∩A 6=∅Sn := i /∈ Ln : (T n

i , T ni+1]∩B 6=∅

Rn := 0,1, . . . , kn − 1 \ (Ln ∪ Sn)

Then Ln is the collection of indices i for which the ith interval in the partitioncontains a large jump of X , Sn is the collection of indices i for which the ith intervalcontains a small jump but no large jump, and Sn is the collection of indices i forwhich X is continuous on the ith interval. Write

f (X t)− f (X0) =∑

Σn

( f (XT ni+1)− f (XT n

i))

=∑

Ln

+∑

Sn

+∑

Rn

( f (XT ni+1)− f (XT n

i))

=: SnL + Sn

S + SnR

By Taylor’s theorem we can write

SnS + Sn

R =∑

Σn

f ′(XT ni)(XT n

i+1− XT n

i) +

1

2f ′′(XT n

i)(XT n

i+1− XT n

i)2

−∑

Ln

f ′(XT ni)(XT n

i+1− XT n

i) +

1

2f ′′(XT n

i)(XT n

i+1− XT n

i)2

+∑

Sn∪Rn

R(XT ni, XT n

i+1).

As n→∞,∑

Sn(XT ni+1− XT n

i)2 →

s∈B(∆Xs)2. The remainder term in the aboveexpression, call it Rn, satisfies

|Rn|=

Sn∪Rn

R(XT ni, XT n

i+1)

≤∑

Sn∪Rn

r(|XT ni− XT n

i+1|)(XT n

i+1− XT n

i)2

=∑

Sn

+∑

Rn

r(|XT ni− XT n

i+1|)(XT n

i+1− XT n

i)2

≤ r(2K)∑

Sn

(XT ni+1− XT n

i)2 + sup

Rnr(|XT n

i+1− XT n

i|)∑

Rn

(XT ni+1− XT n

i)2

Page 38: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

38 Semimartingales and stochastic integration

By the properties of r we can pick a subsequence (Σnm)m≥1 such thatFor each m≥ 1 there is nm such that

σnm∩B 6=0

(XT ni+1− X n

T ni)2 −

x∈B

(∆Xs)2

1+1

m

ε2.

You can find a sequence of partitions σm, constructed by refining σnm outside thesubintervals that don’t contain jumps, such that

σm∩A=∅

R(X T mi

, X T mi+1)→ 0

as m→∞ and ε→ 0. Work from then on with the refinements σm.

lim supm→∞

|Rnm | ≤ (1+ r(2K))ε2.

Hence

f (X t)− f (X0) = SnmL + Snm

S + SnmR

= SnmL +

Σnm

f ′(XT ni)(XT n

i+1− XT n

i) +

1

2f ′′(XT n

i)(XT n

i+1− XT n

i)2

−∑

Ln

f ′(XT ni)(XT n

i+1− XT n

i) +

1

2f ′′(XT n

i)(XT n

i+1− XT n

i)2 + Rnm

As m→∞,

Σnm

f ′(XT ni)(XT n

i+1− XT n

i)→

∫ t

0+

f ′(Xs−)dXs by Theorem 2.5.3

Σnm

1

2f ′′(XT n

i)(XT n

i+1− XT n

i)2→

1

2

∫ t

0+

f ′′(Xs−)d[X ]s by Theorem 2.6.18

and similarly,

SnL =∑

Ln

f (XT ni+1)− f (XT n

i)→

s∈A

f (Xs)− f (Xs−)

Ln

f ′(XT ni)(XT n

i+1− XT n

i)→

s∈A

f ′(Xs−)∆Xs

Ln

1

2f ′′(XT n

i)(XT n

i+1− XT n

i)2→

s∈A

1

2f ′′(Xs−)(∆Xs)

2

Finally, as ε→ 0, A(ε, t) exhausts the jumps of X . Since

f (Xs)− f (Xs−)− f ′(Xs−)∆Xs +1

2f ′′(Xs−)(∆Xs)

2

is dominated by a constant times the sum of the jumps squared. . .

Page 39: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

2.7. Itô’s formula 39

If X is not necessarily bounded then let Vk := inft : |X t | > k and Y k :=X1[0,Vk). Then Y k is a semimartingale, as it is a product of two semimartingales,bounded by k. (Note that Y k is not the same as X Vk−, as the former jumps to zeroat time Vk.) By the previous part,

f (Y kt )− f (Y k

0 ) =

∫ t

0+

f ′(Y ks−)dY k

s +1

2

∫ t

0+

f ′′(Y ks−)d[Y

k]cs

+∑

0<s≤t

( f (Y ks )− f (Y k

s−)− f ′(Y ks−)∆Y k

s ).

Fix ω and t and k such that Vk(ω) > t. One can rewrite the above in terms ofItô’s formula for X . (More perspicaciously, for fixed t, for each ω there is k largeenough so that Y k(ω, t) = X (ω, t) and the differentials are equal too.)

2.7.2 Theorem.(i) If X = (X 1, . . . , X n) is an n-dimensional semimartingale and f : Rn→ R has

continuous second order partial derivatives then f (X ) = ( f (X 1t , . . . , X n

t ))t≥0is a semimartingale and

f (X t)− f (X0) =n∑

i=1

∫ t

0+

∂ f

∂ x i(Xs−)dX i

s+1

2

n∑

i=1

∫ t

0+

∂ 2 f

∂ x i∂ x j(Xs−)d[X

i , X j]cs

+∑

0<s≤t

f (Xs)− f (Xs−)−n∑

i=1

∂ f

∂ x i(Xs−)∆X i

s

(ii) If X and Y are semimartingales, Z := X + iY , and f is analytic then

f (Zt)− f (Z0) =

∫ t

0+

f ′(Zs−)dZs +1

2

∫ t

0+

f ′′(Zs−)d[Z]cs

+∑

0<s≤t

( f (Zs)− f (Zs−)− f ′(Zs−)∆Zs).

where “dZ = dX + idY ” and [Z] = [X ]− [Y ] + 2i[X , Y ].

2.7.3 Definition. The Fisk-Stratonovich integral of is defined for semimartingalesX and Y by (Y− X )t :=

∫ t

0Ys− dXs :=

∫ t

0Ys−dXs +

12[X , Y ]ct .

2.7.4 Theorem.(i) If X is a semimartingale and f ∈ C3(R)

f (X t)− f (X0) =

∫ t

0+

f ′(Xs−) dXs +∑

0<s≤t

( f (Xs)− f (Xs−)− f ′(Xs−)∆Xs).

(ii) If X and Y are semimartingales and at least one is continuous then

X Y − X0Y0 = X− Y + Y− X .

Page 40: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

40 Semimartingales and stochastic integration

2.8 Applications of Itô’s formula

2.8.1 Theorem (Stochastic exponential).Let X be a semimartingale with X0 = 0. Then there exists a unique semimartingaleZ such that Zt = 1+

∫ t

0Zs−dXs. Moreover,

Zt = E (X )t := exp(X t −12[X ]ct)

s≤t

(1+∆Xs)exp(−∆Xs).

PROOF: Let’s see first that E (X ) is a semimartingale. We know that X − 12[X ]c is

a semimartingale, so exp(X − 12[X ]c) is a semimartingale by Itô’s theorem. We

need to show that∏

s≤t(1+∆Xs)e−∆Xs is well-defined and is a semimartingale.Towards this, let Ut = ∆Xs1|∆Xs |≤

12. The set s : |∆Xs| >

12 is finite a.s., so it is

enough to prove that∏

s≤t(1+ Us)e−Us is a semimartingale. For |x | ≤ 12, it can be

shown that | log(1+ x)− x | ≤ x2, whence

log

s≤t

(1+ Us)e−Us

≤∑

s≤t

| log(1+ Us)− Us|

≤∑

s≤t

U2s ≤

s≤t

(∆Xs)2 ≤ [X ]t <∞ a.s.

Hence the infinite product converges, and moreover,∏

s≤t(1 + ∆Xs)e−∆Xs is offinite variation because the series of its log-jumps is absolutely summable. Inparticular it is a quadratic pure jump semimartingale.

Now we prove that Z satisfied the integral equation. Let F(x , y) := yex ,Kt := X − 1

2[X ]c , and St :=

s≤t(1+∆Xs)e−∆Xs . Then F is C2 and Zt = F(Kt , St).By Itô’s formula,

Zt − 1=

∫ t

0+

Zs−dKs +

∫ t

0+

eKs−dSs +1

2

0+

Zs−d[K]cs

+∑

0<s≤t

(Zs − Zs− − Zs−(∆Ks)− eKs−∆Ss)

=

∫ t

0

Zs−dKs +∑

0<s≤t

eKs−(∆Ss) +1

2

0

Zs−d[X ]cs

+∑

0<s≤t

(Zs − Zs− − Zs−(∆Xs)− eKs−∆Ss)

=

∫ t

0

Zs−dKs +1

2

0

Zs−d[X ]cs =

∫ t

0

Zs−dXs

The only tricky point is to note that Zs = Zs−(1+∆Xs).

Remark.(i) If X is of finite variation then so is E (X ).

Page 41: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

2.8. Applications of Itô’s formula 41

(ii) If X is a local martingale then so is E (X ), but at this point we can only proveit if X is a locally square integrable martingale.

(iii) Let T = inft :∆X t =−1.a) E (X ) 6= 0 on [0, T )b) E (X )− 6= 0 on [0, T]c) E (X ) = 0 on [T,∞)

(iv) If X is continuous and X0 = 0 then E (X ) = exp(X − 12[X ]).

(v) If λ ∈ R and B is standard Brownian motion then E (λB) = exp(λBt −12λ2 t)

is a continuous martingale.

2.8.2 Theorem (Yor’s formula). If X and Y are semimartingales with X0 = Y0 =0 then E (X )E (Y ) = E (X + Y + [X , Y ]).

PROOF: Integration-by-parts.

Remark. If X is continuous then (E (X ))−1 = E (−X + [X ]).

2.8.3 Theorem. Let X = (X 1, . . . , X n) be an n-dimensional local martingale withvalues in D, an open subset of Rn. Assume [X i , X j] = Aδi, j , where A is someincreasing process. If u : D→ R is harmonic (resp. sub-harmonic) then u(X ) is alocal martingale (resp. sub-martingale).

2.8.4 Theorem (Lévy’s characterization of BM).A stochastic process X is a Brownian motion if and only if X is a continuous localmartingale with [X ]t = t a.s. for all t.

PROOF: Only one direction requires proof, so assume X is a continuous local mar-tingale and [X ]t = t for all t. It suffices to show that E[eiu(X t−Xs) | Fs] = e−

12

u2(t−s)

for all 0≤ s < t <∞ (exercise). Fix u ∈ R, let F(x , t) := exp(iux + 12u2 t), and let

Zt := F(X t , t). By Itô’s formula,

Zt − Z0 = iu

∫ t

0

ZsdXs +u2

2

∫ t

0

Zsds−u2

2

∫ t

0

Zsd[X ]s

Zt = 1+ iu

∫ t

0

ZsdXs

Therefore Z is a local martingale by Theorem 2.5.2. But

sups≤t|Zt |= sup

s≤t|exp(iuX t +

12u2 t)|= e

12

u2 t <∞.

so Z is a martingale, and this implies what we wanted to prove.

Page 42: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

42 Semimartingales and stochastic integration

2.8.5 Theorem (Multidimensional Lévy’s characterization).If X = (X 1, . . . , X n) is a vector of n continuous local martingales and, for all i, j,[X i , X j]t = tδi, j for all t then X is an n-dimensional Brownian motion.

2.8.6 Theorem. Let M be a continuous local martingale with M0 = 0 and suchthat limt→∞[M]t =∞. Let

(i) Ts = inft : [M]t > s,(ii) Gs :=FTs

and G= (Gs)g≥0, and(iii) Bs := MTs

.Then B is a G-Brownian motion, [M]t is G-stopping time for all t, and Mt = B[M]ta.s. for all t.

PROOF: Since [M]∞ = ∞, Ts < ∞ for all s and B is well-defined. Also, sinceTs ≤ Tu for s ≤ u, FTs

= Gs ⊆ FTu= Gu, so G really is a filtration. It is right

continuous because T is right continuous. For all s and t,

[M]t ≤ s= Ts ≥ t ∈ FTs= Gs,

so the [M]t are G-stopping times. E[[M]Ts∞] = E[[M]Ts

] = s for all s since [M] isa continuous process. By Corollary 1.4.8, M Ts is a u.i. F-martingale with E[M2

Ts] =

E[[M]Ts] = s. By the optional sampling theorem

E[Bs|Gu] = E[MTs|FTu] = MTu

= Bu

for all u ≤ s, so B is a G-martingale. One can prove that (M Ts)2 − [M]Ts is a u.i.martingale for each s (exercise). Hence

E[B2s − B2

u |Gu] = E[M2Ts−M2

Tu|GTu] = E[[M]Ts

− [M]Tu|GTu] = s− u.

This implies that (B2u − u)u≥0 is a G-martingale. By Corollary 2.6.16 M and [M]

have the same intervals of constancy, so B = MT is a continuous process. Corol-lary 2.6.12 implies that [B]u = u, so B is a Brownian motion by Lévy’s charac-terization. (See the exercises around p.99 in the text regarding time changes.)Finally, note that T[M]t ≥ t because T is the right continuous inverse of [M], andT[M]t > t if and only if [M] is constant on the interval (t, T[M]t ), which happens ifand only if M is constant on that interval. Hence B[M]t = Mt for all t.

2.8.7 Exercise. If B is a G-Brownian motion then B is also a G-Brownian motion,where Gs =

u>sGu. Note that E[Z |⋂

u>sGu] = limu↓s E[Z |Gu] (this equality isknown as Lévy’s theorem).

2.8.8 Theorem (Lévy’s stochastic area formula). Let B = (X , Y ) be a standard2-dimensional Brownian motion and let At :=

∫ t

0XsdYs−

∫ t

0YsdXs. Then E[eiuAt ] =

1/ cosh(ut) for all u and 0≤ t <∞.

Page 43: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

Chapter 3

The Bichteler-Dellacherie Theoremand its connexions to arbitrage

3.1 Introduction

Classical semimartingales

3.1.1 Definition. An adapted càdlàg process S is a classical semimartingale if itcan be decomposed as S = S0+M+A, where M0 = A0 = 0, M is a local martingale,and A is of finite variation.

Remark. Being a classical semimartingale is a local property, i.e. if S is a processfor which there is a sequence of stopping times (Tn)n≥1 such that Tn→∞ a.s. andSTn is a classical semimartingale for all n then S is a classical semimartingale. SaySTn = M n + An. If one defines M :=

n≥1 M n1(Tn−1,Tn] then one can prove that Mis a local martingale and A := S−M =

n≥1 An1(Tn−1,Tn] is of finite variation.

3.1.2 Theorem (Bichteler ’79, Dellacherie ’80). For an adapted, càdlàg, real val-ued process S the following are equivalent.

(i) S is a semimartingale.(ii) S is a classical semimartingale.

The following, seemingly stronger, theorem is true. This is the theorem that isstated in the textbook and the one that we will eventually prove.

3.1.3 Theorem. For an adapted, càdlàg, real valued process S the following areequivalent.

(i) S is a semimartingale.(ii) S is decomposable (i.e. S = M + A where M is a locally square integrable

martingale and A is of finite variation.)(iii) For all β > 0 there are M and A such that M is a local martingale with jumps

bounded by β and A is of finite variation and S = M + A.(iv) S is a classical semimartingale.

43

Page 44: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

44 The Bichteler-Dellacherie Theorem and its connexions to arbitrage

So far we have seen that (ii) implies (i) (cf. Corollary 2.3.3) and (iii) implies(ii) (because a local martingale with bounded jumps is locally bounded, and inparticular is a locally square integrable martingale). The theoretically important,and difficult, results are (iv) implies (iii) and (i) implies (iv). The former is impor-tant enough to be restated as a theorem.

3.1.4 Theorem (Fundamental theorem of local martingales).If M is a local martingale then for all β > 0 there are N and B such that M = N+B,where N is a local martingale with jumps bounded by β and B is of finite variation.

Connexion to finance

Suppose that S is a price process, i.e. St is the price of an asset at time t, wherewe make no assumptions about S other than that it is càdlàg. Let Ht be thenumber of units of S that you hold at time t, so that H is a simple predictableprocess. The accumulated gains and losses after following the strategy H up totime t are (H · S)t =

∫ t

0HsdSs (the integral makes sense for any S when H is a

simple predictable process). An arbitrage over the time horizon [0, T] is a strategyH such that H0 = 0, (H ·S)T ≥ 0 a.s., and P[(H ·S)T > 0]> 0. It is very importantto be able to determine when there are no possible arbitrages.

3.1.5 Definition. We say that S allows a free lunch with vanishing risk, or FLVR,for simple integrands if there is a sequence of simple predictable processes (Hn)n≥1such that both of the following conditions hold.(FL) (Hn · S)+T does not converge to zero in probability.(VR) limn→∞ ‖(Hn · S)−‖u = 0.Alternatively, S satisfies no free lunch with vanishing risk, or NFLVR, for simple inte-grands if, for all sequences of simple predictable processes (Hn)n≥1, (VR) implies(Hn · S)+T → 0 in probability.

It is clear that if S admits an arbitrage then S allows FLVR. In 1994 Delbaenand Schachermayer proved that NFLVR for simple integrands implies S is a semi-martingale. Their proof relies on the Bichteler-Dellacherie theorem.

3.1.6 Definition. We say that S allows free lunch with vanishing risk for simpleintegrands with little investment, or FLVR+LI if there is a sequence of simple pre-dictable processes (Hn)n≥1 such that (FL) and (VR) hold and so does(LI) limn→∞ ‖Hn‖u = 0.

Alternatively, S satisfies no free lunch with vanishing risk and little investment, orNFLVR+LI, for simple intgrands if, for all sequences of simple predictable processes(Hn)n≥1, (VR) and (LI) together imply (Hn · S)+T → 0 in probability.

3.1.7 Theorem. For a locally bounded, adapted, càdlàg process S the followingare equivalent.

(i) S satisfies NFLVR+LI.(ii) S is a classical semimartingale.

Page 45: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

3.2. Proofs of Theorems 3.1.7 and 3.1.8 45

3.1.8 Theorem. For an adapted càdlàg process S the following are equivalent.(i) For all sequences (Hn)n≥1 of simple predictable processes,

a) limn→∞ ‖Hn‖u = 0, i.e. (LI)b) limn→∞ sup0≤t≤T (H

n · S)−t = 0 in probability (this is slightly weakerthan (VR)).

together imply (Hn · S)+T → 0 in probability.(ii) S is a classical semimartingale.

Remark. The assertions (ii) implies (i) in both Theorems 3.1.7 and 3.1.8 followfrom (iv) implies (i) in Theorem 3.1.3, because (i) in each theorem holds triv-ially for semimartingales. Also note that (i) implies (iv) in Theorem 3.1.3 is aconsequence of Theorem 3.1.8.

3.2 Proofs of Theorems 3.1.7 and 3.1.8

First we want to show that if S is locally bounded, càdlàg, and adapted and satis-fies NFLVR+LI then S is a classical semimartingale.

Note that the collection of processes satisfying NFLVR+LI is stable under stop-ping. Since the property of being a classical semimartingale is local and S is locallybounded, we may assume that ‖S‖u ≤ 1. Without loss of generality we assumethat S0 = 0 and the time interval is [0, 1].

Notation. Let Dn denote the dyadic rationals in [0, 1] with denominator at most2n, i.e. Dn := j2−n : j = 0, . . . , 2n. Let D :=

n≥1 Dn denote the collection ofdyadic rationals in [0,1].

For all n ≥ 1 define Sn := M n + An, where M n and An are defined on the setDn inductively by An

0 := 0, M n0 = S0, and for j = 1, . . . , 2n,

Anj2−n − An

( j−1)2−n := E[S j2−n − S( j−1)2−n |F( j−1)2−n] (3.1)

M nj2−n := S j2−n − An

j2−n . (3.2)

Then M n is a martingale and An is predictable and of finite variation. The hope isthat we may pass to the limit as n→∞ in some sense. We are assuming that S islocally bounded, so there are two cases.CASE 1. If (M n)n≥1 and (An)n≥1 remain bounded in some sense then, by using

Komlós’ lemma, one can pass to the limit to obtain a local martingale M anda predictable process A of finite variation such that S = M + A.

CASE 2. If (M n)n≥1 or (An)n≥1 does not remain bounded then neither does theother, and one can construct a sequence of strategies witnessing that S al-lows FLVR+LI. The idea is that if a predictable process can “get close to” amartingale then this leads to an arbitrage.

3.2.1 Theorem. Let ( fn)n≥1 be a sequence of random variables.(KOMLÓS’ LEMMA). If ( fn)n≥1 is bounded in L1, i.e. supn≥1 ‖ fn‖1 <∞, then there is

a subsequence ( fnk)k≥1 such that 1

k( fn1+ · · ·+ fnk

) converges a.s. as k→∞.

Page 46: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

46 The Bichteler-Dellacherie Theorem and its connexions to arbitrage

(MAZUR’S LEMMA). If ( fn)n≥1 is bounded in L2, i.e. supn≥1 ‖ fn‖2 < ∞, then thereexist gn ∈ conv( fn, fn+1, . . . ) such that (gn)n≥1 converges a.s. and in L2.

PROOF (OF MAZUR’S LEMMA): Let H be the Hilbert space generated by the fn’s. Forall n define Kn = conv( fn, fn+1, . . . ), where the closure is taken in the L2-norm. Infact, Kn coincides with the weak closure of conv( fn, fn+1, . . . ) by a theorem fromfunctional analysis. By the Banach-Alaoglu theorem we know that Kn is weaklycompact for all n. Since Kn+1 ⊆ Kn for all n,

n≥1 Kn 6=∅ by the finite intersectionproperty of compact sets. Hence there is a sequence of gn ∈ conv( fn, fn+1, . . . )such that (gn)n≥1 converges in L2 to an element of

n≥1 Kn. By passing to asubsequence we may assume that (gn)n≥1 converges a.s.

Remark. We can apply Mazur’s lemma to countably many sequences simultane-ously. More precisely, suppose that ( f m

n )n≥1 is bounded in L2 for all m. Thenfor all n there are convex weights (λn

n, . . . ,λnNn), independent of m, such that the

sequences (λnn f m

n + · · ·+λnNn

f mn )n≥1 converge a.s. and in L2 for all m.

3.2.2 Proposition. Let S = (St)0≤t≤1 be càdlàg and adapted, with S0 = 0 andsuch that ‖S‖u ≤ 1 and S satisfies NFLVR+LI. Let An and M n be as defined in (3.1)and (3.2). For all ε > 0 there is C > 0 and a sequence of stopping times (ρn)n≥1such that, for all n,

(i) ρn takes values in Dn ∪ ∞.(ii) P[ρn <∞]< ε.

(iii) The stopped processes An,ρn and M n,ρn satisfy, for all n, ‖M n,ρn1 ‖2

L2 ≤ C and

TV(An,ρn) :=2n∑

j=1

An,ρn

j2−n − An,ρn

( j−1)2−n

≤ C .

3.2.3 Lemma. Under the same assumptions as in Proposition 3.2.2, with

Qn :=2n∑

j=1

(S j2−n − S( j−1)2−n)2,

the sequence (Qn)n≥1 is bounded in probability (i.e. for all α > 0 there is M suchthat P[Qn ≥ M]< α for all n).

PROOF: For all n, let Hn :=−∑2n

j=1 S( j−1)2−n1(( j−1)2−n, j2n], a simple predictable pro-

cess. Recall that −a(b− a) = 12(a− b)2 + 1

2(a2 − b2).

(Hn · S)t =−2n∑

j=1

St∧( j−1)2−n(St∧ j2−n − St∧( j−1)2−n)

=1

2

2n∑

j=1

(St∧ j2−n − St∧( j−1)2−n)2 +1

2(S2

0 − S2t )

Page 47: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

3.2. Proofs of Theorems 3.1.7 and 3.1.8 47

=1

2Qn +

1

2(S1

0 − S2t )

Clearly ‖Hn‖u ≤ 1, and (Hn · S)t ≥ −12

for all t because ‖S‖u ≤ 1 and Qn isnonnegative. Assume for contradiction that (Qn)n≥1 is not bounded in L0. Thenthere is α > 0 such that for all m> 0 there is nm such that P[(Hnm · S)1 ≥ m]≥ α.Whence (Hnm/m)m≥1 would be a FLVR+LI.

For c > 0 define a sequence of stopping times

σn(c) := inf

k

2n :k∑

j=1

(S j2−n − S( j−1)2−n)2 ≥ c− 4

.

Given ε > 0 there is c1 such that P[σn(c1)<∞]< ε/2 by Lemma 3.2.3.

3.2.4 Lemma. Under the same assumptions as Proposition 3.2.2, the stoppedmartingales M n,σn(c1) satisfy ‖M n,σn(c1)

1 ‖2L2 ≤ c1.

PROOF: For n≥ 1 and k = 1, . . . , 2n, since the Ans are predictable and the M ns aremartingales,

E[(Sσn(c1)k2−n − Sσn(c1)

(k−1)2−n)2] = E[(Mσn(c1)k2−n −Mσn(c1)

(k−1)2−n)2] +E[(Aσn(c1)k2−n − Aσn(c1)

(k−1)2−n)2]

≥ E[(Mσn(c1)k2−n )2 − (M

σn(c1)(k−1)2−n)2]

Write E[(Mσn(c1)1 )2] as a telescoping series and simplify to get

E[(Mσn(c1)1 )2] =

k2−n≤σn(c1)

E[(Sσn(c1)k2−n − Sσn(c1)

(k−1)2−n)2] +E[(Sσn(c1) − Sσn(c1)−2−n)2]

≤ (c1 − 4) + 22 = c1.

3.2.5 Lemma. Let V n := TV(An,σn(c1)) =∑2n(σn(c1)∧1)

i=1 |Anj2−n −An

( j−1)2−n |. Under theassumptions of Proposition 3.2.2 the sequence (V n)n≥1 is bounded in probability.

PROOF: Assume for contradiction that (V n)n≥1 is not bounded in probability. Thenthere is α > 0 such that for all k there is nk such that P[V nk ≥ k] ≥ α. For n ≥ 1define

bnj−1 := sign

An,σn(c1)j2−n − An,σn(c1)

( j−1)2−n

∈ F( j−1)2−n

and Hn(t) :=∑2n

j=1 bnj−11(( j−1)2n, j2−n](t). Then ‖Hn‖u ≤ 1 and

(Hn,σn(c1) · S)t =∑

j≤bt2nc

bnj2−n

Sσn(c1)j2−n − Sσn(c1)

( j−1)2−n

+ bnbt2nc

Sσn(c1)t − Sσn(c1)

bt2nc

≥ (Hn,σn(c1) · An)bt2nc2−n + (Hn,σn(c1) ·M n)bt2nc2−n − 2

Page 48: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

48 The Bichteler-Dellacherie Theorem and its connexions to arbitrage

and at time t = 1 we have (Hn,σn(c1) · S)1 = V n + (Hn,σn(c1) ·M n)1. But the secondsummand is bounded in L2 (it is at most c1 by Lemma 3.2.4), so we conclude that(Hn,σn(c1) · S)1 is not bounded in probability.

Define a sequence of stopping times

ηn(c) := inf

j

2n : |(Hn,σn(c1) ·M n) j2−n ≥ c

.

Because E[(sup1≤ j≤2n((hn,σn(c1) · M n) j2−n)2] ≤ 4c1 by Doob’s sub-martingale in-equality, (Hn,σn(c1) · M n) is bounded in probability. Therefore there is c′ > 0 suchthat P[ηn(c′) < ∞] ≤ α/2. Note that Hn,σn(c1)∧ηn(c′) · S is (uniformly) boundedbelow by c′. We claim (Hn,σn(c1)∧ηn(c′) · S)1 is not bounded in probability. Indeed,for any n and any k,

α≤ P[(Hn,σn(c1)∧ηn(c′) · S)1 ≥ k]

≤ P[(Hn,σn(c1) · S)1 ≥ k,ηn(c′) =∞] + P[ηn(c

′)<∞].

Since P[ηn(c′)<∞]≤ α/2, the probability of the other event is at least α/2. Thisgives the desired contradiction because it is now easy to construct a FLVR+LI.

PROOF (OF PROPOSITION 3.2.2): Define a sequence of stopping times

τn(c) := inf

k

2n :k∑

j=1

|Anj2−n − An

( j−1)2−n | ≥ c

.

By Lemma 3.2.5 there is c2 such that P[τn(c2)<∞]< ε/2. Take C := c1 ∨ c2 andρn := σn(c1)∧τn(c2).

3.2.6 Lemma. Let f , g : [0,1] → R be measurable functions, where f is leftcontinuous and takes finitely many values. Say f =

∑Kk=1 f (sk)1(sk−1,sk]. Define

( f · g)(t) :=K∑

k=1

f (sk−1)(g(sk)− g(sk−1)) + f (sk(t))(g(t)− g(sk(t)))

where k(t) is the biggest of the k such that sk less than or equal to t. Then for allpartitions 0≤ t0 ≤ · · · ≤ tM ≤ 1,

M∑

i=1

|( f · g)(t i)− ( f · g)(t i−1)| ≤ 2 TV( f )‖g‖∞ + M∑

i=1

|g(t i)− g(t i−1)|

‖ f ‖∞.

3.2.7 Proposition. Let S = (St)0≤t≤1 be càdlàg and adapted, with S0 = 0 and suchthat ‖S‖u ≤ 1 and S satisfies NFLVR+LI. For all ε > 0 there is C and a [0,1]∪∞valued stopping time α such that P[α < ∞] < ε and sequences (Mn)n≥1 and(An)n≥1 of continuous time càdlàg processes such that, for all n,

(i) An0 =Mn

0 = 0

Page 49: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

3.2. Proofs of Theorems 3.1.7 and 3.1.8 49

(ii) Sα = An,α +Mn,α

(iii) Mn,α is a martingale with ‖Mn,α1 ‖

2L2 ≤ C

(iv)∑2n

j=1 |An,αj2−n −An,α

( j−1)2−n | ≤ C .

PROOF: Let ε > 0 be given. Let C , M n, An, and ρn be as in Proposition 3.2.2.Extend M n and An to all t ∈ [0, 1] by defining M n

t := E[M n1 |Ft] and An

t = St−Mt .Note that the extended An is no longer necessarily predictable, and currently weonly have control of the total variation of An,ρn over Dn, i.e.

2n(ρn∧1)∑

j=1

|Anj2−n − An

( j−1)2−n | ≤ C .

Notice that, for t ∈ (( j− 1)2−n, j2−n],

Ant = St −M n

t

= St −E[M nj2−n |Ft]

= St −E[S j2−n − Anj2−n |Ft]

= Anj2−n − (E[S j2−n |Ft]− St)

From this and ‖S‖u ≤ 1 it follows that ‖Ant − An

j2−n‖∞ ≤ 2, so ‖An,ρn‖u ≤ C + 2.How do we find the “limit” of sequence of stopping times (ρn)n≥1? The trick

is to define Rn := 1[0,ρn∧1], a simple predictable process, and note that stopping atρn is like integrating Rn, i.e. An,ρn = Rn · An and M n,ρn = Rn ·M n. We have that

1≥ E[Rn1] = E[1ρn=∞] = 1− P[ρn <∞]≥ 1− ε.

Apply Komlós’ lemma to (Rn1)n≥1 to obtain convex weights (µn

n, . . . ,µnNn) such that

Rn :=∞∑

i=n

µni Ri

1→ R1 a.s. as n→∞

By the dominated convergence theorem, E[R1]≥ 1− ε. Observe that

Rn · S =∞∑

i=n

µni (R

i ·M i)

︸ ︷︷ ︸

L2 norm ≤p

C

+∞∑

i=n

µni (R

i · Ai)

︸ ︷︷ ︸

TV over Dn is ≤ C

Define αn := inft : Rnt ≤

12. Each Rn is a left continuous, decreasing process.

In particular, Rnαn≥ 1

2> 0, so we can divide by this quantity. We claim that

P[αn <∞]< ε. Indeed, on the event [αn <∞], Rn1 ≤ Rn

αn+≤ 1

2so

ε ≥ E[1−Rn1]≥ E[(1−Rn

αn+)1αn<∞]≥

1

2P[αn <∞].

Page 50: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

50 The Bichteler-Dellacherie Theorem and its connexions to arbitrage

Define new processes T nt := 1[0,αn](t)/R

nt . Then ‖T n‖u ≤ 2 and T n · (Rn · S) = Sαn .

Thus we define Mn and An by

Sαn = T n · ∞∑

i=n

µni (R

i ·M i)

+ T n · ∞∑

i=n

µni (R

i · Ai)

=: Mn +An.

The total variation of T n over Dn is bounded by 3. By Lemma 3.2.6,

2n∑

j=1

|Anj2−n −An

( j−1)2−n | ≤ 2TVn(Tn)

∞∑

i=n

µni (R

i · Ai)

+ ‖T n‖∞ TVn

∞∑

i=n

µni (R

i · Ai)

≤ 6(C + 2) + 2C

That ‖Mn1‖

2L2 ≤ C follows from the fact that ‖M n,αn

1 ‖2L2 ≤ C . To finish the proof,

we show that there is a subsequence (αnk)k≥1 such that α := infk αnk

satisfiesP[α <∞]≤ 4ε. We know P[R1 ≤

23]≤ 3ε because E[R1]≥ 1− ε. Since Rn

1 → R1

a.s. there is a subsequence such that P[|Rn1 −R1| ≥

115]≤ ε2−k. Finally,

P[α <∞]≤ P

infk

Rnk1 ≤

3

5

≤ 3ε+ P

infk

Rnk1 ≤

3

5,R1 >

2

3

≤ 3ε+∞∑

k=1

P

Rnk1 ≤

3

5,R1 >

2

3

≤ 3ε+∞∑

k=1

P

|Rnk1 −R1| ≥

1

15

≤ 4ε

Therefore (Mn)n≥1, (An)n≥1, and α have the desired properties.

Remark. One thing to take away from this is that if you need to take a “limit ofstopping times” then one way to do it is to turn the stopping times into processesand take the limits of the processes.

PROOF (OF (I) IMPLIES (II) IN THEOREM 3.1.7):We may assume the hypothesis of Proposition 3.2.7. Let ε > 0 and take C , α,(Mn)n≥1, and (An)n≥1 as in Proposition 3.2.7. Apply Komlós’ lemma to find convexweights (λn

n, . . . ,λnNn) such that

λnnMn,α

1 + · · ·+λnNn

MNn,α1 →M1

λnnAn,α

t + · · ·+λnNn

ANn,αt → At

Page 51: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

3.2. Proofs of Theorems 3.1.7 and 3.1.8 51

for all t ∈D, where the convergence is a.s. (and also in L2 for M). For all n,

2n∑

j=1

|An,αj2−n −An,α

( j−1)2−n | ≤ C

so the total variation of A over D is bounded by C . Further, we have Sα =Mt +At(where Mt = E[M1|Ft]). A is càdlàg on D, so define it on all of [0,1] to make itcàdlàg. M is an L2 martingale so it has a càdlàg modification. Since P[α <∞]< εand ε > 0 was arbitrary, and the class of classical semimartingales is local, S mustbe a classical semimartingale.

Remark. On the homework there is an example of a (not locally bounded) adaptedcàdlàg process that satisfies NFLVR and is not a classical semimartingale.

PROOF (OF (I) IMPLIES (II) IN THEOREM 3.1.8):We no longer assume that S is locally bounded. The trick is to leverage the re-sult for locally bounded processes by subtracting the “big” jumps from S. Assumewithout loss of generality that S0 = 0 and define Jt :=

s≤t∆Ss1|∆Ss |≥1. ThenX := S− J is an adapted, càdlàg, locally bounded process. We will show that The-orem 3.1.8(i) for S implies NFLVR+LI for X , so that we may apply Theorem 3.1.7to X . Then since J is of finite variation, this will then imply S is a classical semi-martingale.Suppose Hn ∈ S are such that ‖Hn‖u→ 0 and ‖(Hn ·X )−‖u→ 0. We need to provethat (Hn · X )T → 0 in probability. First we will show that ‖(Hn · S)−‖u→ 0.

sup0≤t≤T

(Hn · S)−t ≤ sup0≤t≤T

(Hn · X )−t + sup0≤t≤T

|(Hn · J)t |

≤ sup0≤t≤T

(Hn · X )−t + (‖Hn‖∞ · TV(J))T

→ 0 by the assumptions on Hn

By (i), (Hn · S)T → 0 in probability. Since (Hn · J) → 0 in probability (as J isa semimartingale), we conclude that (Hn · X )T = (Hn · S)T − (Hn · J)T → 0 inprobability. Therefore X satisfies NFLVR+LI.

Remark.(i) S satisfies NFLVR if ‖(Hn·S)−‖u→ 0 implies (Hn·S)T → 0 in probability. This

is equivalent to the statement that (H ·S)T : H ∈ S, H0 = 0,‖(H ·S)−‖u ≤ 1is bounded in probability. The latter is sometimes called no unbounded profitwith bounded risk or NUPBR.

(ii) It is important to note that this definition of NFLVR does not imply that thereis no arbitrage. The usual definition of NFLVR in the literature is as follows.Let K := (H · S)T : H ∈ S, H0 = 0 and let C := f ∈ L∞ : f ≤ g for someg ∈K. It is said that there is NFLVR if and only if C∩ L∞+ = 0 (the closureis taken in the norm topology of L∞). This is equivalent to NUPBR+NA,(where NA is no arbitrage, the condition that K∩ L∞+ = 0).

Page 52: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

52 The Bichteler-Dellacherie Theorem and its connexions to arbitrage

3.3 A short proof of the Doob-Meyer theorem

3.3.1 Definition. An adapted process S is said to be of class D if the collection ofrandom variables Sτ : τ is a finite valued stopping time is uniformly integrable.

3.3.2 Theorem (Doob-Meyer for sub-martingales). Let S = (St)0≤t≤T be a sub-martingale of class D. Then S can be written uniquely as S = S0+M +A where Mcàdlàg + predictable implies what?

Note that *-martingale means càdlàg. is a u.i. martingale with M0 = 0 and A is a càdlàg, increasing, predictable processwith A0 = 0.

We will require the following Komlós-type lemma to obtain a limit in the proofof the theorem.

3.3.3 Lemma. If ( fn)n≥1 is a u.i. sequence of random variables then there aregn ∈ conv( fn, fn+1, . . . ) such that (gn)n≥1 converges in L1.

PROOF: Define f kn = fn1| fn|≤k for all n, k ∈ N. Then f k

n ∈ L2 for all n and k sincethey are bounded random variables. As noted before, there are convex weights(λn

n, . . . ,λnNn) and f k ∈ L2 such that λn

n f kn + · · ·+ λ

nNn

f kNn→ f k ∈ L2. Since ( fn)n≥1

is u.i.,The details need to be filled in here.

limk→∞

supn‖(λn

n f kn + · · ·+λ

nNn

f kNn)− (λn

n fn + · · ·+λnNn

fNn)‖1 = 0

which implies that (λnn fn + · · ·+λn

NnfNn)n≥0 is Cauchy in L1.

PROOF (OF THE DOOB-MEYER THEOREM FOR SUB-MARTINGALES):Assume without loss of generality that T = 1. Subtracting the u.i. martingale(E[S1|Ft])0≤t≤1 from S we may further assume that S1 = 0 and St ≤ 0 for all t.Define An

0 = 0 and, for t ∈Dn,

Ant − An

t−2−n := E[St − St−2−n |Ft−2−n]

M nt := St − An

t .

An = (Ant )t∈Dn

is predictable and it is increasing since S is a sub-martingale. Fur-thermore, M n

1 = −An1 since S1 = 0, and in fact M n

t = −E[An1|Ft] for all t ∈ Dn.

Therefore for every Dn valued stopping time τ,

Sτ =−E[An1|Fτ] + An

τ.

We would like to show that (An1)n≥1 is u.i. (and hence that (M n

1 )n≥1 is u.i.). Forc > 0 define τn(c) := inf( j−1)2−n : An

j2−n > c∧1, which is a stopping time sinceAn is predictable. Then Aτn(c) ≤ c and An

1 > c= τn(c)< 1, so

E[An11An

1>c] = E[An11τn(c)<1]

= E[E[An1|Fτn(c)]1τn(c)<1]

Page 53: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

3.3. A short proof of the Doob-Meyer theorem 53

= E[(−Sτn(c) + Anτn(c))1τn(c)<1]

≤−E[Sτn(c)1τn(c)<1] + c P[τn(c)< 1]

Also, τn(c)< 1 ⊆ τn(c/2)< 1, so

−E[Sτn(c/2)1τn(c/2)<1] = E[(An1 − An

τn(c/2))1τn(c/2)<1]

≥ E[(An1 − An

τn(c/2))1τn(c)<1]

≥c

2P[τn(c)< 1]

Whence E[An11An

1>c]≤−2E[Sτn(c/2)1τn(c/2)<1]−E[Sτn(c)1τn(c)<1]We may now ap-ply the class D assumption on S to show that (An

1)n≥1 is u.i., provided that we canshow P[τn(c)< 1]→ 0 as c→∞ uniformly in n. But, for any n,

P[τn(c)< 1] = P[An1 > c]≤

1

cE[An

1] =−1

cE[M n

1 ] =−1

cE[S0]

which goes to zero as c→∞. Therefore (An1)n≥1, and hence also (M n

1 )n≥1, are u.i.sequences of random variables.Extend M n to all t ∈ [0,1] by defining M n

t := E[M n1 |Ft]. By Lemma 3.3.3 there

are convex weights λnn, . . . ,λn

Nnand M ∈ L1 such that the sequence of martingales

Mn := λnnM n + · · ·+λn

NnM Nn satisfies Mn

1 → M in L1. Then Mnt → E[M |Ft] =: Mt Fill in the details (use Jensen’s in-

equality?).as well.Extend An to on [0,1] by defining it to be

t∈DnAn

t 1(t−2−n,t]. Note that An

extended in this way is a predictable process. Define An := λnnAn + · · ·+ λn

NnANn

and At := St −Mt . For all t ∈D,

Ant := (St −Mn

t )→ At in L1

and, by passing to a subsequence, we may assume the convergence is a.s. This im-plies in particular that (At)t∈D is increasing, so by right continuity, so is (At)t∈[0,1].Further, An is predictable for all n. If we can prove that lim supn An

t (ω) = At(ω)a.s. for all t ∈ [0, 1] then this would imply that A is a predictable process and wewould be done.

By right continuity lim supn Ant ≤ At for all t and limn→∞An

t = At if A is contin-uous at t. Since A is càdlàg there are countably many jump times. To prove whatwe want it suffices to prove, for all stopping times τ, limsupn An

τ = Aτ a.s. Fix astopping time τ. Since An

τ ≤ An1 → A1 in L1, we may apply Fatou’s Lemma to get

lim infnE[An

τ]≤ limsupnE[An

τ]≤ E[limsupn

Anτ]≤ E[Aτ]

Therefore it is actually enough to show that limn→∞E[Anτ] = E[Aτ]. Let σn :=

inft ∈Dn : t ≥ τ, a stopping time. Note that σn ↓ τ as n→∞. By the definitionof An, An

τ = Anσn

, so

limn→∞E[An

τ] = limn→∞E[An

σn]

Page 54: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

54 The Bichteler-Dellacherie Theorem and its connexions to arbitrage

= limn→∞E[Sσn

]−E[Mσn] discrete Doob-Meyer

= E[Sτ]−E[M0] since S is class D

= E[Aτ].

3.4 Fundamental theorem of local martingales

3.4.1 Definition. A stopping time T is predictable there there is a sequence ofstopping times (Tn)n≥1, called an announcing sequence, such that Tn < T on theevent T > 0 and Tn ↑ T . A stopping time T is said to be totally inaccessible ifP[T = S <∞] = 0 for all predictable stopping times S.

3.4.2 Example. Suppose that X is a continuous process. T := inft : X t ≥ K ispredictable and Tn := inft : X t ≥ K − 1

n is an announcing sequence. The first

jump time of a Poisson process is totally inaccessible.

3.4.3 Theorem. If S is a càdlàg sub-martingale of class D and S jumps only attotally inaccessible stopping times then S can be written uniquely as S = S0+M+A,where M is a u.i. martingale with M0 = 0, and A is continuous, increasing, andA0 = 0.

PROOF: This is Theorem 10 in Chapter III of the textbook. The proof of uniquenessin this case is easy, since if M + A = S = M ′ + A′ are two decompositions thenA− A′ is a continuous martingale of finite variation null at zero. Hence A− A′ ≡ 0by Theorem 2.6.10. See the textbook for the proof of existence.

Remark. A stronger statement is true. If S is as in the statement of the Theo-rem 3.4.3 and S jumps at a totally inaccessible stopping time T then A in thedecomposition is continuous at T .

3.4.4 Definition. Let A be an adapted process of finite variation with A0 = 0. If Asatisfies any of the following equivalent conditions then A is said to be a naturalprocess.

(i) E[|A|∞] < ∞ (i.e. A is of integrable variation) and E[[M , A]∞] = 0 for allbounded martingales M .

(ii) E[∫∞

0Ms−dAs] = E[M∞A∞] for all all bounded martingales M . (This is an

integration-by-parts formula, in some sense.)(iii) E[

∫∞0

Ms−dAs] = E[∫∞

0MsdAs] for all all bounded martingales M .

(iv) E[∫∞

0∆Ms−dAs] = 0 for all all bounded martingales M .

(v) [M , A] is a martingale for all bounded martingales M .

Remark. A càdlàg process locally of integrable variation is of finite variation.

Page 55: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

3.4. Fundamental theorem of local martingales 55

3.4.5 Theorem. If A is a natural process and a martingale then A≡ 0.

PROOF: Fix a finite valued stopping time T and let H be a bounded, non-negativemartingale. It can be shown that, since A is a martingale,

E∫ T

0

Hs−dAs

= 0

using dominated convergence and the fact that it is trivial if A is a simple pro-cess (take the limit of the Riemann sums). This implies, since A is natural, thatE[HT AT ] = 0. Now take Ht := E[1AT>0|Ft], a bounded, non-negative martingale.Then

E[1AT>0AT ] = E[E[1AT>0|FT ]AT ] = E[HT AT ] = 0.

Therefore AT ≤ 0. But since E[AT ] = 0, it must be the case that AT = 0. Since thisholds for any finite valued stopping time T , A≡ 0.

3.4.6 Theorem. If A is a predictable process of integrable variation null at zerothen A is a natural process.

PROOF: This is Theorem 14 in the textbook. There a different proof in Jacod andShiryaev as well.

3.4.7 Lemma. If Y is a u.i. martingale of finite variation then Y is locally of inte-grable variation.

PROOF: Define τn := inft : |Yt | > n. For any t we have |Yt−| ≤ |Y0|+ |Y |t− andhence

(∆|Y |)t = |∆Yt | ≤ |Yt |+ |Yt−| ≤ |Yt |+ |Y0|+ |Y |t−.

Thus

|Y |τn≤ |Y |τn− + (∆|Y |)τn

≤ 2|Y |τn− + |Y0|+ |Yτn| ≤ 2n+ |Y0|+ |Yτn

| ∈ L1.

Therefore (τn)n≥1 is a localizing sequence for |Y | relative to the class increasingprocesses integrable at∞.

3.4.8 Theorem. If M is a predictable local martingale of finite variation then Mis constant.

PROOF: Suppose that (Tn)n≥1 is a localizing sequence for M . If X n := (M −M0)Tn

then X n ∈ P , X n0 = 0, X n is u.i., and it is of finite variation. By Lemma 3.4.7,

X is locally of integrable variation. Without loss of generality assume that it is ofintegrable variation. By Theorem 3.4.6, X n is natural. By Theorem 3.4.5, X n isthe zero process.

3.4.9 Corollary. The Doob-Meyer decomposition is unique.

Page 56: public.econ.duke.edupublic.econ.duke.edu/.../21-882-int_lec.pdf · 2 Contents 2 Semimartingales and stochastic integration24 2.1 Introduction. . . . . . . . . . . . . . . . . . .

56 The Bichteler-Dellacherie Theorem and its connexions to arbitrage

3.4.10 Theorem (Doob-Meyer without class D).Suppose that S is a super-martingale. Then S can be decomposed uniquely asS = S0 + M − A where M is a local martingale with M0 = 0 and A is càdlàg,increasing, and predictable with A0 = 0.

PROOF: Define T m := inft : |St | ≥ m ∧ m. By the optional sampling theorem,ST m ∈ L1 and ST m

is dominated by m ∨ |ST m | ∈ L1. Hence ST mis of class D,

so by the Doob-Meyer decomposition we may write ST m= M m − Am where M m

is u.i. and Am is increasing and predictable with Am0 = 0. By the uniqueness in

Doob-Meyer for class D, it must be the case that M m = M m+1 and Am = Am+1 on[0, T m]. The desired processes may be defined as M :=

∑∞m=1 M m1[Tm−1,Tm) and

A :=∑∞

m=1 Am1[Tm−1,Tm).

3.5 Quasimartingales, compensators, and the fundamentaltheorem of local martingales

3.5.1 Definition. Let X be a process and Π = {t_0 < t_1 < ⋯ < t_n} a partition for which X_{t_i} ∈ L¹ for all t_i ∈ Π. Define the variation of X along Π by varΠ(X) := E[C(X, Π)], where C(X, Π) := Σ_{t_i∈Π} |X_{t_i} − E[X_{t_{i+1}} | F_{t_i}]|. We say that X is a quasimartingale if var(X) := sup_Π varΠ(X) < ∞ (and hence E[|X_t|] < ∞ for all t ∈ [0, ∞)).
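As a quick illustration (standard, not from the lecture): an L¹ martingale X has var(X) = 0, since each term X_{t_i} − E[X_{t_{i+1}} | F_{t_i}] vanishes; and an integrable nonnegative supermartingale X is a quasimartingale with var(X) ≤ E[X_0], because each term is nonnegative and the expectations telescope:

varΠ(X) = Σ_{t_i∈Π} E[X_{t_i} − E[X_{t_{i+1}} | F_{t_i}]] = E[X_{t_0}] − E[X_{t_n}] ≤ E[X_0].

This is consistent with Rao's theorem, which follows.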

3.5.2 Theorem (Rao). X on [0, ∞) is a quasimartingale if and only if X = Y − Z, where Y and Z are nonnegative supermartingales.

PROOF: See the textbook, though the proof therein is not so clear.

3.5.3 Theorem. A quasimartingale X has a unique decomposition X = M + A, where M is a local martingale and A is a predictable process locally of integrable variation.

PROOF: Existence and uniqueness follow from Theorems 3.4.10 and 3.5.2. Let's see that A is locally of integrable variation. Suppose X = Y − Z, where Y = M^Y − A^Y and Z = M^Z − A^Z by the Doob-Meyer decomposition. Clearly A = A^Z − A^Y, so we need only show that A^Y and A^Z are locally of integrable variation. Let (τ_m)_{m≥1} be a localizing sequence for M^Y. Then

E[A^{Y,τ_m}_∞] = lim_{t→∞} E[A^{Y,τ_m}_t]
             = lim_{t→∞} (E[M^{Y,τ_m}_t] − E[Y^{τ_m}_t])
             = E[Y_0] − lim_{t→∞} E[Y^{τ_m}_t]
             ≤ E[Y_0] < ∞,

using monotone convergence for the first equality and Y ≥ 0 for the final inequality, and the same argument works for A^Z.


3.5.4 Corollary (Existence of the compensator). Let A be a process locally of integrable variation. There is a unique finite variation process A^p ∈ P such that A − A^p is a local martingale.

Remark. A^p is called the compensator of A. When A = [X, Y] we write A^p =: ⟨X, Y⟩, the predictable variation between X and Y.

PROOF: A is locally a quasimartingale because it is locally of integrable variation, so the result follows from Theorem 3.5.3 by localization.

3.5.5 Examples.
(i) If N is a Poisson process of rate λ then N^p_t = λt. We already know that N_t − λt is a local martingale; in this case [N] = N, so what is meaningful is that ([N]_t − λt)_{t≥0} is a local martingale.

(ii) Let A_t := 1_{t≥τ} = 1_{[τ,∞)}(t), where τ is a random time. Let F be the minimal filtration that satisfies the usual conditions and makes τ a stopping time. (F may be realized as follows: let F⁰_t := σ({τ ≤ s} : s ≤ t) ∨ N and take F_t := ⋂_{ε>0} F⁰_{t+ε}.) The F-compensator of A is

A^p_t = ∫₀^{t∧τ} dF(u) / (1 − F(u−)),

where F is the cumulative distribution function of τ on [0, ∞]. Note that if F is continuous then so is A^p.
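For instance (a standard computation, not written out in the lecture), if τ is exponential with rate λ then F(u) = 1 − e^{−λu} is continuous, dF(u)/(1 − F(u−)) = λ du, and the formula gives A^p_t = λ(t ∧ τ). A minimal Monte Carlo sketch of the centering E[A_t − A^p_t] = 0 (all names here are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    lam, t, n = 2.0, 1.5, 10**6
    tau = rng.exponential(1.0 / lam, size=n)   # tau ~ Exp(lam)
    A = (tau <= t).astype(float)               # A_t = 1_{t >= tau}
    Ap = lam * np.minimum(t, tau)              # claimed compensator A^p_t = lam*(t ^ tau)
    print(A.mean() - Ap.mean())                # should be ~ 0

Both sample means converge to 1 − e^{−λt}, so the difference vanishes, consistent with A − A^p being a martingale.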

3.5.6 Theorem (Existence of the predictable projection). Let X be a bounded process that is F ⊗ B measurable. There exists a unique process ᵖX such that
(i) ᵖX ∈ P;
(ii) (ᵖX)_T = E[X_T | F_{T−}] on {T < ∞} for all predictable stopping times T.

3.5.7 Theorem.
(i) If X is a local martingale then ᵖX = X_− and ᵖ(∆X) = 0.
(ii) If A is locally of integrable variation and càdlàg then ᵖ(∆A) = ∆A^p.

PROOF (OF (II)): By the definition of the compensator, A − A^p is a local martingale. By (i), ᵖ(∆(A − A^p)) = 0, so ᵖ(∆A) = ᵖ(∆A^p). Since A^p is itself predictable, so is ∆A^p, and hence ᵖ(∆A^p) = ∆A^p.
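For example (combining (ii) with Example 3.5.5(i), a standard observation): for a Poisson process N, ᵖ(∆N) = ∆N^p = ∆(λt) = 0, even though ∆N takes the value 1 at the jump times. This reflects the fact that the jump times of N are totally inaccessible, so the predictable projection of the jump process vanishes.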

The following proof is based on the presentation in Jacod and Shiryaev. See the textbook for a different presentation.

PROOF (OF THE FUNDAMENTAL THEOREM OF LOCAL MARTINGALES): Assume without loss of generality that M_0 = 0. By localization we may assume that M is a u.i. martingale (this requires work, but we have done it previously). Let b := β/2 and define

A_t := Σ_{s≤t} ∆M_s 1_{|∆M_s|≥b}.

The process A is well-defined and of finite variation since M is càdlàg (so there are only finitely many jumps of size at least b on compacts). Let T_n := inf{t : |A|_t > n or |M_t| > n}, so that

|A|_{T_n} = |A|_{T_n−} + |∆A_{T_n}| ≤ n + |∆A_{T_n}|  and  |∆A_{T_n}| ≤ |∆M_{T_n}| ≤ n + |M_{T_n}|.

Hence |A|_{T_n} ≤ 2n + |M_{T_n}| ∈ L¹ because M is u.i., so therefore |A| is locally integrable and A^p exists. Let B := A − A^p and N := M − B. Clearly B is of finite variation, it is a local martingale by the definition of A^p, and N is a local martingale because it is a difference of local martingales. Define X_t := ∆M_t 1_{|∆M_t|<b}, so that ∆A = ∆M − X. Then ᵖ(∆A) = ᵖ(∆M) − ᵖX = −ᵖX by (i) of Theorem 3.5.7. Therefore ∆N = ∆M − ∆A + ∆A^p = X − ᵖX, so |∆N| ≤ |X| + |ᵖX| ≤ 2b, i.e. the jumps of N are bounded by 2b = β.

3.5.8 Corollary.
(i) A classical semimartingale is a semimartingale.
(ii) A càdlàg sub- or supermartingale is a semimartingale.

PROOF: Cf. Theorems 3.1.4, 2.3.1, 2.3.2 and the Doob-Meyer decomposition.

3.5.9 Theorem. If H ∈ L and M is a local martingale then H·M is well-defined and is a local martingale.

Remark. This theorem is not true if H ∉ L. (Example?) This is an important difference between the general theory of stochastic integration and the theory of stochastic integration with respect to continuous local martingales; in the latter the integral of any predictable process is a local martingale.

PROOF: Write M = N + B as in the fundamental theorem of local martingales. By localization we may assume that H is bounded, N is u.i. with bounded jumps, and B has integrable variation (cf. Lemma 3.4.7). We know by Theorem 3.4.8 that H·N is a locally square integrable martingale. Using the dominated convergence theorem one can prove that H·B is a martingale (use "Riemann sums").

3.6 Special semimartingales and another decomposition theorem for local martingales

3.6.1 Definition. A semimartingale X is said to be a special semimartingale if it can be written X = X_0 + M + A, where M_0 = A_0 = 0, M is a local martingale, and A is predictable and of finite variation.

3.6.2 Examples.
(i) Quasimartingales (including sub- and supermartingales).
(ii) Local martingales.
(iii) Continuous semimartingales (proved below).

3.6.3 Theorem. If X is a special semimartingale then the decomposition in the definition is unique. This decomposition is known as the canonical decomposition of X.

3.6.4 Theorem. The following are equivalent for a semimartingale X.
(i) X is a special semimartingale.
(ii) X*, the running maximum process sup_{s≤·}|X_s|, is locally integrable.
(iii) J := (∆X)* is locally integrable.
(iv) For every decomposition X = M + A, where M is a local martingale and A has finite variation, A is locally of integrable variation.
(v) There exist a local martingale M and a process A locally of integrable variation such that X = M + A.

PROOF:
(I) IMPLIES (II): Recall the following fact: if A is càdlàg, predictable, and of finite variation then A is locally of integrable variation. (This is an exercise on the homework.) Recall also that if M is a local martingale then M*_t = sup_{s≤t} |M_s| is locally integrable. (Indeed, we may assume that M is u.i. Let T_n := inf{t : M*_t > n} and note that (M*)^{T_n} ≤ n ∨ |M_{T_n}|, which is integrable.) If X is a special semimartingale then write X = M + A as in the definition and note that X* ≤ M* + A* ≤ M* + |A|, so X*_t is locally integrable for all t.

(II) IMPLIES (III): For all s ≤ t, |∆X_s| = |X_s − X_{s−}| ≤ 2X*_t, which implies J ≤ 2X*.

(III) IMPLIES (IV): Homework.

(IV) IMPLIES (V): Trivial.

(V) IMPLIES (I): Write X = M + A where A is locally of integrable variation. Then by Corollary 3.5.4, A^p exists and X = (M + A − A^p) + A^p. Note that the first summand is a local martingale and the second, A^p, is predictable and of finite variation.
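A standard example of a semimartingale that is not special (not from the lecture): let X be a compound Poisson process whose jump sizes are i.i.d. Cauchy. Its paths have finite variation (finitely many jumps on compacts), so X is a semimartingale, but J = (∆X)* dominates the absolute value of the first jump, which is not integrable, and J fails to be locally integrable; by (iii), X is not special. More generally, it is a standard fact that a Lévy process is a special semimartingale if and only if ∫_{|x|≥1} |x| ν(dx) < ∞ for its Lévy measure ν.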

3.6.5 Theorem. If X is a continuous semimartingale then X may be written X = M + A, where M is a local martingale and A is continuous and of finite variation. In particular, X is a special semimartingale.

PROOF: By Theorem 3.6.4(iii), X is special (X is continuous, so ∆X ≡ 0 and J ≡ 0), so write X = M + A in the unique way. Let's see that A is continuous. We have A ∈ P, so ᵖA = A and ᵖ(∆A) = ∆A. But

∆A = ᵖ(∆A) = ᵖ(∆X) − ᵖ(∆M) = ᵖ(∆X)   (since ᵖ(∆M) = 0 by Theorem 3.5.7)
   = 0                                (since X is continuous by assumption).

3.6.6 Definition. Two local martingales M and N are orthogonal if MN is a local martingale. In this case we write M ⊥ N. A local martingale M is purely discontinuous if M_0 = 0 and M ⊥ N for all continuous local martingales N.

Remark. If N is a Poisson process of rate λ then the compensated process M_t := N_t − λt is purely discontinuous. However, M is not the sum of its jumps, i.e. it is not locally constant pathwise.

3.6.7 Theorem. Let H² be the space of L² martingales.
(i) For all M ∈ H², E[[M]_∞] < ∞ and ⟨M⟩ exists.
(ii) (M, N)_{H²} := E[⟨M, N⟩_∞] + E[M_0 N_0] defines an inner product on H², and H² is a Hilbert space with this inner product.
(iii) M ⊥ N if and only if ⟨M, N⟩ ≡ 0, which happens if and only if M^T ⊥ N − N_0 for all stopping times T.
(iv) Take H^{2,c} to be the collection of martingales in H² with continuous paths and H^{2,d} to be the collection of purely discontinuous martingales in H². Then H^{2,c} is closed in H² and (H^{2,c})^⊥ = H^{2,d}. In particular, H² = H^{2,c} ⊕ H^{2,d}.

3.6.8 Theorem. Any local martingale M has a unique decomposition M = M_0 + M^c + M^d, where M^c is a continuous local martingale, M^d is a purely discontinuous local martingale, and M^c_0 = M^d_0 = 0.

PROOF (SKETCH): Uniqueness follows from the fact that a continuous local martingale orthogonal to itself must be constant. For existence, write M = M_0 + M′ + M″ as in the fundamental theorem of local martingales, where M′ has bounded jumps and M″ is of finite variation. Then M″ is purely discontinuous and M′ is locally in H². But then M′ = M^c + N, where M^c is a continuous local martingale and N ∈ H^{2,d} is purely discontinuous. Hence we may take M^d := M″ + N.

Suppose that h : R → R is bounded and h(x) = x in a neighbourhood (−b, b) of zero, e.g. h(x) = x1_{|x|<1}. Let X be a semimartingale and notice that ∆X_s − h(∆X_s) ≠ 0 only if |∆X_s| ≥ b. Define

X̌(h)_t := Σ_{0<s≤t} (∆X_s − h(∆X_s)).

Then X(h) := X − X_0 − X̌(h) has bounded jumps (∆X(h) = h(∆X)), so it is a special semimartingale by Theorem 3.6.4. Write X(h) = M(h) + B(h) as in that theorem, where M(h) is a local martingale and B(h) is predictable and of finite variation.

3.6.9 Definition. We call (B, C, ν) the semimartingale characteristics of X associated with h, where
(i) B := B(h);
(ii) C := ⟨M(h)^c⟩ = [M(h)^c], cf. Theorem 3.6.8, and note that C doesn't depend on h by the uniqueness part of that theorem;
(iii) ν is the "compensator" of the random measure µ^X associated with the jumps of X:

µ^X(dt, dx) := Σ_{s>0} 1_{∆X_s≠0} δ_{(s,∆X_s)}(dt, dx).

(Inspired by the Lévy-Khintchine formula and Lévy measures.)

Remark. M(h)^c and C do not depend on the function h. We write X^c := M(h)^c for the "continuous part" of the semimartingale X. (This does not mean that X − X^c is locally constant.)


3.6.10 Example. If X is the Lévy process that appears in the Lévy-Khintchine formula and h(x) = x1_{|x|<1} then B(h)_t = αt (α depends on h), C_t = σ²t (σ² does not depend on h), and ν(dt, dx) = ν(dx) dt, where ν is the Lévy measure (which, like σ², does not depend on h).
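As a quick check (standard, not in the notes), take X = N a Poisson process of rate λ with this h: every jump has size 1 and h(1) = 0, so X̌(h) = N and X(h) = 0, whence B = 0 and C = 0, while

µ^N(dt, dx) = Σ_{s>0} 1_{∆N_s≠0} δ_{(s,1)}(dt, dx)  and  ν(dt, dx) = λ dt δ₁(dx),

the compensator spreading the jump intensity λ uniformly over time, as in Example 3.5.5(i).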

3.7 Girsanov’s theorem

Working on the usual setup (Ω, F, F = (F_t)_{t≥0}, P), we know that if Q ≪ P and X is a P local martingale then X is a Q semimartingale (cf. Theorem 2.2.1). By the Bichteler-Dellacherie theorem X = M + A = N + B, where A and B have finite variation, M is a P local martingale, and N is a Q local martingale. The question is whether, knowing M and A, we can describe N and B.

3.7.1 Theorem. If Q|_{F_t} ≪ P|_{F_t} for all t (we write Q ≪_loc P in this case) and M is an adapted càdlàg process then the following hold.
(i) There exists a unique càdlàg process Z ≥ 0 (the density process, a P-martingale) such that Z_t = dQ|_{F_t}/dP|_{F_t} for all t.
(ii) MZ is a P martingale if and only if M is a Q martingale.
(iii) If MZ is a P local martingale then M is a Q local martingale.
(iv) If M is a Q local martingale with localizing sequence (T_n)_{n≥1} such that P[lim_{n→∞} T_n = ∞] = 1 then MZ is a P local martingale.

Remark. A property being "P-local" subtly depends on the measure P via the requirement that the localizing sequence of stopping times (T_n)_{n≥1} satisfies T_n ↑ ∞ P-a.s. If Q ∼_loc P then a property holds P-locally if and only if it holds Q-locally.

3.7.2 Corollary. If Q ∼_loc P then MZ is a P local martingale if and only if M is a Q local martingale.

3.7.3 Theorem (Girsanov-Meyer). Suppose that Q ∼_loc P and let Z be as in Theorem 3.7.1. If M is a P local martingale then M − ∫ (1/Z) d[Z, M] is a Q local martingale.

PROOF: By integration by parts, ZM − [Z, M] = Z_−·M + M_−·Z, which is a P local martingale by Theorem 3.5.9. By Corollary 3.7.2, M − (1/Z)[Z, M] is then a Q local martingale, since Z(M − (1/Z)[Z, M]) = ZM − [Z, M]. By integration by parts again,

(1/Z)[Z, M] = (1/Z_−)·[Z, M] + [Z, M]_−·(1/Z) + [1/Z, [Z, M]].

By Theorem 3.5.9 the middle term is a Q local martingale (recall that 1/Z is a Q local martingale). We also have

(1/Z_−)·[Z, M] + [1/Z, [Z, M]] = (1/Z_−)·[Z, M] + Σ ∆(1/Z) ∆[Z, M] = (1/Z)·[Z, M],

where this last integral is a Lebesgue-Stieltjes integral, since [Z, M] has finite variation. Thus (1/Z)[Z, M] is a Q local martingale plus (1/Z)·[Z, M]. Finally,

M − (1/Z)·[Z, M] = (M − (1/Z)[Z, M]) + ((1/Z)[Z, M] − (1/Z)·[Z, M]),

and both summands on the right hand side are Q local martingales.

Remark. M = (M − (1/Z)·[Z, M]) + (1/Z)·[Z, M] is a decomposition showing that M is a Q-semimartingale. If M is a special semimartingale then this might not be the canonical decomposition, because it may not be the case that (1/Z)·[Z, M] is predictable.

3.7.4 Theorem (Girsanov’s theorem, predictable version).Suppose that Qloc P and let Z be as in Theorem 3.7.1. If M is a P local martin-gale and [M , Z] is P locally of integrable variation and ⟨M , Z⟩ is the P-compensatorof [M , Z] then M ′ = M − 1

Z· ⟨Z , M⟩ is Q-a.s. well-defined and is a Q local martin-

gale. Moreover, [M c] is a version of [(M ′)c].

PROOF: Let T_n := inf{t : Z_t < 1/n}. Then A := (1/Z_−)·⟨Z, M⟩ is well-defined on [0, T_n]. Let T := lim_{n→∞} T_n. Then

Q[T < ∞] ≤ Q[⋂_n {T_n < ∞}] = lim_{n→∞} Q[T_n < ∞] = lim_{n→∞} E_P[Z_{T_n} 1_{T_n<∞}] ≤ lim_{n→∞} 1/n = 0.

(Below, all of the arguments are on [0, T_n].) By integration by parts,

MZ = M_−·Z + Z_−·M + [Z, M]
   = (M_−·Z + Z_−·M) + ([Z, M] − ⟨Z, M⟩) + ⟨Z, M⟩,

where the first summand is a P local martingale by Theorem 3.5.9 and the second is a P local martingale by the definition of the compensator. Hence ZM − ⟨Z, M⟩ is a P local martingale. Now, A is predictable and of finite variation, and

AZ = A_−·Z + Z_−·A + [Z, A] = A·Z + Z_−·A.

(Note that A·Z is well-defined and, by results we will see in Chapter 4, a P local martingale, since A is predictable and locally bounded.) But Z_−·A = ⟨Z, M⟩, so AZ − ⟨Z, M⟩ is a P local martingale. Hence ZM − ZA = (ZM − ⟨Z, M⟩) − (ZA − ⟨Z, M⟩) is a P local martingale. By Theorem 3.7.1, M′ = M − A is a Q local martingale.

To see that [M^c] is a version of [(M′)^c] one uses the fact that, for a local martingale N, [N^c] = [N]^c, which is to be proved as an exercise. (Assume that [N^d] = Σ (∆N)² when N_0 = 0.)


Remark. If Q ≪_loc P and X is a P-semimartingale then X is a Q-semimartingale. Indeed, if X = M + A is a decomposition under P then M = M′ + M″, where M′ has bounded jumps and M″ is of finite variation, so X is M′ plus a finite variation process. One can prove that [Z, M′] is locally of integrable variation. By the previous theorem, X is M′ − (1/Z_−)·⟨Z, M′⟩ plus a finite variation process, so it is a Q-semimartingale.

3.7.5 Theorem. Suppose that Q ≪_loc P and let Z be the density process of dQ/dP. Let R := inf{t : Z_t = 0, Z_{t−} = 0}, let X be a P-local martingale, and set U_t := ∆X_R 1_{t≥R}. Then

X_t − ∫₀^t (1/Z_s) d[X, Z]_s + U^p_t

is a Q-local martingale.

How to use Girsanov’s Theorem

Suppose that X is a P-semimartingale and X = X0 + M + A is a decompositionwith A = J · ⟨M⟩ where J ∈ L (there may or may not be such a decomposition).Define dQ

d P = E (−J · M)− Z , where dZ = −J Z−dM . If the density process is aQ-martingale then

M −∫

1

Z−d⟨Z , M⟩ is a Q-local martingale

d[Z , M] =−J Z−d[M]d⟨Z , M⟩=−J Z−d⟨M⟩

M +

Jd⟨M⟩= X − X0 is a Q-local martingale
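For instance (a standard special case, not spelled out in the lecture), take X_t = X_0 + W_t + µt with W a P-Brownian motion. Then M = W, ⟨M⟩_t = t, and A_t = µt = ∫₀^t µ ds, so J ≡ µ and Z = E(−µW), i.e. Z_t = exp(−µW_t − µ²t/2). On a finite horizon [0, T], Novikov's criterion below applies ([−µW]_T = µ²T is deterministic), so Z is a true P-martingale and X − X_0 is a Q-Brownian motion. A minimal Monte Carlo sketch of the reweighting at the terminal time (all names are illustrative):

    import numpy as np

    # Check E_Q[X_T - X_0] = E_P[Z_T (W_T + mu*T)] ~ 0,
    # where Z_T = exp(-mu*W_T - 0.5*mu^2*T) = E(-mu W)_T.
    rng = np.random.default_rng(1)
    mu, T, n = 0.7, 1.0, 10**6
    W = rng.normal(0.0, np.sqrt(T), size=n)      # W_T under P
    Z = np.exp(-mu * W - 0.5 * mu**2 * T)        # density dQ/dP on F_T
    print(Z.mean(), (Z * (W + mu * T)).mean())   # ~ 1 and ~ 0

The first number checks that Z integrates to one; the second that the drift µt has been removed under Q.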

3.7.6 Theorem. Let M be a local martingale.

(KAZAMAKI'S CRITERION) Let S be the collection of bounded stopping times. If M is continuous and

sup_{T∈S} E[exp(½ M_T)] < ∞

then E(M) is a u.i. martingale.

(NOVIKOV'S CRITERION) If M is continuous and E[exp(½ [M]_∞)] < ∞ then E(M) is a u.i. martingale.

(LÉPINGLE-MÉMIN) If ∆M > −1 and

A_t := ½ ⟨M^c⟩_t + Σ_{s≤t} ((1 + ∆M_s) log(1 + ∆M_s) − ∆M_s)

has a compensator A^p such that E[exp(A^p_∞)] < ∞ then E(M) is a u.i. martingale.

(PROTTER-SHIMBO) If M is locally square integrable with ∆M > −1 and

E[exp(½ ⟨M^c⟩_∞ + ⟨M^d⟩_∞)] < ∞

then E(M) is a u.i. martingale.


3.7.7 Exercise. Why must the supremum in Kazamaki's criterion be taken over all bounded stopping times and not just over deterministic times?


Chapter 4

General stochastic integration

We know how to integrate elements of L with respect to semimartingales. However, this set of integrands is not rich enough to prove "martingale representation" theorems nor many formulae regarding semimartingale local time.

4.1 Stochastic integrals with respect to predictable processes

For this section we will assume that all semimartingales that appear are null at zero.

4.1.1 Definition. Let X be a special semimartingale with canonical decomposition X = M + A. The H²-norm of X is ‖X‖_{H²} := ‖[M]^{1/2}_∞‖_{L²} + ‖|A|_∞‖_{L²}. Define H² := {X : X is a special semimartingale with ‖X‖_{H²} < ∞}.

Remark. This extends the definition given in the previous chapter. A local martingale M is in H² if and only if M is an L²-martingale.

4.1.2 Theorem. (H², ‖·‖_{H²}) is a Banach space.

PROOF (SKETCH): If Xⁿ = Mⁿ + Aⁿ is Cauchy in H² then (Mⁿ_∞)_{n≥1} is Cauchy in L², and so the sequence has a limit M. As an exercise, read the rest of this proof in the textbook.

4.1.3 Lemma. If H ∈ L is bounded and X ∈ H² then H·X ∈ H².

PROOF: Suppose that X = M + A is the canonical decomposition. By Theorem 3.5.9, H·M is a local martingale. To see that H·A is predictable, write it as a limit of Riemann sums; it is clearly of finite variation. Also notice that

‖H·X‖_{H²} ≤ ‖H‖_u ‖X‖_{H²} < ∞.


It can be seen that

‖H·X‖_{H²} = ‖(H²·[M])^{1/2}_∞‖_{L²} + ‖|H·A|_∞‖_{L²}.

The integrals on the right hand side are well-defined for any bounded process.
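To spell out the bound used above (a one-line computation): since [H·M] = H²·[M] and |H·A|_∞ ≤ ‖H‖_u |A|_∞,

‖H·X‖_{H²} = ‖(H²·[M])^{1/2}_∞‖_{L²} + ‖|H·A|_∞‖_{L²} ≤ ‖H‖_u (‖[M]^{1/2}_∞‖_{L²} + ‖|A|_∞‖_{L²}) = ‖H‖_u ‖X‖_{H²}.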

Notation. If F is a σ-algebra then bF denotes the collection of bounded F-measurable functions. Let bL denote the collection of bounded processes in L.

4.1.4 Definition. Let X = M + A ∈ H² and H, J ∈ bP. Define

d_X(H, J) := ‖((H − J)²·[M])^{1/2}_∞‖_{L²} + ‖|(H − J)·A|_∞‖_{L²}.

Remark. d_X is not a metric, but it is a semi-metric.

4.1.5 Theorem. If X ∈ H² then
(i) bL is dense in (bP, d_X).
(ii) If (Hⁿ)_{n≥1} ⊆ bL is Cauchy in (bP, d_X) then (Hⁿ·X)_{n≥1} is Cauchy in H².
(iii) If d_X(Hⁿ, H) → 0 and d_X(Jⁿ, H) → 0 with Hⁿ, Jⁿ ∈ bL and H ∈ bP then lim_{n→∞} Hⁿ·X = lim_{n→∞} Jⁿ·X in H².

Remark. Theorem 4.1.5(ii) and (iii) can be more succinctly stated as follows. The mapping (bL, d_X) → (H², ‖·‖_{H²}) that sends H ↦ H·X is an isometry and can be extended to (bP, d_X).

PROOF (OF (I)): Let M := bL and

H := {H ∈ bP : for all ε > 0 there is J ∈ bL such that d_X(H, J) < ε}.

Then M is a multiplicative class and H is a monotone vector space. Indeed, H is a vector space that contains the constants (exercise). Suppose that (Hⁿ) ⊆ H, Hⁿ ↑ H pointwise, and H is bounded. By the dominated convergence theorem, d_X(Hⁿ, H) → 0. Let ε > 0, pick n₀ such that d_X(H^{n₀}, H) < ε/2, and pick J ∈ bL such that d_X(H^{n₀}, J) < ε/2. Then d_X(H, J) < ε, so H ∈ H. Therefore H is a monotone vector space. By the monotone class theorem, bP = bσ(M) ⊆ H.

4.1.6 Definition. Let X ∈ H² and H ∈ bP. Suppose that (Hⁿ)_{n≥1} ⊆ bL is such that d_X(Hⁿ, H) → 0 (there is such a sequence by Theorem 4.1.5(i)). Define the stochastic integral of H with respect to X to be

H·X := H²-lim_{n→∞} (Hⁿ·X).

The limit exists by Theorem 4.1.5(ii), and H·X is well-defined (i.e. does not depend on the choice of sequence) by Theorem 4.1.5(iii).

4.1.7 Theorem. If X ∈ H² then E[(X*_∞)²] = E[sup_{t≥0} |X_t|²] ≤ 8‖X‖²_{H²}.


PROOF: Suppose that X = M + A. Then X*_∞ ≤ M*_∞ + |A|_∞. By Doob's maximal inequality, E[(M*_∞)²] ≤ 4E[M²_∞] = 4E[[M]_∞]. Whence

(X*_∞)² ≤ 2(M*_∞)² + 2(|A|_∞)²,

so

E[(X*_∞)²] ≤ 2E[(M*_∞)²] + 2E[(|A|_∞)²] ≤ 8E[[M]_∞] + 2E[(|A|_∞)²] ≤ 8‖X‖²_{H²}.

4.1.8 Corollary. If Xⁿ → X in H² then there is a subsequence (X^{n_k})_{k≥1} such that (X^{n_k} − X)*_∞ → 0 a.s.

4.1.9 Theorem (Properties of the stochastic integral). Let X, Y ∈ H² and H, K ∈ bP.
(i) (H + K)·X = H·X + K·X.
(ii) H·(X + Y) = H·X + H·Y.
(iii) If T is a stopping time then (H·X)^T = (H1_{[0,T]})·X = H·(X^T).
(iv) ∆(H·X) = H∆X.
(v) If T is a stopping time then H·(X^{T−}) = (H·X)^{T−}.
(vi) If X has finite variation then H·X coincides with the Lebesgue-Stieltjes integral.
(vii) H·(K·X) = (HK)·X.
(viii) If X is a local martingale then H·X is an L²-martingale.
(ix) [H·X, K·Y] = (HK)·[X, Y].
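In particular (a frequently used special case of (ix), taking K = H and Y = X): [H·X] = H²·[X]. For the canonical decomposition X = M + A this is exactly the identity behind the formula ‖H·X‖_{H²} = ‖(H²·[M])^{1/2}_∞‖_{L²} + ‖|H·A|_∞‖_{L²} and the definition of d_X.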

4.1.10 Exercise. For (ii), show that you can take the same approximating sequence for d_X and d_Y.

PROOF (SKETCH): For (iv), suppose that (Hⁿ)_{n≥1} ⊆ bL is such that d_X(Hⁿ, H) → 0 and (Hⁿ·X − H·X)*_∞ → 0 a.s. Then lim_{n→∞} Hⁿ∆X = lim_{n→∞} ∆(Hⁿ·X) = ∆(H·X). It follows that

lim_{n→∞} Hⁿ 1_{∆X≠0} = (∆(H·X)/∆X) 1_{∆X≠0} a.s.

Show that this implies that lim_{n→∞} Hⁿ∆X = H∆X.

For (ix), it is enough to show that [H·X, Y] = H·[X, Y]. Again suppose that (Hⁿ)_{n≥1} ⊆ bL is such that d_X(Hⁿ, H) → 0 and (Hⁿ·X − H·X)*_∞ → 0 a.s. Since Y_− is locally bounded we can assume that it is bounded. Then

[Hⁿ·X, Y] = Hⁿ·[X, Y] → H·[X, Y],
[Hⁿ·X, Y] = (Hⁿ·X)Y − (Hⁿ·X)_−·Y − Y_−·(Hⁿ·X) → (H·X)Y − (H·X)_−·Y − Y_−·(H·X).

(This proof may require some further explanation.)


4.1.11 Definition. A property (P) holds prelocally for X if there exists a sequence (T_n)_{n≥1} of stopping times with T_n ↑ ∞ such that (P) holds for X^{T_n−} := X1_{[0,T_n)} + X_{T_n−}1_{[T_n,∞)} (with the convention that ∞− = ∞).

4.1.12 Theorem. Let X be a semimartingale with X_0 = 0. Then X is prelocally an element of H².

PROOF: Note that X is not necessarily a special semimartingale. Write X = M + A, where M is a local martingale and A is of finite variation (cf. the Bichteler-Dellacherie theorem). Let T_n := inf{t : |M_t| > n or |A|_t > n} and define Y := X^{T_n−}. Then

Y*_t = (X^{T_n−})*_t ≤ (M^{T_n−})*_t + (A^{T_n−})*_t ≤ 2n.

It follows that Y is a special semimartingale. We may assume, by further localization, that [Y] is bounded. (Indeed, take R_n := inf{t : [Y]_t > n} and note that [Y^{R_n}]_∞ ≤ n + (∆Y_{R_n})², which is bounded because Y* ≤ 2n.) Suppose that Y = N + B, where N is a local martingale and B is predictable. We have seen that |B| is locally bounded (homework), so we may assume that B ∈ H² by localizing once more. Again from the homework, E[[Y]_∞] = E[[N]_∞] + E[[B]_∞], so E[[N]_∞] < ∞ and hence N ∈ H² as well.

4.1.13 Definition. Let X ∈ H² with canonical decomposition X = M + A. We say that H ∈ P is (H², X)-integrable if

E[∫₀^∞ H²_s d[M]_s] + E[(∫₀^∞ |H_s| d|A|_s)²] < ∞.

4.1.14 Theorem. If X ∈ H² and H ∈ P is (H², X)-integrable then ((H1_{|H|≤n})·X)_{n≥1} is a Cauchy sequence in H².

PROOF: Let Hⁿ := H1_{|H|≤n}. Clearly Hⁿ ∈ bP, so Hⁿ·X is well-defined, and

‖Hⁿ·X − Hᵐ·X‖_{H²} = d_X(Hⁿ, Hᵐ) = ‖((Hⁿ − Hᵐ)²·[M])^{1/2}_∞‖_{L²} + ‖|(Hⁿ − Hᵐ)·A|_∞‖_{L²} → 0

as n, m → ∞, by the dominated convergence theorem.

4.1.15 Definition. If X ∈ H² and H ∈ P is (H², X)-integrable then define H·X to be the limit in H² of ((H1_{|H|≤n})·X)_{n≥1}, which exists by Theorem 4.1.14. Yet otherwise said,

H·X := H²-lim_{n→∞} (H1_{|H|≤n})·X.

For any semimartingale X and H ∈ P, we say that H·X exists if there is a sequence of stopping times (T_n)_{n≥1} that witnesses that X is prelocally in H² and such that H is (H², X^{T_n−})-integrable for all n. In this case we define H·X to be H·(X^{T_n−}) on [0, T_n), and we say that H ∈ L(X), the set of X-integrable processes.


Remark.
(i) H·X is well-defined because H·(X^{T_m−}) = (H·(X^{T_n−}))^{T_m−} for n > m (cf. Theorem 4.1.9). By similar reasoning, H·X does not depend on the prelocalizing sequence chosen.
(ii) If H ∈ (bP)_loc then H ∈ L(X) for any semimartingale X.
(iii) Notice that if H ∈ L(X) then H·X is a semimartingale. (Prove as an exercise that if C is the collection of semimartingales then C = C_loc = C_preloc.)
(iv) Warning: we have not shown that H·X = H·M + H·A when X = M + A. Is it true? If so, it will need to be proved as a theorem.

4.1.16 Theorem. Let X and Y be semimartingales.
(i) L(X) is a vector space and Λ : L(X) → {semimartingales} : H ↦ H·X is a linear map.
(ii) ∆(H·X) = H∆X for H ∈ L(X).
(iii) (H·X)^T = (H1_{[0,T]})·X = H·(X^T) for H ∈ L(X) and stopping times T.
(iv) If X is of finite variation and ∫₀^t |H_s| d|X|_s < ∞ for all t then H·X coincides with the Lebesgue-Stieltjes integral for all H ∈ L(X).
(v) Let H, K ∈ P. If K ∈ L(X) then H ∈ L(K·X) if and only if HK ∈ L(X), and in this case (HK)·X = H·(K·X).
(vi) [H·X, K·Y] = (HK)·[X, Y] for H ∈ L(X) and K ∈ L(Y).
(vii) X ∈ H² if and only if sup_{‖H‖_u≤1} ‖(H·X)*_∞‖_{L²} < ∞. In this case,

‖X‖_{H²} ≤ sup_{‖H‖_u≤1} ‖(H·X)*_∞‖_{L²} + 2‖[X]^{1/2}_∞‖_{L²} ≤ 5‖X‖_{H²}

and

‖X‖_{H²} ≤ 3 sup_{‖H‖_u≤1} ‖(H·X)*_∞‖_{L²} ≤ 9‖X‖_{H²}.

PROOF (OF (VII)): To be added.

4.1.17 Exercise. Suppose that H ∈ P, that X is of finite variation, and that ∫₀^t |H_s| d|X|_s < ∞ a.s. for all t (this is the Lebesgue-Stieltjes integral). Is it true that H ∈ L(X)? Note that this is not the same statement as Theorem 4.1.16(iv).

4.1.18 Theorem. Suppose that X is a semimartingale and H ∈ L(X) under P. If Q ≪ P then H ∈ L(X) under Q and H·_P X = H·_Q X.

PROOF (SKETCH): Let (T_n)_{n≥1} witness that H ∈ L(X) under P, so that T_n ↑ ∞ P-a.s. and H is (H², X^{T_n−})-integrable for all n. Let Z_t := E_P[dQ/dP | F_t], S_n := inf{t : |Z_t| > n}, and R_n := T_n ∧ S_n. Since |Z_{R_n−}| ≤ n, one can show that

‖X^{R_n−}‖_{H²(Q)} ≤ sup_{‖H‖_u≤1} ‖(H·X^{R_n−})*_∞‖_{L²(Q)} + 2E_Q[[X^{R_n−}]^{1/2}_∞]
                 = sup_{‖H‖_u≤1} E_P[Z_{R_n} ((H·X^{R_n−})*_∞)²]^{1/2} + 2E_P[Z_{R_n} [X^{R_n−}]^{1/2}_∞]
                 ≤ 5√n ‖X^{R_n−}‖_{H²(P)}.

(The last inequality is an exercise; there was a hint, but I missed it.) This implies that X^{R_n−} ∈ H²(Q). The proof that H is (H², X^{R_n−})-integrable under Q is similar.

4.1.19 Theorem. Let M be a local martingale and H ∈ P be locally bounded. Then H ∈ L(M) and H·M is a local martingale.

PROOF: We need the following lemma.

4.1.20 Lemma. Suppose that X = M + A, where M ∈ H² and A is of integrable variation. If H ∈ bP then there is a sequence (Hⁿ)_{n≥1} ⊆ bL such that Hⁿ·M → H·M in H² and E[(|Hⁿ − H|·|A|)_∞] → 0.

PROOF: Let δ_X(H, J) := E[((H − J)²·[M])_∞]^{1/2} + E[(|H − J|·|A|)_∞] for all H, J ∈ bP. Let H := {H ∈ bP : for all ε > 0 there is J ∈ bL such that δ_X(H, J) < ε}. One can see that H is a monotone vector space that contains bL. It follows that bP ⊆ H.

By localization we may assume that H ∈ bP and M = M′ + A, where M′ ∈ H² and A is of integrable variation (this follows from the fundamental theorem of local martingales and a problem from the exam). By the lemma there is (Hⁿ)_{n≥1} ⊆ bL such that δ_X(Hⁿ, H) → 0. We know that Hⁿ·M = Hⁿ·M′ + Hⁿ·A is a martingale for each n. Finally, Hⁿ·M′ → H·M′ in H² and E[(|Hⁿ − H|·|A|)_∞] → 0 imply that H·M′ and H·A are martingales.

4.1.21 Theorem (Ansel-Stricker). Let M be a local martingale and H ∈ L(M). Then H·M is a local martingale if and only if there exist stopping times T_n ↑ ∞ and nonpositive integrable random variables (θ_n)_{n≥1} such that (H∆M)^{T_n}_t ≥ θ_n for all t and all n.

Remark. If H is locally bounded and T_n is such that M*_{T_n} is integrable and H^{T_n} is bounded by n, then one can take θ_n := −2nM*_{T_n} in the theorem to show that H·M is a local martingale.

PROOF: If H·M is a local martingale then there are T_n ↑ ∞ such that (H·M)*_{T_n} is integrable, and one may take θ_n := −2(H·M)*_{T_n} to see that the condition is necessary. The proof of sufficiency may be found in the textbook.

4.1.22 Theorem. Let X = M + A be a special semimartingale and H ∈ L(X). If H·X is a special semimartingale then H ∈ L(M) ∩ L(A), H·M is a local martingale, and H·A is predictable and of finite variation. In this case H·X = H·M + H·A is the canonical decomposition of H·X.

4.1.23 Theorem (Dominated convergence). Let X be a semimartingale, G ∈ L(X), and (Hᵐ)_{m≥1} ⊆ P with |Hᵐ| ≤ G, and assume that Hᵐ → H pointwise. Then Hᵐ, H ∈ L(X) for all m and Hᵐ·X → H·X u.c.p.


PROOF: Suppose that T_n ↑ ∞ are such that G is (H², X^{T_n−})-integrable for all n. If X^{T_n−} = M + A is the canonical decomposition then

E[((Hᵐ)²·[M])_∞]^{1/2} + E[((|Hᵐ|·|A|)_∞)²]^{1/2} ≤ E[(G²·[M])_∞]^{1/2} + E[((G·|A|)_∞)²]^{1/2} < ∞.

Therefore every Hᵐ is (H², X^{T_n−})-integrable, and hence so is H by the dominated convergence theorem.

For fixed t₀,

‖sup_{t≤t₀} |((Hᵐ − H)·X^{T_n−})_t|‖²_{L²} ≤ 8‖(Hᵐ − H)·X^{T_n−}‖²_{H²} → 0

by the dominated convergence theorem. Whence Hᵐ·X^{T_n−} → H·X^{T_n−} uniformly on [0, t₀] in probability. Given ε > 0, let n and m be such that P[T_n < t₀] < ε and P[((Hᵐ − H)·X^{T_n−})*_{t₀} > δ] < ε. Whence P[((Hᵐ − H)·X)*_{t₀} > δ] < 2ε.

4.1.24 Theorem. Let X be a semimartingale. There exists Q ∼ P such that X^t ∈ H²(Q) for all t and dQ/dP is bounded.

4.1.25 Example (Emery). Suppose that T ∼ Exponential(1), P[U = 1] = P[U = −1] = ½, and U and T are independent. Let X_t := U1_{t≥T} and let the filtration be the natural filtration of X, so that X is an H² martingale of finite variation and [X]_t = 1_{t≥T}. Let H_t := (1/t)1_{t>0}, which is predictable because it is left continuous. But H is not (H², X)-integrable:

E[(H²·[X])_∞] = E[1/T²] = ∞.

Is H ∈ L(X)? We do know that H·X exists as a pathwise Lebesgue-Stieltjes integral, namely (H·X)_t = (U/T)1_{t≥T}. We have E[|(H·X)_t|] = ∞ for all t > 0 (because the problem is around zero and not at ∞, as the computation below shows). It follows in particular that H·X is not a martingale.
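Both divergences are elementary (a quick check, not written out in the lecture): since T has density e^{−t},

E[1/T²] = ∫₀^∞ t^{−2} e^{−t} dt = ∞   and   E[|(H·X)_t|] = E[(1/T)1_{T≤t}] = ∫₀^t u^{−1} e^{−u} du = ∞,

both integrals diverging at 0, not at ∞.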

Claim. If S is a finite stopping time with P[S > 0] > 0 then E[|(H·X)_S|] = ∞.

Indeed, we have |(H·X)_S| = (1/T)1_{S≥T}. The hard part is to show that there is ε > 0 such that "T ≤ ε implies S ≥ T". If this is the case then E[|(H·X)_S|] ≥ E[(1/T)1_{T≤ε}] = ∞.

(Perhaps use {T ≤ ε} = {T ≤ ε, S < T} ∪ {T ≤ ε, S ≥ T} and 1_{S<T} ∈ F_{T−} = σ(T) ∨ N.)


Index

(H², X)-integrable, 68
H²-norm, 65
S, 24

adapted, 5
additive process, 12
announcing sequence, 54
arbitrage, 44
arrival rate, 10

Brownian motion, 11

càdlàg, 5
càglàd, 5
canonical decomposition, 58
class D, 52
classical semimartingale, 43
closable, 8
compensator, 57
continuous part of quadratic variation, 32
counting process, 9

début time, 6
decomposable process, 26

events before time T, 5
events strictly before time T, 5
explosion time, 9

Feller process, 13
Fisk-Stratonovich integral, 39
FLVR, 44
FLVR+LI, 44
free lunch with vanishing risk, 44

hitting time, 6

indistinguishable, 5
infinitely divisible distribution, 18
integrable processes, 68
integrable variation, 54
intensity, 10
intrinsic Lévy process, 12

Jacod’s countable expansion theorem,25

Lévy’s theorem, 42Lévy-Khintchine exponent, 18Lévy measure, 16Lévy process, 12little investment, 44local martingale, 20localized class, 20localizing sequence, 20

modification, 5
monotone vector space, 6

NA, 51
natural process, 54
NFLVR, 44
NFLVR+LI, 44
no arbitrage, 51
no free lunch with vanishing risk, 44
no unbounded profit with bounded risk, 51
NUPBR, 51

optional σ-algebra, 6
optional process, 6
orthogonal, 59

Poisson process, 9



polarization identity, 31
predictable, 54
predictable variation, 57
prelocally, 68
price process, 44
process of jumps, 5
progressive σ-algebra, 7
progressive process, 7
progressively measurable, 7
purely discontinuous, 59
quadratic pure jump semimartingale, 32
quadratic variation, 31
quasimartingale, 56

random time, 4

semimartingale, 24
semimartingale characteristics, 60
simple predictable process, 24
spatially homogeneous, 13
special semimartingale, 58
square integrable martingale, 9
stable process, 12
stable under stopping, 20
standard Brownian motion, 11
stochastic integral, 24, 28, 66
stochastic integral process, 28
stochastic process, 5
stopped process, 9
stopping time, 4
strategy, 44
subordinator, 12

total semimartingale, 24
total variation process, 21
totally inaccessible, 54

u.c.p., 27
u.i., 7
uniformly integrable, 7
uniformly on compacts in probability, 27
usual conditions, 4

variation, 56