  • STK4080 SURVIVAL AND EVENT HISTORY ANALYSIS

    Slides 5: Counting processes and martingales

    Bo Lindqvist
    Department of Mathematical Sciences
    Norwegian University of Science and Technology, Trondheim

    https://www.ntnu.no/ansatte/bo.lindqvist
    [email protected]

    University of Oslo, Autumn 2019

  • STOCHASTIC PROCESSES; COUNTING PROCESSES

    A stochastic process is a collection of random variables, {X(t) : t ∈ T }.
    The index set T is usually time, e.g.

    T = [0,∞) (“continuous time”) or

    T = {0, 1, . . .} ≡ N (“discrete time”)

    COUNTING PROCESSES {N(t) : t ≥ 0}:

    N(t)− N(s) = # events in interval (s, t] for s < t

    N(t) ∈ {0, 1, . . .}

    N(t) ↑ t in steps of height 1


  • EXAMPLES

    DISCRETE TIME
    Random walk Xn, e.g. Xn = bettor’s gain after n games

    CONTINUOUS TIME
    Poisson process N(t)
    Wiener process W(t) (Brownian motion)

  • IMPORTANT IN SURVIVAL ANALYSIS

    The dynamic aspects of a process, i.e. given the past - what is most likely to happen in the future? What is the probability of various outcomes in the future?

    Dynamic representations of processes are important for:

    Modeling purposes

    Deriving likelihood functions for statistical inference

    Studying behaviour of estimators


  • CONDITIONAL DISTRIBUTIONS GIVEN PAST

    DISCRETE TIME:
    Distribution of Xn, Xn+1, . . . given X1, X2, . . . , Xn−1 (or “given Fn−1”)

    CONTINUOUS TIME:
    Distribution of X(s) for s ≥ t given X(u) for 0 ≤ u < t (or “given Ft−”)

  • BASIC PROBABILITY THEORY

    JOINT DISTRIBUTIONS
    X1, X2, . . . , Xn random variables

    f(x1, x2, . . . , xn) = P(X1 = x1, . . . , Xn = xn)  (discrete)

    f(x1, x2, . . . , xn) = lim_{∆i→0} P(x1 ≤ X1 ≤ x1 + ∆1, . . . , xn ≤ Xn ≤ xn + ∆n) / (∆1 · · ·∆n)  (continuous)

    MARGINAL DISTRIBUTIONS
    For m < n,

    f(x1, . . . , xm) = Σ_{xm+1,...,xn} f(x1, . . . , xm, xm+1, . . . , xn)  (discrete)

    f(x1, . . . , xm) = ∫ · · · ∫ f(x1, . . . , xm, xm+1, . . . , xn) dxm+1 · · · dxn  (continuous)

    CONDITIONAL DISTRIBUTIONS

    f(xm+1, . . . , xn | x1, . . . , xm) = f(x1, . . . , xm, xm+1, . . . , xn) / f(x1, . . . , xm)

  • THE RULE OF DOUBLE EXPECTATION

    E(Xn | x1, x2, . . . , xn−1) ≡ g(x1, x2, . . . , xn−1) is a number;
    E(Xn | X1, X2, . . . , Xn−1) ≡ g(X1, X2, . . . , Xn−1) is a random variable,
    given as a function of X1, X2, . . . , Xn−1.

    As such it has an expected value. What is this value?

    E[E(Xn | X1, . . . , Xn−1)] = E[g(X1, . . . , Xn−1)]
      = Σ_{x1,...,xn−1} g(x1, . . . , xn−1) f(x1, . . . , xn−1)
      = Σ_{x1,...,xn−1} E(Xn | x1, . . . , xn−1) f(x1, . . . , xn−1)
      = Σ_{x1,...,xn−1} [ Σ_{xn} xn f(x1, . . . , xn) / f(x1, . . . , xn−1) ] f(x1, . . . , xn−1)
      = Σ_{x1,...,xn−1} Σ_{xn} xn f(x1, . . . , xn)
      = Σ_{xn} xn Σ_{x1,...,xn−1} f(x1, . . . , xn)
      = Σ_{xn} xn f(xn) = E(Xn)

  • INTERPRETATION OF “THE RULE OF DOUBLE EXPECTATION”

    E [E (Xn|X1,X2, . . . ,Xn−1)] = E (Xn)

    “If I observe X1, X2, . . . , Xn−1 many times, and each time compute the expected value E(Xn|X1, X2, . . . , Xn−1), then on average I will get the value E(Xn).”

  • EXERCISE 1

    Throw a die several times. Let Yi be the result of the ith throw, and let Xi = Y1 + . . . + Yi be the sum of the first i throws.

    (a) Find an expression for f (x1, x2).

    (b) Find an expression for f (x2|x1).

    (c) Find an expression for E (X2|x1) as a function of x1.

    (d) Find the expectation of the random variable E(X2|X1). Find E(X2) from this. Find also E(X2) without using the rule of double expectation.

    (e) Find also E (X3|X2), E (X3|X1,X2) and E (X3|X1).
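    As a numerical sanity check of (c) and (d), here is a small Python sketch (sample size and seed are illustrative choices, not part of the exercise). It uses the closed form E(X2|X1) = X1 + 3.5, which part (c) asks you to derive, and averages it over many simulated first throws; the average comes out near E(X2) = 7, as the rule of double expectation predicts.

        import random

        # Monte Carlo check of E[E(X2|X1)] = E(X2) for two die throws:
        # X1 = Y1 and X2 = Y1 + Y2, so E(X2 | X1 = x1) = x1 + 3.5.
        random.seed(1)
        n_sim = 200_000
        total = 0.0
        for _ in range(n_sim):
            x1 = random.randint(1, 6)   # first throw
            total += x1 + 3.5           # the conditional expectation E(X2 | X1)
        print(total / n_sim)            # close to 7.0 = E(X2)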


  • A CONDITIONAL DOUBLE EXPECTATION RESULT

    E (X3|X1) = E [E (X3|X1,X2)|X1]

    which, if we suppress the conditioning on X1 throughout, is merely the rule of double expectation:

    E (X3) = E [E (X3|X2)]


  • THE HISTORY OF A PROCESS

    Notation: Fn = “information contained in X1, X2, . . . , Xn”,

    or “the history at time n”, or the “past” at time n; meaning the set of all events that can be described in terms of X1, X2, . . . , Xn (these form a so-called σ-algebra Fn).

    Remark 1: F1 ⊂ F2 ⊂ F3 ⊂ · · ·

    EXAMPLE: The event {X2 > 2X4} is in F4 and F17 but not in F2.

    Remark 2: In applications it may be convenient to let the Fn contain also auxiliary (random) information, provided F1 ⊂ F2 ⊂ F3 ⊂ · · · still holds.

    Unless otherwise stated, we assume that Fn is generated from X1, X2, . . . , Xn alone.

    Can now write E(Xn|X1, X2, . . . , Xn−1) = E(Xn|Fn−1)

    More generally we can consider E(Y |Fn−1) or E(Y |Fn) for any random variable Y which is a function of the full process X1, X2, . . ., i.e. Y ∈ F , where F is the information in all of X1, X2, . . ., so F1 ⊂ F2 ⊂ · · · ⊂ F

  • CONDITIONAL EXPECTATION GIVEN THE HISTORY

    Let F1 ⊂ F2 ⊂ F3 ⊂ · · · ⊂ F and let Y ∈ F . Then the

    conditional expectation

    E(Y |Fn) (= E(Y |X1, X2, . . . , Xn))

    is interpreted as “what can be said about Y after having observed (only) X1, X2, . . . , Xn, i.e. knowing only the history Fn”.

    It is uniquely characterized by:

    1. E(Y |Fn) ∈ Fn, i.e. it is a function of X1, X2, . . . , Xn only.
    2. For any Z ∈ Fn we have

       E[Z E(Y |Fn)] = E[Z Y]

    For intuition:

    Special case of “2” is the rule of double expectation:

    E[E(Y |X1, X2, . . . , Xn)] = E[Y ]  (put Z ≡ 1)

    For Z ∈ Fn it holds that E(ZY |Fn) = Z E(Y |Fn), since Z is a ‘constant’ with respect to Fn. Take expectations to get “2”.

  • PROPERTIES OF CONDITIONAL EXPECTATIONS

    Recall: For Y ∈ F , i.e. a function of all the X1,X2, . . .

    1. E(Y |Fn) ∈ Fn, i.e. it is a function of X1, X2, . . . , Xn only.
    2. For any Z ∈ Fn we have E[Z E(Y |Fn)] = E[Z Y]

    From these we can derive

    (i) E [E (Y |Fn)] = E (Y ) (rule of double expectation)

    (ii) E (aY1 + bY2|Fn) = aE (Y1|Fn) + bE (Y2|Fn) (linearity)

    (iii) If Y ∈ Fn then E (Y |Fn) = Y

    (iv) If Y is independent of Fn, i.e. of X1, X2, . . . , Xn, then E(Y |Fn) = E(Y )

    (v) If Z ∈ Fn then E (ZY |Fn) = ZE (Y |Fn)

    Example: E (XnXn+1|Fn) = XnE (Xn+1|Fn)


  • EXERCISE 2

    Let X1,X2, . . . be independent with E (Xn) = µ for all n.

    Let Sn = X1 + . . .+ Xn.

    Find E (S4|F2) and in general E (Sn|Fm) when m ≤ n


  • GENERALIZATION OF E (X3|X1) = E [E (X3|X1,X2)|X1]

    We have:

    E (Xn|X1, . . . ,Xm1 ) = E [E (Xn|X1, . . . ,Xm2 )|X1, . . . ,Xm1 ] (1)

    for 1 ≤ m1 < m2 < n

    Verbally:

    “If, at time m1, I predict what can be said about Xn at the later time m2 > m1, then this is the same as what can be said about Xn at time m1 directly.”

    Now (1) can be written

    E (Xn|Fm1 ) = E [E (Xn|Fm2 )|Fm1 ] for 1 ≤ m1 < m2 < n

    More generally we have, for any Y ∈ F ,

    E (Y |Fm1 ) = E [E (Y |Fm2 )|Fm1 ] for 1 ≤ m1 < m2

    which can be proved from the two characterizing properties of conditional expectations.

  • MARTINGALES IN DISCRETE TIME

    A stochastic process M = {M0,M1,M2, . . .} is called a martingale if

    E (Mn|M1, . . . ,Mn−1) = Mn−1 for n = 1, 2, . . .

    or, more compactly,

    E (Mn|Fn−1) = Mn−1 for n = 1, 2, . . . (2)

    where Fk = information contained in M0,M1, . . . ,Mk .

    Thus - given the history up to time n − 1, the expected value at the next step equals the current state.

    This can be taken as the definition of a fair game:

    If the gambler’s gain after n − 1 games is Mn−1, then the expected gain after the next game is the same as the current gain.

  • MORE GENERAL DEFINITION OF MARTINGALE

    As mentioned earlier, we may let the Fn contain extra information, provided

    F0 ⊂ F1 ⊂ F2 ⊂ · · ·

    Mn ∈ Fn for each n
    (“the process M is adapted to the history {Fn}”)

    E|Mn| < ∞ for each n

    E(Mn|Fn−1) = Mn−1 for n = 1, 2, . . .

  • EXPECTED VALUE OF MARTINGALE IS CONSTANT

    Recall the definition: E(Mn|Fn−1) = Mn−1

    Taking the expected value on each side, the rule of double expectation gives

    E (Mn) = E (Mn−1)

    Hence E(M0) = E(M1) = · · · = (usually) 0

    (“mean zero martingale”)

    Often it is assumed that M0 = 0.


  • EXERCISE 3

    Let X1,X2, . . . be independent with

    P(Xi = 1) = P(Xi = −1) = 1/2

    Think of Xi as the result of a game where one flips a coin and wins 1 unit for “heads” and loses 1 unit for “tails”.

    Let Mn = X1 + . . .+ Xn be the gain after n games, n = 1, 2, . . .

    Show that Mn is a (mean zero) martingale.
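    A quick empirical illustration (in Python; the number of games, replication count and seed are arbitrary choices): simulating many realizations of the coin-flip gain shows the estimate of E(Mn) staying at 0 for every n, as the martingale property implies.

        import random

        # Simulate M_n = X_1 + ... + X_n with P(X_i = 1) = P(X_i = -1) = 1/2
        # and estimate E(M_n) for n = 1, ..., 20.
        random.seed(2)
        n_games, n_sim = 20, 100_000
        means = [0.0] * n_games
        for _ in range(n_sim):
            m = 0
            for n in range(n_games):
                m += random.choice((1, -1))   # X_{n+1}
                means[n] += m / n_sim
        print([round(v, 3) for v in means])   # all estimates close to 0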


  • EXERCISE 4

    (a) Let X1,X2, . . . be independent with E (Xn) = 0 for all n.

    Let Mn = X1 + . . .+ Xn.

    Show that Mn is a (mean zero) martingale.

    (b) Let X1,X2, . . . be independent with E (Xn) = µ for all n.

    Let Sn = X1 + . . .+ Xn.

    Compute E(Sn|Fn−1) (see Exercise 2). Is {Sn} a martingale? Can you find a simple transformation of Sn which is a martingale?

    (c) Let X1,X2, . . . be independent with E (Xn) = 1 for all n.

    Let Mn = X1 · X2 · . . . · Xn. Show that Mn is a martingale. What is E(Mn)?


  • EXERCISE 5

    Show that for a martingale {Mn}:

    (a) E(Mn|Fm) = Mm for all m < n. This is essentially Exercise 2.1 in the book.

    (b) E (Mm|Fn) = Mm for all m < n

    (c) Give verbal (intuitive) interpretations of each of (a) and (b).


  • EXERCISE 6

    Define the martingale differences by

    ∆Mn = Mn −Mn−1

    (a) Show that the definition of martingale, E(Mn|Fn−1) = Mn−1, is equivalent to

        E(Mn −Mn−1|Fn−1) = 0, i.e. E(∆Mn|Fn−1) = 0 (3)

        (Hint: Why is E(Mn−1|Fn−1) = Mn−1?)

    (b) Show that for a martingale we have

        Cov(Mn−1 −Mn−2,Mn −Mn−1) = 0 for all n (4)

        i.e. Cov(∆Mn−1,∆Mn) = 0. Explain in words what this means.

    (c) Show that (3) and (4) automatically hold when Mn = X1 + . . .+ Xn for independent X1, X2, . . .

        Note that in this case the differences ∆Mn = Mn −Mn−1 = Xn are independent.

    Thus (4) shows that martingale differences correspond to a weakening of the independent increments property.

  • STOPPING TIMES

    Let M = {M0,M1, . . .} be a process adapted to {Fn}.

    Let T be a positive integer valued random variable.

    T is a stopping time if for all n the event {T = n} is in Fn, i.e. at any time n we can decide from M0, M1, . . . , Mn whether we should stop or not!

    Examples of stopping times:

    Let k be a fixed integer and let T = k

    Let A be a set and let T be the first time the process hits A. Example: If Mn is the gain after n games (Exercise 3), then we may stop when Mn ≥ 10.

    In survival analysis: T is usually a censoring time, i.e. unit is not observedbeyond T even if still alive.


  • STOPPED PROCESS

    The stopped process is denoted MT and is defined by

    MTn = Mn∧T ≡ Mn if n ≤ T,  MT if n > T

    (so we “freeze” the process at time T ).

    It will be shown later, by using a more general result on transformations of martingales (Exercise 2.6 in book), that the stopped process MT is a martingale.

  • TRANSFORMATION OF A MARTINGALE

    Consider again EXERCISE 3: Let X1, X2, . . . be independent with P(Xi = 1) = P(Xi = −1) = 1/2. Xi is the result of a game where one flips a coin and wins 1 unit for “heads” and loses 1 unit for “tails”.

    Mn = X1 + . . .+ Xn is the gain after n games, and is a martingale (Exercise 3).

    Suppose now that in the nth game we make the bet Hn, where in order to determine Hn we use information from the first n − 1 games (i.e. Fn−1). We then either lose or win the amount Hn in the nth game. The gain after n games is therefore

    Zn = H1X1 + H2X2 + . . .+ HnXn
       = H1(M1 −M0) + H2(M2 −M1) + . . .+ Hn(Mn −Mn−1)

    where M0 = 0. This is a special case of what is called the transformation of M by H and is written Z = H •M.

    The interesting and useful property is that Z = H •M is a martingale whenever M is. The clue is that Hn ∈ Fn−1, so that H is a so-called predictable process.

    Main result: The transformation of a martingale by a predictable process is again a martingale.
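    A small simulation sketch of this result (the particular predictable rule for Hn below is an arbitrary illustration): as long as each bet Hn is computed from Fn−1 only, the transformed gain Zn keeps expected value 0.

        import random

        # Z = H . M for the fair coin-flip martingale, with a predictable
        # strategy: H_n depends only on the outcome of game n-1.
        random.seed(3)
        n_games, n_sim = 20, 100_000
        mean_gain = 0.0
        for _ in range(n_sim):
            z, h = 0, 1                    # H_1 = 1 (no history yet)
            for _ in range(n_games):
                x = random.choice((1, -1))
                z += h * x                 # Z_n = Z_{n-1} + H_n * X_n
                h = 2 if x == 1 else 1     # H_{n+1} uses only the past
            mean_gain += z / n_sim
        print(mean_gain)                   # ~0: "you cannot beat a fair game"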


  • PROOF OF MARTINGALE PROPERTY

    Let

    Zn = H1(M1 −M0) + H2(M2 −M1) + . . .+ Hn(Mn −Mn−1)

    Then

    E(Zn − Zn−1|Fn−1) = E(Hn(Mn −Mn−1)|Fn−1)
                      = Hn E(Mn −Mn−1|Fn−1)
                      = 0

    where we use that Hn ∈ Fn−1.

    Conclusion: “You cannot beat a fair game”. But - see next slide...

  • ON MARTINGALES IN WIKIPEDIA

    “Originally, martingale referred to a class of betting strategies that was popular in 18th century France. The simplest of these strategies was designed for a game in which the gambler wins his stake if a coin comes up heads and loses it if the coin comes up tails. The strategy had the gambler double his bet after every loss, so that the first win would recover all previous losses plus win a profit equal to the original stake. Since as a gambler’s wealth and available time jointly approach infinity his probability of eventually flipping heads approaches 1, the martingale betting strategy was seen as a sure thing by those who practiced it. Of course in reality the exponential growth of the bets would eventually bankrupt those foolish enough to use the martingale for a long time.

    The concept of martingale in probability theory was introduced by Paul Pierre Lévy, and much of the original development of the theory was done by Joseph Leo Doob. Part of the motivation for that work was to show the impossibility of successful betting strategies.”

  • MARTINGALE STRATEGY - FIRST WIN IS IN nth GAME

    Game #   Stake (H)   Result (Lose/Win)   Gain (Z)
    1        1           L                   −1
    2        2           L                   −3
    3        4           L                   −7
    4        8           L                   −15
    ...      ...         ...                 ...
    n−1      2^(n−2)     L                   −(2^(n−1) − 1)
    n        2^(n−1)     W                   2^(n−1) − (2^(n−1) − 1) = 1
    n+1      0           -                   1
    n+2      0           -                   1
    ...      ...         ...                 ...

    This process, stopped at the first win (“heads”), is a transformation of the simple betting process with Xi = ±1, with

    Hn = 2^(n−1) if X1 = X2 = . . . = Xn−1 = −1
    Hn = 0 if Xi = 1 for some i = 1, 2, . . . , n − 1

  • ROULETTE: PLAY RED/BLACK OR ODD/EVEN

    [Figure: roulette table, illustrating the even-money bets red/black and odd/even]

  • PROBLEM WITH MARTINGALE STRATEGY

    Requires unbounded capital, and requires that the bank sets no upper bound for bets - in order to have a sure win!

    EXERCISE 7

    Suppose that you use the martingale betting strategy, but that you in any case must stop after n games.

    Calculate the expected gain. Can you “beat the game”?

    (Hint: You either win 1 or lose 2^n − 1 units.)
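    One way to organize the computation (a sketch that gives the answer away, so try the exercise first; the loop over values of n is just for illustration): with a forced stop after n games you win 1 unit unless all n games are lost, which happens with probability 2^(−n) and costs 1 + 2 + . . . + 2^(n−1) = 2^n − 1.

        # Expected gain of the martingale strategy stopped after at most n games.
        for n in (1, 5, 10, 20):
            p_lose_all = 0.5 ** n                 # all n games lost
            gain = 1 * (1 - p_lose_all) - (2 ** n - 1) * p_lose_all
            print(n, gain)                        # 0.0 for every n: still a fair game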


  • EXERCISE 8

    Do Exercise 2.6 in book:

    Show that the stopped process MT is a martingale. (Hint: Find a predictable process H such that MT = H •M.)

    This suggests that E(MT ) ≡ E(M^T_T ) = E(M^T_0 ) = E(M0) = 0.

    But this obviously does not hold for the martingale strategy case (where indeed the process is a martingale, stopped at a stopping time).

    What is the clue here?

  • THE OPTIONAL SAMPLING THEOREM

    states that under certain conditions, the expected value of a martingale at a stopping time is equal to its initial value, i.e. E(MT ) = E(M0).

    Here are some conditions which, together with the condition P(T < ∞) = 1, are sufficient:

  • THE DOOB DECOMPOSITION – EXERCISE 9

    Let X = {X0,X1,X2, . . .} be a process with X0 = 0, adapted to history {Fn}.

    Define the process M = {M0,M1,M2, . . .} by M0 = 0 and

    ∆Mn ≡ Mn −Mn−1 = Xn − E(Xn|Fn−1)

    To see that M is a martingale, consider

    E(Mn −Mn−1|Fn−1) = E(Xn − E(Xn|Fn−1)|Fn−1)
                     = E(Xn|Fn−1)− E(Xn|Fn−1) = 0

    The Doob decomposition is now

    Xn = E(Xn|Fn−1) + ∆Mn

    X is thus decomposed into a predictable part {E(Xn|Fn−1)} and a so-called innovation part ∆M consisting of martingale differences.
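    A concrete sketch of the decomposition for a random-walk-type process (the increment distribution and the value of mu are illustrative choices; this setting anticipates Exercise 9 below): here E(Xn|Fn−1) = Xn−1 + mu, so the innovations are ∆Mn = Un − mu.

        import random

        # Doob decomposition of X_n = U_1 + ... + U_n with E(U_i) = mu:
        # predictable part E(X_n | F_{n-1}) = X_{n-1} + mu,
        # innovation Delta M_n = X_n - E(X_n | F_{n-1}) = U_n - mu.
        random.seed(4)
        mu = 0.5
        u = [random.uniform(0.0, 1.0) for _ in range(10)]   # E(U_i) = 0.5
        x, predictable, m = 0.0, [], [0.0]
        for un in u:
            predictable.append(x + mu)       # E(X_n | F_{n-1}) uses X_{n-1}
            x += un                          # X_n
            m.append(m[-1] + (un - mu))      # martingale part M_n
        print(predictable[:3])
        print(m[:4])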

    EXERCISE 9

    Suppose Xn = U1 + . . .+ Un where U1, . . . ,Un are independent with E (Ui ) = µ.

    Find the Doob decomposition of the process X and identify the predictable part and the innovation part.

  • PROCESSES IN CONTINUOUS TIME

    Key elements:

    X = {X (t) : t ∈ [0, τ ]}

    History (information) at time t is given by Ft , with Fs ⊂ Ft when s < t. Typically, Ft corresponds to observation of X(u) for 0 ≤ u ≤ t.

    {Ft} is called a filtration

    The process X is said to be adapted to {Ft} if X (t) ∈ Ft for all t.

    The process is called cadlag if its paths (trajectories) are right continuous with left-hand limits.

    A time T is a stopping time if {T ≤ t} ∈ Ft for each t. This means that at time t we know whether T ≤ t or T > t.


  • MARTINGALES

    IN CONTINUOUS TIME                                 IN DISCRETE TIME

    Fs ⊂ Ft whenever s < t                             F1 ⊂ F2 ⊂ F3 ⊂ · · ·

    M = {M(t) : t ∈ [0, τ ]}                           M = {Mn : n = 0, 1, . . .}

    Definition:                                        Definition:
    E(M(t)|Fs) = M(s) for all t > s                    E(Mn|Fm) = Mm for all n > m

    Property:                                          Property:
    Cov(M(t)−M(s),M(v)−M(u)) = 0                       Cov(Mk −Mj ,Mn −Mm) = 0
    for 0 ≤ s < t < u < v ≤ τ                          for 0 ≤ j < k ≤ m < n

    EXERCISE 10

    Prove the results on the covariances (see Exercise 6, and Exercise 2.2 in book)

  • EQUIVALENT DEFINITIONS OF MARTINGALES

    Discrete time:

    E (∆Mn|Fn−1) = E (Mn −Mn−1|Fn−1) = 0

    Continuous time:

    E (dM(t)|Ft−) ≡ E (M((t + dt)−)−M(t−)|Ft−) = 0

    Here we think of Ft− as the history up to, but not including, time t.

    Thus Ft− contains the information in {M(u) : 0 ≤ u < t}

    Note that dM(t) is defined as the increment of M(t) over the time interval [t, t + dt).

  • STOCHASTIC INTEGRALS

    Recall for discrete time: The transformation of M by H, Z = H •M, is defined for a martingale M and predictable process H by

    Zn = H1(M1 −M0) + H2(M2 −M1) + . . .+ Hn(Mn −Mn−1) = Σ_{i=1}^{n} Hi ∆Mi

    Continuous time: Need to define a predictable process. The process H = {H(t)} is predictable if, informally, the value of H(t) is known immediately before t. A sufficient condition for predictability of H is that it is adapted to Ft and has left continuous sample paths.

    The stochastic integral of a predictable process H with respect to a martingale M is now

    I(t) = ∫_0^t H(s) dM(s) =def lim_{n→∞} Σ_{i=1}^{n} Hi ∆Mi

    where [0, t] is partitioned into n parts of length t/n and

    Hi = H((i − 1)t/n),  ∆Mi = M(it/n)−M((i − 1)t/n)

    As for transformations in discrete time: the I(t) are (mean zero) martingales.
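    A Riemann-sum sketch of such an integral (the rate, horizon, grid size and the integrand H(s) = s are all illustrative choices): approximating ∫_0^t H(s) dM(s) for the Poisson martingale M(t) = N(t) − λt on a fine grid gives sample averages near 0, consistent with I(t) being a mean zero martingale.

        import random

        # Approximate I(t) = int_0^t H(s) dM(s) with H(s) = s (evaluated at the
        # left endpoint, hence predictable) and M(t) = N(t) - lam*t.
        random.seed(5)
        lam, t, n, n_sim = 2.0, 1.0, 1000, 5_000
        dt = t / n
        mean_I = 0.0
        for _ in range(n_sim):
            I = 0.0
            for i in range(1, n + 1):
                H = (i - 1) * dt                              # H at left endpoint
                dN = 1 if random.random() < lam * dt else 0   # approximate jump
                I += H * (dN - lam * dt)                      # H_i * Delta M_i
            mean_I += I / n_sim
        print(mean_I)                                         # close to 0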

  • THE DOOB-MEYER DECOMPOSITION

    The process X = {X (t) : t ∈ [0, τ ]} is a submartingale if

    E (X (t)|Fs) ≥ X (s) for all t > s

    Every non-decreasing process (e.g. a counting process) must be a submartingale, since E{X(t)− X(s)|Fs} ≥ 0 for all t > s.

    Doob-Meyer decomposition for submartingales:

    X(t) = X ∗(t) + M(t) (uniquely)

    X ∗ is a nondecreasing predictable process, called the compensator of X

    M is a mean zero martingale

    Heuristic proof, suggested by the discrete time Doob decomposition: Define M and X ∗ by

    dX(t) = E(dX(t)|Ft−) + dM(t) = dX ∗(t) + dM(t)

    Then M(t) is a martingale since

    E(dM(t)|Ft−) = E{dX(t)− E(dX(t)|Ft−)|Ft−} = 0

    and X ∗(t) is nondecreasing since, by the submartingale property,

    dX ∗(t) ≡ E(dX(t)|Ft−) = E(X((t + dt)−)− X(t−)|Ft−) ≥ 0

  • POISSON PROCESSES

    N(t) = # events in [0, t]

    Characterizing properties:

    N(t)− N(s) is Poisson-distributed with parameter λ(t − s) for s < t

    N(t) has independent increments, i.e. N(t)− N(s) is independent of Fs for s < t.

    EXERCISE 11

    Show that

    M(t) = N(t)− λt

    is a martingale. Identify the compensator of N(t).

    Hint:

    For t > s we have E{N(t)|Fs} = N(s) + λ(t − s)
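    A simulation sketch of this exercise (rate, horizon and replication count are arbitrary): simulated values of M(t) = N(t) − λt average to about 0, and their variance comes out near λt, anticipating the variation-process results below.

        import random

        # Simulate N(t) for a Poisson process with rate lam via exponential
        # inter-arrival times, and look at M(t) = N(t) - lam*t.
        random.seed(6)
        lam, t, n_sim = 3.0, 2.0, 50_000
        vals = []
        for _ in range(n_sim):
            s, n_events = 0.0, 0
            while True:
                s += random.expovariate(lam)   # next inter-arrival time
                if s > t:
                    break
                n_events += 1
            vals.append(n_events - lam * t)    # M(t)
        mean = sum(vals) / n_sim
        var = sum((v - mean) ** 2 for v in vals) / n_sim
        print(mean, var)                       # ~0 and ~lam*t = 6.0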


  • VARIATION PROCESSES

    So far we have mostly looked at expected values. In applications we will also be interested in the variation of the processes.

    BACK TO DISCRETE TIME ....

    The predictable variation process:

    〈M〉n = Σ_{i=1}^{n} E{(Mi −Mi−1)²|Fi−1} = Σ_{i=1}^{n} Var(∆Mi |Fi−1)

    The optional variation process:

    [M]n = Σ_{i=1}^{n} (Mi −Mi−1)² = Σ_{i=1}^{n} (∆Mi)²
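    A numerical illustration of the two processes (increment distribution, n and replication count are illustrative): for Mn with independent N(0, σ²) increments, 〈M〉n = nσ² is deterministic, while [M]n is random with E[M]n = nσ² = Var(Mn), as in Exercises 12 and 13 below.

        import random

        # Compare <M>_n, E[M]_n and Var(M_n) for a Gaussian random walk.
        random.seed(7)
        sigma, n, n_sim = 1.5, 10, 50_000
        mean_opt = 0.0    # Monte Carlo estimate of E[M]_n
        mean_sq = 0.0     # Monte Carlo estimate of E(M_n^2) = Var(M_n)
        for _ in range(n_sim):
            m, opt = 0.0, 0.0
            for _ in range(n):
                dm = random.gauss(0.0, sigma)   # increment Delta M_i
                m += dm
                opt += dm ** 2                  # optional variation [M]_n
            mean_opt += opt / n_sim
            mean_sq += m ** 2 / n_sim
        print(n * sigma ** 2, mean_opt, mean_sq)   # all approximately 22.5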


  • EXERCISES

    EXERCISE 12: This is Exercise 2.3 in the book, plus a new question (d). (a) was done as our Exercise 3(a).

    Let X1, X2, . . . be independent with E(Xn) = 0 and Var(Xn) = σ² for all n. Let

    Mn = X0 + X1 + . . .+ Xn where X0 = 0.

    (a) Show that Mn is a (mean zero) martingale.

    (b) Compute 〈M〉n
    (c) Compute [M]n
    (d) Compute E〈M〉n, E[M]n and Var(Mn).

    EXERCISE 13: Consider a general discrete time martingale M. Show that:

    (a) Mn² − 〈M〉n is a mean zero martingale - Exercise 2.4 in book
    (b) Mn² − [M]n is a mean zero martingale - See book p. 45
    (c) Use (a) and (b) to prove that

        Var(Mn) = E〈M〉n = E[M]n

        What is the intuitive essence of this result?

  • THE PREDICTABLE VARIATION OF A TRANSFORMATION

    Recall:

    (H •M)n = H1(M1 −M0) + H2(M2 −M1) + . . .+ Hn(Mn −Mn−1)
    ∆(H •M)n = Hn(Mn −Mn−1) = Hn∆Mn

    〈M〉n = Σ_{i=1}^{n} E{(Mi −Mi−1)²|Fi−1} = Σ_{i=1}^{n} Var(∆Mi |Fi−1)

    From this:

    〈H •M〉n = Σ_{i=1}^{n} Var(∆(H •M)i |Fi−1)
             = Σ_{i=1}^{n} Var(Hi∆Mi |Fi−1)
             = Σ_{i=1}^{n} Hi² Var(∆Mi |Fi−1)
             = Σ_{i=1}^{n} Hi² ∆〈M〉i
             = (H² • 〈M〉)n

    so

    〈H •M〉 = H² • 〈M〉

  • THE OPTIONAL VARIATION OF A TRANSFORMATION

    Recall:

    (H •M)n = H1(M1 −M0) + H2(M2 −M1) + . . .+ Hn(Mn −Mn−1)
    ∆(H •M)n = Hn(Mn −Mn−1) = Hn∆Mn

    [M]n = Σ_{i=1}^{n} (Mi −Mi−1)² = Σ_{i=1}^{n} (∆Mi)²

    From this:

    [H •M]n = Σ_{i=1}^{n} (∆(H •M)i)²
            = Σ_{i=1}^{n} (Hi∆Mi)²
            = Σ_{i=1}^{n} Hi² (∆Mi)²
            = Σ_{i=1}^{n} Hi² ∆[M]i
            = (H² • [M])n

    so

    [H •M] = H² • [M]

  • RESULTING FORMULAS

    〈H •M〉 = H² • 〈M〉
    [H •M] = H² • [M]

    with similarities to the elementary formula

    Var(aX ) = a²Var(X )

  • VARIATION PROCESSES IN CONTINUOUS TIME

    The discrete predictable variation process

    〈M〉n = Σ_{i=1}^{n} Var(∆Mi |Fi−1)  where ∆Mi = Mi −Mi−1

    becomes in continuous time

    〈M〉(t) = lim_{n→∞} Σ_{i=1}^{n} Var(∆Mi |F(i−1)t/n)

    where

    [0, t] is partitioned into n parts of length t/n

    ∆Mi = M(it/n)−M((i − 1)t/n)

    Informally:

    d〈M〉(t) = Var(dM(t)|Ft−)

  • VARIATION PROCESSES IN CONTINUOUS TIME

    The discrete optional variation process

    [M]n = Σ_{i=1}^{n} (∆Mi)²  where ∆Mi = Mi −Mi−1

    becomes

    [M](t) = lim_{n→∞} Σ_{i=1}^{n} (∆Mi)²

    where

    [0, t] is partitioned into n parts of length t/n

    ∆Mi = M(it/n)−M((i − 1)t/n)

    For processes of finite variation (as in our applications), we have

    [M](t) = Σ_{s≤t} (M(s)−M(s−))²,  i.e. the sum of squares of all jumps of M(t)

  • FROM DISCRETE TO CONTINUOUS TIME

    IN CONTINUOUS TIME                              IN DISCRETE TIME

    M²(t)− 〈M〉(t) is a mean zero martingale        Mn² − 〈M〉n is a mean zero martingale

    M²(t)− [M](t) is a mean zero martingale         Mn² − [M]n is a mean zero martingale

    Var(M(t)) = E〈M〉(t) = E[M](t)                   Var(Mn) = E〈M〉n = E[M]n

  • CHARACTERIZATION OF 〈M〉 IN CONTINUOUS CASE

    First, M² is a submartingale because

    E(M²(t)|Fs) ≥ (E(M(t)|Fs))² = M(s)²

    by Jensen’s inequality.

    Since M² − 〈M〉 is a martingale, i.e. M² = 〈M〉+ a martingale, it follows by uniqueness of the Doob-Meyer decomposition that

    〈M〉 is the compensator of M²

    But we also have that M² − [M] is a martingale. Why doesn’t it follow from this that also [M] is the compensator of M²?

  • VARIATION PROCESSES FOR STOCHASTIC INTEGRALS

    Recall that I(t) = ∫_0^t H(s) dM(s) is a mean zero martingale whenever M is, and is similar to the transformations H •M.

    Recall for discrete time:

    〈H •M〉 = H² • 〈M〉
    [H •M] = H² • [M]

    Corresponding formulas for continuous time:

    〈∫H dM〉 = ∫H² d〈M〉

    [∫H dM] = ∫H² d[M]

  • EXERCISE 14

    Consider again the Poisson process, where we have shown that

    M(t) = N(t)− λt

    is a martingale.

    Prove now that

    M²(t)− λt

    is a martingale (Exercise 2.9 in book).

    Then prove that 〈M〉(t) = λt.

    What is [M](t)?

    Hint:

    For t > s we have E(N(t)|Fs) = N(s) + λ(t − s)

    For t > s we have E(N²(t)|Fs) = N(s)² + 2N(s)λ(t − s) + λ(t − s) + (λ(t − s))²

  • GENERAL COUNTING PROCESS

    Recall that N(t) is right continuous, is nondecreasing with jumps of size 1, and is adapted to Ft.

    Doob-Meyer decomposition:

    N(t) = Λ(t) + M(t) (5)

    where the compensator Λ is a (unique) nondecreasing, predictable process. (Λ(t) = λt for the Poisson process, i.e. a deterministic function.)

    Assume we can represent Λ(t) = ∫_0^t λ(s)ds.

    Then λ(t) is called the intensity process, while Λ(t) is the cumulative intensity process.

    Classical formulation of the counting process martingale:

    M(t) = N(t)− ∫_0^t λ(s)ds

  • GENERAL COUNTING PROCESS (cont.)

    From

    M(t) = N(t)− ∫_0^t λ(s)ds

    we have

    dM(t) = dN(t)− λ(t)dt

    which implies, since λ(t) is predictable and M is a martingale,

    0 = E(dM(t)|Ft−) = E(dN(t)|Ft−)− λ(t)dt

    so that we have the important connection

    λ(t)dt = E(dN(t)|Ft−)

    Further, since steps are of size 1,

    λ(t)dt = E(dN(t)|Ft−) = P(dN(t) = 1|Ft−) = P(event in [t, t + dt)|Ft−)
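    This last identity can be read off numerically (a sketch for the constant-intensity Poisson case; the bin width and replication count are illustrative): the fraction of replications with an event in a small bin [t0, t0 + dt), divided by dt, approximates the intensity.

        import random

        # Estimate P(event in [t0, t0+dt) | F_{t0-}) / dt for a Poisson
        # process with constant intensity lam; the result should be ~lam.
        random.seed(8)
        lam, t0, dt, n_sim = 2.0, 1.0, 0.01, 200_000
        hits = 0
        for _ in range(n_sim):
            s = 0.0
            while s <= t0:
                s += random.expovariate(lam)   # arrival times
            if s < t0 + dt:                    # first arrival after t0 is in the bin
                hits += 1
        print(hits / (n_sim * dt))             # ~lam = 2.0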


  • VARIATION PROCESSES OF GENERAL COUNTING PROCESS

    [M](t) = N(t), i.e. the process itself,

    since we have seen that for processes of finite variation we have

    [M](t) = Σ_{s≤t} (M(s)−M(s−))²,  i.e. the sum of squared jumps of M(t),

    and the jumps of M are exactly the (size 1) jumps of N.

    Furthermore,

    d〈M〉(t) = Var(dM(t)|Ft−)
             = Var(dN(t)− λ(t)dt|Ft−)
             = Var(dN(t)|Ft−)
             = λ(t)dt(1− λ(t)dt)
             ≈ λ(t)dt

    Here we used that dN(t) conditional on Ft− is a 0–1 variable with expected value λ(t)dt.

    Hence

    〈M〉(t) = ∫_0^t λ(s)ds = Λ(t)

  • STOCHASTIC INTEGRAL OF GENERAL COUNTING PROCESS

    Consider a general counting process N(t) and recall the decomposition

    M(t) = N(t)− ∫_0^t λ(s)ds

    Let further T1 < T2 < · · · be the jump times of N(t).

    Then

    I(t) = ∫_0^t H(s) dM(s)
         = ∫_0^t H(s) dN(s) − ∫_0^t H(s)λ(s)ds
         = Σ_{Tj≤t} H(Tj) − ∫_0^t H(s)λ(s)ds

  • VARIATION PROCESSES OF STOCHASTIC INTEGRALS OF

    COUNTING PROCESS MARTINGALES

    Recall first the general facts:

    〈∫H dM〉 = ∫H² d〈M〉

    [∫H dM] = ∫H² d[M]

    and the special facts for counting processes:

    〈M〉(t) = ∫_0^t λ(s)ds

    [M](t) = N(t)

    From this we get the formulas:

    〈∫H dM〉(t) = ∫_0^t H²(s) λ(s)ds

    [∫H dM](t) = ∫_0^t H²(s) dN(s)

  • EXAMPLE - THE MULTIPLICATIVE INTENSITY MODEL

    The counting process N(t) has intensity process

    λ(t) = α(t)Y (t)

    where α(t) is a deterministic function representing the risk for each unit under study, and Y (t) is the predictable process counting the number of units which are present immediately before t and hence may fail at time t.

    We have

    N(t) = ∫_0^t λ(s)ds + M(t)

    so we can write

    dN(s) = λ(s)ds + dM(s) = α(s)Y (s)ds + dM(s)

    We want to estimate α(t), but it is easier to estimate its integral A(t) = ∫_0^t α(s)ds, which we will do.

    Assume for simplicity that we know that Y (t) > 0 for all t.

    Then, dividing the equation through by Y (s), we get

    (1/Y (s)) dN(s) = α(s)ds + (1/Y (s)) dM(s)

  • THE MULTIPLICATIVE INTENSITY MODEL (cont.)

    (1/Y (s)) dN(s) = α(s)ds + (1/Y (s)) dM(s)

    Integrating, we get that

    ∫_0^t (1/Y (s)) dN(s) = A(t) + ∫_0^t (1/Y (s)) dM(s)

    The rightmost term is a mean zero stochastic integral (since 1/Y (s) is a predictable process), so this suggests the (Nelson-Aalen) estimator

    Â(t) = ∫_0^t (1/Y (s)) dN(s) = Σ_{Tj≤t} 1/Y (Tj)
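    A small computational sketch of the estimator (the data below are invented toy values, with 1 = event and 0 = censored; tied times are avoided for simplicity): Â(t) jumps by 1/Y (Tj) at each observed event time, and the running sum of 1/Y (Tj)² is the variance estimator (6) of the next slide.

        # Nelson-Aalen estimator from right-censored toy data.
        times  = [2.0, 3.0, 4.0, 5.0, 8.0, 9.0]   # observed times, no ties
        events = [1,   1,   0,   1,   0,   1]     # 1 = event, 0 = censored

        def nelson_aalen(times, events):
            data = sorted(zip(times, events))
            at_risk = len(data)                    # Y(s) just before the first time
            a_hat, var_hat, steps = 0.0, 0.0, []
            for t, d in data:
                if d == 1:
                    a_hat += 1.0 / at_risk         # jump 1/Y(T_j)
                    var_hat += 1.0 / at_risk ** 2  # jump 1/Y(T_j)^2
                    steps.append((t, a_hat, var_hat))
                at_risk -= 1                       # one unit leaves the risk set
            return steps

        for t, a, v in nelson_aalen(times, events):
            print(t, round(a, 4), round(v, 4))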


  • PROPERTIES OF THE ESTIMATOR

    We have

    Â(t)− A(t) = ∫_0^t (1/Y (s)) dM(s)

    and hence Â(t) is an unbiased estimator (since the right hand side is a mean zero martingale). Further (see the variation formulas for stochastic integrals of counting process martingales above),

    〈Â− A〉(t) = ∫_0^t (1/Y ²(s)) α(s)Y (s)ds = ∫_0^t (α(s)/Y (s))ds

    [Â− A](t) = ∫_0^t (1/Y ²(s)) dN(s) = Σ_{Tj≤t} 1/Y ²(Tj) (6)

    We also know from our earlier formulas that

    Var(Â(t)) = Var(Â(t)− A(t)) = E[Â− A](t)

    so that (6) is an unbiased estimator of Var(Â(t)).

  • ASYMPTOTIC DISTRIBUTION OF NELSON-AALEN ESTIMATOR

    Suppose that our data are based on observation of a large number n of units. We shall let n tend to infinity in the relation

    √n(Â(t)− A(t)) = ∫_0^t (√n/Y (s)) dM(s)

    and hence we need to look at the limiting behaviour of the mean zero martingale on the right hand side.

    This brings us to the need for an asymptotic theory for martingales.

  • WIENER PROCESS

    A Wiener process (Brownian motion) with variance parameter 1 is a stochastic process W (t) with values in the real numbers satisfying:

    1. W (0) = 0

    2. For any s1 ≤ t1 ≤ s2 ≤ t2 ≤ · · · ≤ sn ≤ tn, the random variables W (t1)−W (s1), . . . ,W (tn)−W (sn) are independent.

    3. For any s < t, the random variable W (t)−W (s) is normally distributed with expected value 0 and variance t − s.

    4. The paths are continuous.
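    A direct simulation from these defining properties (step count and seed are arbitrary; the discrete grid approximates the continuous-time process): summing independent N(0, dt) increments produces an approximate Wiener path, with W (1) approximately N(0, 1).

        import random

        # Simulate a Wiener process path on [0, 1] via independent
        # N(0, dt) increments, starting from W(0) = 0.
        random.seed(9)
        n = 1000
        dt = 1.0 / n
        w = [0.0]
        for _ in range(n):
            w.append(w[-1] + random.gauss(0.0, dt ** 0.5))   # N(0, dt) increment
        print(w[-1])   # one draw of W(1), distributed approximately N(0, 1)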


  • WIENER PROCESS

    [Figure: simulated sample path of a Wiener process]

  • GAUSSIAN MARTINGALES

    Let V (t) be a strictly increasing continuous function with V (0) = 0, and consider the stochastic process

    U(t) = W (V (t))

    (i.e. a time transformation of W (t).)

    The process U(t)

    is a mean zero martingale

    has predictable variation process 〈U〉 (t) = V (t)

    (exercise 2.12 in book).

    U is called a Gaussian martingale.


  • REBOLLEDO’S MARTINGALE CONVERGENCE THEOREM

    Let M̃(n)(t) be a sequence of mean zero martingales for t ∈ [0, τ ].

    Define

    M̃(n)_ε(t) = Σ_{s≤t : |M̃(n)(s)−M̃(n)(s−)| ≥ ε} |M̃(n)(s)− M̃(n)(s−)|

    to be the sum of the jumps larger than ε of the martingale M̃(n)(t).

    Then, under the conditions:

    〈M̃(n)〉(t) → V (t) in probability for all t ∈ [0, τ ] as n→∞, where V is a strictly increasing continuous function with V (0) = 0;

    〈M̃(n)_ε〉(t) → 0 in probability for all t ∈ [0, τ ] and all ε > 0 as n→∞;

    the sequence M̃(n)(t) converges in distribution to the mean zero Gaussian martingale U given by U(t) = W (V (t)).

  • APPLICATIONS TO COUNTING PROCESS MARTINGALES

    As seen earlier, we consider stochastic integrals with predictable variation processes

    〈∫H dM〉(t) = ∫_0^t H²(s) λ(s)ds

    Rebolledo’s conditions can then be written

    ∫_0^t (H(n)(s))² λ(n)(s)ds → V (t) for all t ∈ [0, τ ]

    ∫_0^t (H(n)(s))² I{|H(n)(s)| > ε} λ(n)(s)ds → 0 for all t ∈ [0, τ ] and all ε > 0

    Sufficient conditions for these are, with V (t) = ∫_0^t v(s)ds:

    (H(n)(s))² λ(n)(s) → v(s) > 0 for all s ∈ [0, τ ]

    H(n)(s) → 0 for all s ∈ [0, τ ]

  • NELSON-AALEN EXAMPLE

    Recall that

    √n(Â(t)− A(t)) = ∫_0^t (√n/Y (s)) dM(s)

    so that we have

    H(n)(t) = √n/Y (t)

    Assume that there is a deterministic positive function y(t) such that Y (t)/n → y(t) in probability. Then the two sufficient conditions for Rebolledo’s theorem are satisfied:

    (H(n)(s))² λ(n)(s) = (n/Y ²(s)) · α(s)Y (s) = α(s)/(Y (s)/n) → α(s)/y(s) ≡ v(s)

    H(n)(s) = √n/Y (s) = (1/√n)/(Y (s)/n) → 0

  • NELSON-AALEN EXAMPLE (cont.)

    Conclusion:

    √n(Â(t)− A(t)) = ∫_0^t (√n/Y (s)) dM(s)

    converges in distribution to a mean zero Gaussian martingale with predictable variation process

    V (t) = ∫_0^t v(s)ds = ∫_0^t (α(s)/y(s))ds

    In particular, for each t, Â(t) is asymptotically normally distributed, and

    Var(Â(t)) ≈ (1/n) ∫_0^t (α(s)/(Y (s)/n))ds = ∫_0^t (α(s)/Y (s))ds ( = 〈Â− A〉(t) )

  • THE INNOVATION THEOREM (Section 2.2.7)

    We have defined the intensity of a counting process N with respect to a filtration Ft by

    λF (t)dt = E(dN(t)|Ft−)

    We have earlier tacitly assumed that the underlying filtration is given.

    For example, we have often assumed the so-called self-exciting filtration Nt generated by the history of N(t) alone.

    There may also be other histories which contain more information, and with respect to which we get a different intensity.

    Recall the general result, for any Y ∈ F ,

    E(Y |Fm1 ) = E[E(Y |Fm2 )|Fm1 ] for 1 ≤ m1 < m2

    Using this we have, if Ft ⊆ Gt (i.e. G contains more information than F), that

    E(dN(t)|Ft−) = E[E(dN(t)|Gt−)|Ft−]

    so it follows that

    λF (t) = E(λG(t)|Ft−)

    This is the innovation theorem.

  • INDEPENDENT CENSORING (Section 2.2.8 in book)

    Let N^c_i(t) be the counting process for individual i when censoring is not present (i = 1, 2, . . . , n), with intensity process

    λ^c_i(t) = Y^c_i(t) αi(t)

    These processes are adapted to the history {F^c_t}, say.

    Let censoring be represented by 0–1 processes Y^0_i(t) such that the process N^c_i is observed only when Y^0_i(t) = 1 (e.g. Y^0_i(t) = I(t ≤ Ci) for censoring times Ci).

    The observed process is then Ni(t) = ∫_0^t Y^0_i(s) dN^c_i(s), which jumps when N^c_i(t) jumps, but only if Y^0_i(t) = 1 at the jump.

    Let Gt be the history F^c_t augmented by the information in all the Y^0_i(s) for s ≤ t.

    Independent censoring means by definition that the intensity of the counting process N^c_i(t) is the same for the two histories G and F^c, i.e.

    λ^G_i(t) = λ^{F^c}_i(t) for all t

  • INDEPENDENT CENSORING (cont.)

    What is the intensity of the observed process Ni(t) with respect to its filtration Ft? (Which is smaller than both F^c and G.)

    Note that Yi(t) = Y^0_i(t) Y^c_i(t) is adapted to F (why?), which will be used below:

    λ^F_i(t)dt = E{dNi(t)|Ft−}
              = E{Y^0_i(t) dN^c_i(t)|Ft−}
              = E{Y^0_i(t) E(dN^c_i(t)|Gt−)|Ft−}
              = E{Y^0_i(t) λ^G_i(t)dt|Ft−}
              = E{Y^0_i(t) λ^{F^c}_i(t)dt|Ft−}   (assuming independent censoring)
              = E{Y^0_i(t) Y^c_i(t) αi(t)dt|Ft−}
              = Yi(t) E{αi(t)|Ft−} dt
              = Yi(t) αi(t)dt

    where we assume that αi(t) is adapted to Ft−.

    This shows that the form of the intensity is preserved under independent censoring.

  • PREDICTABLE COVARIATION PROCESSES

    Recall predictable variation process:

    d〈M〉(t) = Var(dM(t)|Ft−)  where dM(t) = M((t + dt)−)−M(t−)

    Similarly we can define the predictable covariation process

    d〈M1,M2〉(t) = Cov(dM1(t), dM2(t)|Ft−)

    From this

    d〈M1 + M2〉(t) = Var(dM1(t) + dM2(t)|Ft−)
                  = Var(dM1(t)|Ft−) + Var(dM2(t)|Ft−) + 2Cov(dM1(t), dM2(t)|Ft−)
                  = d〈M1〉(t) + d〈M2〉(t) + 2d〈M1,M2〉(t)

    Hence

    〈M1 + M2〉 = 〈M1〉+ 〈M2〉+ 2〈M1,M2〉

  • OPTIONAL COVARIATION PROCESSES

    Recall optional variation process:

    [M](t) = lim_{n→∞} Σ_{i=1}^{n} (∆Mi)²

    where ∆Mi = M(it/n)−M((i − 1)t/n)

    Similarly we can define the optional covariation process

    [M1,M2](t) = lim_{n→∞} Σ_{i=1}^{n} ∆M1i ∆M2i

    Also:

    [M1 + M2] = [M1] + [M2] + 2[M1,M2]

  • OPTIONAL COVARIATION PROCESSES (cont.)

    Exercise 2.5 is to show:

    M1M2 − 〈M1,M2〉 is a mean zero martingale

    M1M2 − [M1,M2] is a mean zero martingale

  • TWO COUNTING PROCESSES

    Let N1(t) and N2(t) be counting processes.

    Assume that they cannot jump simultaneously.

    EXERCISE 15

    Do Exercise 2.10 in the book: With M1 and M2 the associated counting process martingales, show that

    〈M1,M2〉 (t) = 0 for all t

    [M1,M2](t) = 0 for all t

    i.e. M1 and M2 are orthogonal martingales.

    Hint: Argue that N1 + N2 is a counting process with

    〈M1 + M2〉 = 〈M1〉+ 〈M2〉

    [M1 + M2] = [M1] + [M2]
