Unit - III RANDOM PROCESSES B. Thilaka Applied Mathematics

Jul 11, 2018

Page 1:

Unit - III

RANDOM PROCESSES

B. Thilaka

Applied Mathematics

Page 2:

Random Processes

A family of random variables {X(t,s) │ t ∈ T, s ∈ S} defined over a given probability space and indexed by the parameter t, where t varies over the index set T, is known as a random process/ chance process/ stochastic process.

X: S×T → R

X(s,t) = x

Notation: X(t)

Page 3:

Random Processes

The values assumed by the random variables X(t) are called the states and the set of all possible values is called the state space of the random process.

Since the random process {X(t)} is a function of both s and t, we have the following

Page 4:

Random Processes

Observations:

1. s and t are fixed : X(t) –real number

2. t is fixed : X(t) –random variable

3. s is fixed : X(t) - function of time called the sample function or realisation of the process

4. s and t are varying : X(t) – random process

Page 5:

Classification

Three types

Type I

State Space, Parameter Space:

Discrete/ Continuous

Page 6:

Classification

T \ S        Discrete                    Continuous

Discrete     Discrete random sequence    Continuous random sequence

Continuous   Discrete random process     Continuous random process

Page 7:

Classification Predictable/ Deterministic :

Future values can be predicted from past values.

eg: X(t)= A cos (wt+θ), where any non-empty combination of A, w, θ may be random variables

Unpredictable/ Non- deterministic:

Future values cannot be predicted from past values.

eg: Brownian motion
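As an illustrative sketch (not from the slides), the predictable process X(t) = A cos(wt + θ) with A a random variable can be simulated; each draw of A fixes an outcome s and yields one sample function, while fixing t gives a random variable. The frequency, phase, and distribution of A below are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
w, theta = 2 * np.pi, 0.0          # fixed angular frequency and phase (illustrative)
t = np.linspace(0.0, 1.0, 101)     # index set T (discretised)

# Each outcome s fixes A(s); the path A(s)*cos(w t + theta) is then a
# deterministic function of t -- a sample function / realisation.
A = rng.normal(loc=0.0, scale=1.0, size=5)
realisations = np.array([a * np.cos(w * t + theta) for a in A])

# Fixing t instead gives a random variable: a column across realisations.
x_at_half = realisations[:, 50]    # X(0.5) for the five outcomes
```

Since future values of each path are determined by A alone, the process is predictable in the sense above.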

Page 8:

Classification

Stationarity

Probability for a random process: For a fixed time t1, X(t1) is a random variable that describes the state of the process at time t1.

The first order distribution function of a random process is defined as

Page 9:

First order Distribution and Density Function

The first order density function of a random process is defined as

FX(x1; t1) = P[X(t1) ≤ x1]

fX(x1; t1) = ∂FX(x1; t1)/∂x1

Page 10:

First order Distribution Function

Statistical Average

The statistical average of the random process {X(t)} is defined as

provided the quantity on the RHS exists.

μ(t1) = E[X(t1)] = ∫_{−∞}^{∞} x1 fX(x1; t1) dx1
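The statistical average can be approximated by averaging across an ensemble of realisations at each fixed t. A minimal sketch, assuming a hypothetical process X(t) = A cos(wt) with E[A] = 2, so that E[X(t)] = 2 cos(wt):

```python
import numpy as np

rng = np.random.default_rng(1)
w = 2 * np.pi
t = np.linspace(0.0, 1.0, 11)

# Ensemble of realisations of X(t) = A cos(wt) with E[A] = 2 (assumed model)
A = rng.normal(loc=2.0, scale=0.5, size=100_000)
ensemble = A[:, None] * np.cos(w * t)[None, :]

# Monte Carlo estimate of the statistical average mu(t) = E[X(t)] = 2 cos(wt)
mu_hat = ensemble.mean(axis=0)
mu_true = 2.0 * np.cos(w * t)
```

Note the average is taken over outcomes s at each fixed t, not over time.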

Page 11:

First order stationary process

The random process {X(t)} is said to be a first order stationary process / stationary to order one if the first order density/ distribution function is invariant with respect to a shift in the time origin

(or)

fX(x1; t1) = fX(x1; t1 + ∆), ∀ x1, t1, ∆

FX(x1; t1) = FX(x1; t1 + ∆), ∀ x1, t1, ∆

Page 12:

First order stationary process

Result:

The statistical mean of a first order stationary random process is a constant with respect to time.

Proof:

Let the random process {X(t)} be a first order stationary process .

Then its first order density function satisfies the property that

Page 13:

First order stationary process

fX(x1; t1) = fX(x1; t1 + ∆), ∀ x1, t1, ∆

The mean of the random process {X(t)} at time t1 is defined as

E[X(t1)] = ∫_{−∞}^{∞} x1 fX(x1; t1) dx1

Page 14:

First order stationary process

Consider

E[X(t2)] = ∫_{−∞}^{∞} x2 fX(x2; t2) dx2

Let t2 = t1 + ∆. Then

E[X(t2)] = ∫_{−∞}^{∞} x2 fX(x2; t1 + ∆) dx2 = ∫_{−∞}^{∞} x2 fX(x2; t1) dx2 = E[X(t1)]

Page 15:

First order stationary process

Since E[X(t1 + ∆)] = E[X(t1)] for every ∆, E[X(t)] is a constant with respect to time.

Page 16:

Second order Distribution function

The second order joint distribution function of a random process {X(t)} is defined as

The second order joint density function of the random process {X(t)} is defined as

FX(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2]

fX(x1, x2; t1, t2) = ∂²FX(x1, x2; t1, t2)/(∂x1 ∂x2)

Page 17:

Second order stationary process

A random process {X(t)} is said to be a second order stationary process/ stationary to order two if its second order distribution/ density function is invariant with respect to a shift in the time origin.

In other words,

(and / or)

fX(x1, x2; t1, t2) = fX(x1, x2; t1 + ∆, t2 + ∆), ∀ x1, x2, t1, t2, ∆

FX(x1, x2; t1, t2) = FX(x1, x2; t1 + ∆, t2 + ∆), ∀ x1, x2, t1, t2, ∆

Page 18:

Second order Processes

Auto-correlation function:

The auto-correlation function of a random process {X(t)} is defined as

provided the quantity on the RHS exists.

RXX(t1, t2) = E[X(t1) X(t2)]

Page 19:

Second order Processes

Significance of auto-correlation function

1. It provides a measure of similarity between two observations of the random process {X(t)} at different points of time t1 and t2.

2. It also defines how a signal is similar to a time-shifted version of itself

Page 20:

Second order Processes

Auto-covariance function:

The auto-covariance function of a random process {X(t)} is defined as

CXX(t1, t2) = E{[X(t1) − E[X(t1)]] [X(t2) − E[X(t2)]]}

= E[X(t1)X(t2) − X(t1)E[X(t2)] − X(t2)E[X(t1)] + E[X(t1)]E[X(t2)]]

= E[X(t1)X(t2)] − E[X(t1)]E[X(t2)] − E[X(t2)]E[X(t1)] + E[X(t1)]E[X(t2)]

i.e. CXX(t1, t2) = RXX(t1, t2) − E[X(t1)] E[X(t2)]

Page 21:

Wide-sense Stationary Process

A random process {X(t)} is said to be WSS/ weakly stationary/ covariance stationary process if it satisfies the following conditions:

1. E[X(t)] is a constant with respect to time.

2. RXX(t1,t2) is a function of the length of the

time difference.

i.e. RXX(t1,t2)= RXX(t1-t2)

3. E[X2(t)]<∞.
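The three conditions can be checked empirically on the classic example X(t) = cos(wt + Θ) with Θ uniform on (0, 2π), which is known to be WSS with E[X(t)] = 0 and RXX(τ) = cos(wτ)/2. A minimal sketch (the frequency and sample sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
w = 2 * np.pi
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)

def X(t):
    """Ensemble of values X(t) = cos(w t + Theta), one per outcome."""
    return np.cos(w * t + theta)

# 1. mean is constant in t (here 0)
means = [X(t).mean() for t in (0.0, 0.3, 0.7)]

# 2. autocorrelation depends only on the lag: R(tau) = cos(w tau)/2
def R_hat(t1, t2):
    return (X(t1) * X(t2)).mean()

r_a = R_hat(0.1, 0.4)   # tau = 0.3
r_b = R_hat(0.5, 0.8)   # tau = 0.3 again, different absolute times

# 3. second moment finite: E[X^2(t)] = R(0) = 1/2
second_moment = R_hat(0.2, 0.2)
```

Equal lags give (approximately) equal correlations, as condition 2 requires.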

Page 22:

Wide-sense Stationary Process

Remark:

Since

CXX(t1,t2)= CXX(t1-t2), the auto-covariance function CXX(t1,t2) is a function of the length of the time difference.

Hence, a WSS process is also called a covariance stationary process.

CXX(t1, t2) = RXX(t1, t2) − E[X(t1)] E[X(t2)]

Page 23:

Wide-sense Stationary Process

Alternately, a random process (second order) is said to be WSS/ weakly stationary process if it satisfies

Remark: A second order stationary process is a WSS process, but the converse need not be true.

E[X(t)] = μ, a constant

RXX(t, t + τ) = RXX(τ)

Page 24:

nth order distribution/ density function

The nth order joint distribution function of a random process {X(t)} is defined as

The nth order joint density function of a random process {X(t)} is defined as

FX(x1, x2, .., xn; t1, t2, .., tn) = P[X(t1) ≤ x1, X(t2) ≤ x2, .., X(tn) ≤ xn]

fX(x1, x2, .., xn; t1, t2, .., tn) = ∂ⁿFX(x1, x2, .., xn; t1, t2, .., tn)/(∂x1 ∂x2 ... ∂xn)

Page 25:

nth order stationary process

A random process {X(t)} is said to be nth order

stationary/ stationary to order n if the nth order density/ distribution function is invariant with

respect to a shift in the time origin

(i.e.)

(and / or)

fX(x1, .., xn; t1, .., tn) = fX(x1, .., xn; t1 + ∆, .., tn + ∆), ∀ x1, .., xn, t1, .., tn, ∆

FX(x1, .., xn; t1, .., tn) = FX(x1, .., xn; t1 + ∆, .., tn + ∆), ∀ x1, .., xn, t1, .., tn, ∆

Page 26:

Strictly stationary Process

A random process is said to be strictly stationary (SSS)/ stationary in the strict sense if it is stationary to all orders n.

Aside: For an SSS process, the nth order CDF/PDF is invariant with respect to a shift in the time origin.

Mean - constant

Auto-correlation, Auto-covariance - functions of the lengths of the time intervals.

Page 27:

Strictly stationary Process

Remark :

If a random process is stationary to order n, then it is stationary to all orders k≤n.

WHY ?

Page 28:

Evolutionary Process

Random processes which are not stationary to any order are called non-stationary/ evolutionary processes

Page 29:

Further Properties Cross correlation function:

The cross correlation function of two random processes {X(t)} and {Y(t)} is defined as RXY(t1,t2)=E[X(t1)Y(t2)]

Cross covariance function:

The cross covariance function of two random processes {X(t)} and {Y(t)} is defined as

CXY(t1,t2)=E{[X(t1)-E(X(t1))][Y(t2)-E(Y(t2))]}

=RXY(t1,t2)-E[X(t1)]E[Y(t2)]

Page 30:

Further Properties

Two random processes {X(t)} and {Y(t)} are said to be jointly WSS if

(i) both {X(t)} and {Y(t)} are each WSS

(ii) the cross-correlation function RXY(t1,t2)=E[X(t1)Y(t2)] is exclusively a function of the length of the time interval (t2-t1)

Page 31:

Independent Random Process

• A random process {X(t)} is said to be an independent random process if its nth order joint distribution function satisfies the property that

A similar condition holds for joint p.m.f./ p.d.f.

FX(x1, x2, .., xn; t1, t2, .., tn) = FX(x1; t1) FX(x2; t2) ..... FX(xn; tn), ∀ x1, .., xn, t1, .., tn

Page 32:

Process with independent increments

A random process {X(t)} is defined to be a process with independent increments if for all 0<t1< t2<… <tn<t, the random variables X(t2)- X(t1), X(t3)- X(t2),.., X(tn)- X(tn-1) are independent.

Page 33:

Time averages of a random process

Prelude:

The time average of a quantity f(t) (t- time) is defined as

Time average of a random process:

The time average of a random process {X(t)} is defined as

A[f(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} f(t) dt

Page 34:

Time averages of a random process

Time auto-correlation function :

The time auto-correlation function of a random process {X(t)} is defined as

A[X(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt

A[X(t) X(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t + τ) dt, τ = t2 − t1

Page 35:

Ergodic Process

A random process is said to be ergodic if its time averages are all equal to the corresponding statistical averages.

Examples ???
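One standard example (assumed here, not stated on the slide) is the random-phase cosine X(t) = cos(wt + Θ), Θ uniform on (0, 2π): the time averages of a single long realisation match the ensemble averages E[X(t)] = 0 and RXX(τ) = cos(wτ)/2. A minimal numerical sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
w = 2 * np.pi
theta = rng.uniform(0.0, 2 * np.pi)      # one fixed outcome s
t = np.linspace(0.0, 1000.0, 1_000_001)  # long observation window
x = np.cos(w * t + theta)                # a single realisation

# Time average over [0, T] approximates the ensemble mean E[X(t)] = 0
time_mean = x.mean()

# Time auto-correlation at lag tau approximates R(tau) = cos(w tau)/2
dt = t[1] - t[0]
lag = int(round(0.3 / dt))               # tau = 0.3
time_corr = (x[:-lag] * x[lag:]).mean()
```

Both quantities are computed from one sample function only, yet recover the statistical averages.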

Page 36:

Random Process

Correlation coefficient of a random process :

The correlation coefficient (normalized auto-covariance function) of a random process {X(t)} is defined as the correlation coefficient between its random variables X(t1) and X(t2) for arbitrary t1 and t2 .

In other words,

Note: Var[X(t)] = CXX(t, t)

ρXX(t1, t2) = CXX(t1, t2) / √(Var[X(t1)] Var[X(t2)])

Page 37:

Markov Processes

A random process {X(t)} is called a Markov process if for all t0<t1< t2<… <tn<t, the conditional distribution of X(t) given the values of X(t0), X(t1), X(t2),.., X(tn) depends only on X(tn).

P[X(t) ≤ x │ X(t0) = x0, X(t1) = x1, ..., X(tn) = xn] = P[X(t) ≤ x │ X(tn) = xn]

P[a < X(t) ≤ b │ X(t0) = x0, X(t1) = x1, ..., X(tn) = xn] = P[a < X(t) ≤ b │ X(tn) = xn]

Page 38:

Markov Processes

In other words, a random process {X(t)} is called a Markov process if the future values of the process depend only on the present and are independent of the past.

Examples: Binomial process, Poisson process, Random telegraph process

Page 39:

Markov Processes

Note:

If the state space of a Markov process is discrete, then the Markov process is called a Markov chain.
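A Markov chain can be sketched with a hypothetical two-state example (the transition matrix below is illustrative, not from the slides); the next state is drawn from the current state alone, and the long-run fraction of time in each state approaches the stationary distribution π solving π = πP.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical two-state Markov chain (states 0 and 1) with transition
# matrix P[i, j] = P[X_{n+1} = j | X_n = i]; the numbers are illustrative.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# The next state uses the current state only -- the Markov property.
state, visits = 0, np.zeros(2)
for _ in range(200_000):
    visits[state] += 1
    state = 1 if rng.random() < P[state, 1] else 0

# For this P, pi = pi P gives pi = (0.8, 0.2)
long_run = visits / visits.sum()
```

The whole past of the path is irrelevant at each step; only `state` enters the draw.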

Page 40:

Markov Processes

Time \ State Space   Discrete                       Continuous

Discrete             Discrete-Time Markov Chain     Discrete-Time Markov Process

Continuous           Continuous-Time Markov Chain   Continuous-Time Markov Process

Page 41:

Markov Processes

Page 42:

Markov Processes

• If in a Markov chain, the conditional distribution is invariant with respect to a shift in the time origin, the Markov chain is said to be time homogeneous.

• In a homogeneous Markov chain, the past history of

the process is completely summarized in the current state.

Hence, the distribution of time that the process

spends in the current state must be memoryless.

Page 43:

Counting Process

A random process {X(t)} is called a counting process if X(t) represents the total number of “events” that have occurred in the interval [0,t).

1. X(t) ≥ 0; X(0) = 0 (the process begins at time t = 0)

2. X(t) is integer valued

3. If s ≤ t, then X(s) ≤ X(t)

4. X(t) - X(s) : number of events in [s,t]

Page 44:

Counting Process

Alternately, the random process {N(t)} is called a counting process if it assumes only integer values and it is an increasing function of time.

Page 45:

Types of Random Processes

Bernoulli process : {Xn:n≥1}, Xn’s are i.i.d. Bernoulli variates with parameter p.

Consider a sequence of independent and identical Bernoulli trials.

Let the random variable Yi denote the outcome of the ith trial so that the event {Yi=1} indicates success with probability p on the ith trial for all ‘i’ and the event {Yi=0} indicates failure with probability (1-p)=q on the ith trial for all ‘i’.

Page 46:

Bernoulli Process

Hence the Yi‘s may be considered as independently and identically distributed random variables.

The random process {Yn} is called a Bernoulli process with P[Yn=1]=p and P[Yn=0]=1-p, for all n.

Discussion: Classify Bernoulli process ( 3 types)

Page 47:

Binomial Process

Binomial process :

Consider a Bernoulli process {Yi} where the Yi's are independently and identically distributed Bernoulli random variables with parameter p. Form another stochastic process {Sn} with Sn = Y1 + Y2 + .. + Yn.

The random variable Sn follows a Binomial distribution with parameters n and p.

Page 48:

Binomial Process

The random process {Sn:n≥1} is called a Binomial process.

Discussion: Preliminary classification of Binomial Process

Page 49:

Binomial Process First order pmf:

The first order p.m.f. of the random process {Sn} is given by

P[Sn = k] = nCk p^k (1 − p)^(n−k), k = 0, 1, ..., n

E[Sn] = np

Var[Sn] = np(1 − p) = npq

Can you classify Binomial process even further??
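The moments E[Sn] = np and Var[Sn] = np(1−p) can be confirmed by simulating many runs of the underlying Bernoulli process (the parameter values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 50, 0.3

# S_n = Y_1 + ... + Y_n for an i.i.d. Bernoulli(p) process {Y_i};
# each row is one independent run of the process.
Y = rng.random((100_000, n)) < p
S_n = Y.sum(axis=1)

mean_hat, var_hat = S_n.mean(), S_n.var()
# Theory: E[S_n] = np = 15, Var[S_n] = np(1-p) = 10.5
```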

Page 50:

Binomial Process

Consider Sn+1 = Y1+Y2+…+Yn+1

= Y1+Y2+…+Yn+Yn+1

= Sn+Yn+1

Hence

P[Sn+1= k│ Sn= k] = P[Yn+1= 0]=1-p

P[Sn+1= k+1│ Sn= k] = P[Yn+1= 1]=p

Can you now classify Binomial process even further?

Page 51:

Binomial Process

The Binomial process is also called the Binomial counting process.

Probability generating function:

G_{Sn}(z) = (q + pz)^n

Second order joint p.m.f. (for m > n):

P[Sn = k, Sm = l] = nCk (m−n)C(l−k) p^l (1 − p)^(m−l), k = 0, 1, .., n, l = k, .., k + (m − n)

Page 52:

Binomial Process

Observation:

The total number of trials T from the beginning of the process until and including the first success is a geometric random variable.

The number of trials after (i-1)th success upto and including the ith success will have the same distribution as T

Can you generalise?

Page 53:

Binomial Process

If Tn is the number of trials upto and including the nth success, then Tn is the n-fold convolution of T with itself and follows a negative binomial distribution with E[Tn] = n/p and Var[Tn] = n(1−p)/p².
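The waiting time to the nth success is a sum of n i.i.d. geometric variates, so its negative binomial moments E[Tn] = n/p and Var[Tn] = n(1−p)/p² can be checked by simulation (p and n below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
p, n = 0.25, 4

# Number of Bernoulli(p) trials up to and including the n-th success,
# simulated as a sum of n i.i.d. geometric waiting times.
T_n = rng.geometric(p, size=(100_000, n)).sum(axis=1)

mean_hat, var_hat = T_n.mean(), T_n.var()
# Theory: E[T_n] = n/p = 16, Var[T_n] = n(1-p)/p^2 = 48
```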

Page 54:

Binomial Process

• If in the Binomial process {Sn}, n is large and p is small such that np is finite, then the Binomial process approaches a Poisson process with parameter λ=np.

Page 55:

Poisson Process

Let E be any random event and {N(t)} denote the number of occurrences of the event E in an interval of length t.

Let pn(t)= P[N(t)=n].

The counting process {N(t)} is called a Poisson process if it satisfies the following postulates:

Page 56:

Poisson Process

1. Independence : {N(t)} is independent of the number of occurrences of E in an interval prior to (0,t) i.e. future changes in N{t} are independent of the past changes.

2. Homogeneity in time: pn(t) depends only on the length ‘t’ of the time interval and is independent of where the interval is situated i.e. pn(t) gives the probability of the number of occurrences of E in the interval (t0, t0 +t) for all t0

3. Regularity: In an interval of infinitesimal length h, the probability of exactly one event occurring is λh+o(h), the probability of zero events occurring is 1-λh+o(h), the probability of more than one event occurring is o(h).

Page 57:

Poisson Process

Page 58:

Poisson Process

Relax the postulates:

3. Regularity- Compound Poisson Process (multiple occurrences at any instant)

2. Homogeneity in time: λ is a function of time (λ(t))- Non-homogeneous

1. Independence: future depends on present- Markov process

Page 59:

Poisson Process

Result : The Poisson process defined above follows Poisson distribution with mean λt. i.e.

Proof:

Let {N(t)} be a Poisson process with parameter λ. We now consider

p_n(t) = P[N(t) = n] = e^(−λt) (λt)^n / n!, n = 0, 1, 2, .....

Page 60:

Poisson Processes- pmf

p_n(t + ∆t) = P[N(t + ∆t) = n]

= Σ_{k=0}^{n} P[N(t + ∆t) = n │ N(t) = k] P[N(t) = k]    (Theorem on total probability)

= Σ_{k=0}^{n} P[N(t + ∆t) − N(t) = n − k] P[N(t) = k]

= Σ_{k=0}^{n} P[N(∆t) = n − k] P[N(t) = k]    (homogeneity)

Page 61:

Poisson Processes- pmf

Assuming ∆t to be of infinitesimal length, we have

p_n(t + ∆t) = Σ_{k=0}^{n} P[N(∆t) = n − k] P[N(t) = k]

= P[N(∆t) = 0] p_n(t) + P[N(∆t) = 1] p_{n−1}(t) + Σ_{k=2}^{n} P[N(∆t) = k] p_{n−k}(t)

Page 62:

Poisson Processes- pmf

(Regularity)

p_n(t + ∆t) = [1 − λ∆t + o(∆t)] p_n(t) + [λ∆t + o(∆t)] p_{n−1}(t) + o(∆t)

i.e. p_n(t + ∆t) − p_n(t) = −λ∆t p_n(t) + λ∆t p_{n−1}(t) + o(∆t)

Dividing throughout by ∆t,

[p_n(t + ∆t) − p_n(t)]/∆t = −λ p_n(t) + λ p_{n−1}(t) + o(∆t)/∆t

Page 63:

Poisson Processes- pmf

Taking limits as ∆t →0 on both sides of the above equation,

dp_n(t)/dt = −λ p_n(t) + λ p_{n−1}(t), n ≥ 1    --------(1)

At n=0, we have

p_0(t + ∆t) = P[N(t + ∆t) = 0]

= P[N(t + ∆t) = 0 │ N(t) = 0] P[N(t) = 0]

Page 64:

Poisson Processes- pmf

= P[N(t + ∆t) − N(t) = 0] P[N(t) = 0]

= P[N(∆t) = 0] P[N(t) = 0]    (homogeneity)

= [1 − λ∆t + o(∆t)] p_0(t)    (regularity)

so that

[p_0(t + ∆t) − p_0(t)]/∆t = −λ p_0(t) + o(∆t)/∆t

Taking limits as ∆t → 0 on both sides of the above equation,

Page 65:

Poisson Processes- pmf

dp_0(t)/dt = −λ p_0(t)    -----(2)

We now solve the above system of differential difference equations (1) and (2) subject to the initial conditions

p_0(0) = 1, p_n(0) = 0, n ≥ 1    -------(3)

Consider equation (2) namely,

Page 66:

Poisson Processes- pmf

dp_0(t)/dt = −λ p_0(t), subject to p_0(0) = 1

dp_0(t)/p_0(t) = −λ dt

log p_0(t) = −λt + c

p_0(t) = e^(−λt + c) = K e^(−λt)

At t=0, p_0(0) = 1 gives K = 1. Hence

p_0(t) = e^(−λt)    -----(4a)

Page 67:

Poisson Processes- pmf

Substituting n=1 in equation (1), we have

dp_1(t)/dt = −λ p_1(t) + λ p_0(t), subject to p_1(0) = 0

i.e. dp_1(t)/dt + λ p_1(t) = λ e^(−λt), p_1(0) = 0

On solving the above equation (integrating factor e^(∫λ dt) = e^(λt)), we have

p_1(t) e^(λt) = ∫ λ e^(−λt) e^(λt) dt + c = λt + c

Page 68:

Poisson Processes- pmf

p_1(t) = (λt) e^(−λt) + c e^(−λt)

At t=0, 0 = p_1(0) = c, i.e. c = 0. Hence

p_1(t) = (λt) e^(−λt)    -----(4b)

Substituting n=2 in equation (1), we have

dp_2(t)/dt = −λ p_2(t) + λ p_1(t)

Page 69:

Poisson Processes- pmf

i.e. dp_2(t)/dt + λ p_2(t) = λ (λt) e^(−λt), p_2(0) = 0

Using the integrating factor e^(λt), we have

p_2(t) e^(λt) = ∫ λ (λt) e^(−λt) e^(λt) dt + c = λ²t²/2 + c

Page 70:

Poisson Processes- pmf

p_2(t) = (λ²t²/2) e^(−λt) + c e^(−λt)

At t=0, 0 = p_2(0) = c, i.e. c = 0. Hence

p_2(t) = e^(−λt) (λt)²/2!    -----(4c)

Proceeding in this manner, we have

Page 71:

Poisson Processes- pmf

p_n(t) = P[N(t) = n] = e^(−λt) (λt)^n / n!, n = 0, 1, 2, .....

Hence, the number of events {N(t)} in an interval of length t follows a Poisson distribution with mean λt.

Can you classify Poisson process??
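The derived p.m.f. can be checked by simulating the process directly: arrivals are generated from exponential inter-arrival times (a property established later in these slides) and the empirical distribution of N(t) is compared with e^(−λt)(λt)^n/n!. The values of λ and t are illustrative.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(7)
lam, t = 2.0, 3.0

# Simulate N(t) by accumulating exponential inter-arrival times (mean 1/lam)
# and counting how many arrivals land in [0, t].
def count_events():
    total, n = 0.0, 0
    while True:
        total += rng.exponential(1.0 / lam)
        if total > t:
            return n
        n += 1

counts = np.array([count_events() for _ in range(20_000)])

# Compare the empirical frequency of {N(t) = 4} with p_4(t)
p4_hat = (counts == 4).mean()
p4 = exp(-lam * t) * (lam * t) ** 4 / factorial(4)
```

The empirical mean of the counts should also be close to λt, the Poisson mean derived above.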

Page 72:

Poisson Processes

• Poisson process is an evolutionary process

• E[N(t)] = λt and Var[N(t)] = λt. Further,

lim_{t→∞} E[N(t)]/t = λ = lim_{t→∞} Var[N(t)]/t

Hence λ is called the arrival rate of the process.

Page 73:

Poisson Process

Characterization: If {N(t)} is a Poisson process with mean λt, then the inter-occurrence/ inter-arrival time follows an exponential distribution with mean 1/λ.

Proof: Let {N(t)} be a Poisson process with parameter λ. Then

p_n(t) = P[N(t) = n] = e^(−λt) (λt)^n / n!, n = 0, 1, 2, .....

Page 74:

Poisson Process

If W denotes the time between two successive arrivals/ occurrences of the event, the CDF of W is

F_W(w) = P[W ≤ w] = 1 − P[W > w] = 1 − P[N(w) = 0] = 1 − e^(−λw)

The above is the CDF of an exponential variate with mean 1/λ.
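The exponential inter-arrival law can be seen numerically by building the process from the regularity postulate itself: on a fine grid of slots of width h, let an event occur in each slot with probability λh, then measure the gaps W between successive events. The grid resolution and λ below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
lam = 2.0

# Bernoulli approximation of the Poisson process: in each slot of width h
# an event occurs with probability lam*h (the regularity postulate).
h, n_slots = 1e-3, 5_000_000
events = np.flatnonzero(rng.random(n_slots) < lam * h) * h
W = np.diff(events)          # inter-arrival times

# P[W <= w] should match 1 - exp(-lam*w); check at w = 0.5
w = 0.5
cdf_hat = (W <= w).mean()
cdf_true = 1.0 - np.exp(-lam * w)
```

The sample mean of W should likewise be near 1/λ, the exponential mean stated above.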

Page 75:

Poisson Process

If the inter-arrival times are i.i.d. (not necessarily exponential), then we have a renewal process.


Poisson Process- Properties

• Superposition

• Decomposition

• Markov Process

• Difference of two Poisson processes is not Poisson.


Poisson Process- Properties

Result:

The superposition of n independent Poisson processes with means λ1t, λ2t, …., λnt, respectively is a Poisson process with mean λ1t+ λ2t+ …. +λnt.

Proof:

Consider n independent Poisson processes N1(t), N2(t), …, Nn(t), with respective means λ1t, λ2t, ….. λnt.


Poisson Process- Properties

The moment generating function (m.g.f.) of each Nk(t) is given by

M_Nk(t)(θ) = E[e^(θ Nk(t))] = e^(λk t (e^θ − 1))

By the property of moment generating functions, the m.g.f. of the sum

N(t) = Σ(k=1 to n) Nk(t)

is given by


Poisson Process- Properties

M_N(t)(θ) = Π(k=1 to n) M_Nk(t)(θ) = Π(k=1 to n) e^(λk t (e^θ − 1)) = e^((λ1 + λ2 + … + λn) t (e^θ − 1))

which is the moment generating function of a Poisson distribution with mean λ1t + λ2t + …. + λnt.


Poisson Process- Properties

Hence by uniqueness property, the sum of n independent Poisson processes with means λ1t, λ2t, …., λnt, respectively is a Poisson process with mean λt=λ1t+ λ2t+ …. +λnt.
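The superposition result can be illustrated numerically (a sketch; the rates 1, 2, 3 and the trial count are arbitrary test values, and each Poisson count is simulated by accumulating exponential inter-arrival times):

```python
import random

random.seed(0)
rates = [1.0, 2.0, 3.0]   # lambda_1, lambda_2, lambda_3 (illustrative values)
t, trials = 1.0, 100000

def poisson_count(lam, t):
    # Simulate N(t) by summing exponential inter-arrival times until time t.
    count, clock = 0, 0.0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            return count
        count += 1

# The sum of the three independent counts should behave like a single
# Poisson count with rate 1 + 2 + 3 = 6 (mean = variance = 6t).
totals = [sum(poisson_count(lam, t) for lam in rates) for _ in range(trials)]
mean = sum(totals) / trials
var = sum((x - mean) ** 2 for x in totals) / trials
print(mean, var)  # both ≈ 6.0
```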


Poisson Process- Properties

Decomposition of Poisson process:

A Poisson process N(t) with mean arrival rate λ can be decomposed into ‘n’ mutually independent Poisson processes with arrival rates p1 λ, p2 λ,.., pn λ such that

p1 + p2 +… + pn =1

Proof: HW
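The decomposition (thinning) property can also be checked by simulation (a sketch; λ = 5, p = 0.6 and the trial count are arbitrary test values): keeping each event independently with probability p yields a stream with mean λp t.

```python
import random

random.seed(42)
lam, t, trials = 5.0, 1.0, 50000
p = 0.6  # each event is kept with probability p

kept = []
for _ in range(trials):
    # Count all events of the original Poisson process in (0, t].
    n, clock = 0, 0.0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            break
        n += 1
    # Independently mark each event; the marked stream is the thinned process.
    kept.append(sum(random.random() < p for _ in range(n)))

mean_kept = sum(kept) / trials
print(mean_kept)  # ≈ lam * p * t = 3.0
```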


Poisson Process

Note:

If N(t) is a Poisson process with mean arrival rate λ, then the time between

‘k’ successive arrivals/ occurrences follows a

(k−1)-order Erlang distribution (the sum of the k−1 i.i.d. exponential inter-arrival times).

WHY??


Poisson Process

Second order joint p.m.f. :

Given a Poisson process with mean arrival rate λ, the second order joint p.m.f. is obtained as follows:

P[N(t1)=n1, N(t2)=n2] = P[N(t1)=n1] P[N(t2)=n2 │ N(t1)=n1], t1 < t2, n1 ≤ n2

= P[N(t1)=n1] P[N(t2)−N(t1) = n2−n1] (independent increments)

= P[N(t1)=n1] P[N(t2−t1) = n2−n1] (homogeneity)

= [e^(-λt1) (λt1)^(n1)/n1!] [e^(-λ(t2−t1)) (λ(t2−t1))^(n2−n1)/(n2−n1)!], t1 < t2, n1 ≤ n2


Poisson Process

Hence

P[N(t1)=n1, N(t2)=n2] = e^(-λt2) λ^(n2) t1^(n1) (t2−t1)^(n2−n1) / (n1! (n2−n1)!), t1 < t2, n2 ≥ n1

= 0, elsewhere


Poisson Process

Similarly the third order joint p.m.f. of the Poisson process with mean arrival rate λ is given by


Poisson Process

P[N(t1)=n1, N(t2)=n2, N(t3)=n3]

= e^(-λt3) λ^(n3) t1^(n1) (t2−t1)^(n2−n1) (t3−t2)^(n3−n2) / (n1! (n2−n1)! (n3−n2)!), t1 < t2 < t3, n1 ≤ n2 ≤ n3

= 0, elsewhere


Poisson Process

Auto-correlation function :

If N(t) is a Poisson process with mean arrival rate λ, then

• E[N(t)] = λt

• Var[N(t)] = λt

• E[N²(t)] = Var[N(t)] + {E[N(t)]}² = λt + λ²t²


Poisson Process

The auto-correlation function of N(t) is now obtained as follows:

RNN(t1,t2) = E[N(t1)N(t2)] (by definition)

= E[N(t1){N(t2) − N(t1) + N(t1)}], t1 < t2

= E[N(t1){N(t2) − N(t1)}] + E[N²(t1)], t1 < t2


Poisson Process

= E[N(t1)] E[N(t2) − N(t1)] + E[N²(t1)], t1 < t2 (independence of increments)

= λt1 . λ(t2 − t1) + λt1 + λ²t1²

= λ²t1t2 + λt1, t1 < t2

RNN(t1,t2) = λ²t1t2 + λ min(t1,t2), for all t1, t2


Poisson Process

Auto-covariance function of a Poisson process :

The auto-covariance function of a Poisson process N(t) with mean arrival rate λ is

(by definition)

)]([)]([),(),( 212121 tNEtNEttRttC NNNN

212121

2 ,min tttttt

2121 ,min),( ttttCNN


Poisson Process

The correlation function of a Poisson process N(t) with mean arrival rate λ is given by

ρNN(t1,t2) = CNN(t1,t2) / √(Var[N(t1)] Var[N(t2)])

= λ min(t1,t2) / √(λt1 . λt2)

= min(t1,t2) / √(t1 t2)


Poisson Process

Hence

ρNN(t1,t2) = √(t1/t2), t1 < t2


Random Processes

Process with stationary increments:

A random process {X(t)} is said to be a process with stationary increments if the distribution of the increments X(t+h)−X(t) depends only on the length 'h' of the interval and not on the end points.


Random Processes

Wiener process/ Wiener-Einstein Process/ Brownian Motion Process:

A stochastic process {X(t)} is said to be a Wiener process with drift µ and variance σ², if

(i) X(t) has independent increments

(ii) every increment X(t)−X(s) is normally distributed with mean µ(t−s) and variance σ²(t−s)


Poisson Process

1. For a Poisson process with parameter λ and for s < t, show that

P[N(s)=k │ N(t)=n] = nCk (s/t)^k (1 − s/t)^(n−k), k = 0, 1, 2, …, n

Solution : Consider

P[N(s)=k │ N(t)=n] = P[N(s)=k, N(t)=n] / P[N(t)=n] (defn)


Poisson Process

= P[N(s)=k, N(t)−N(s)=n−k] / P[N(t)=n]

= P[N(s)=k] P[N(t)−N(s)=n−k] / P[N(t)=n] (independent increments)

= [e^(-λs)(λs)^k/k!] [e^(-λ(t−s))(λ(t−s))^(n−k)/(n−k)!] / [e^(-λt)(λt)^n/n!]


Poisson Process

= [n!/(k!(n−k)!)] [e^(-λs) e^(-λ(t−s)) / e^(-λt)] (λs)^k (λ(t−s))^(n−k) / (λt)^n

= nCk s^k (t−s)^(n−k) / t^n

= nCk (s/t)^k ((t−s)/t)^(n−k)


Poisson Process

Hence the proof.

kn

k

k

kt

s

t

snC

1

knk

kt

s

t

snC

1

ntNksNP )()( nkt

s

t

snC

knk

k ,....2,1,0,1


Poisson Process

2. Suppose that customers arrive at a bank according to a Poisson process with a mean rate of 3 per minute. Find the probability that during a time interval of 2 minutes (a) exactly 4 customers arrive

(b) more than 4 customers arrive.

Solution: Let N(t) be a Poisson process with mean arrival rate λ.


Poisson Process

Given that λ=3.

Hence

P[N(t)=k] = e^(-3t) (3t)^k / k!, k = 0, 1, 2, …

(a) P[N(2)=4] = e^(-3(2)) (3.2)⁴/4! = e^(-6) 6⁴/4! = 0.133


Poisson Process

(b) P[N(2) > 4] = 1 − P[N(2) ≤ 4]

= 1 − {P[N(2)=0] + P[N(2)=1] + P[N(2)=2] + P[N(2)=3] + P[N(2)=4]}

= 1 − e^(-6)[6⁰/0! + 6¹/1! + 6²/2! + 6³/3! + 6⁴/4!]

= 0.715


Poisson Process

3. If a customer arrives at a counter according to a Poisson process with a mean rate of 2 per minute, find the probability that (i) 5 customers arrive in a 10 minute period (ii) the interval between 2 successive arrivals is (a) more than 1 minute (b) between 1 and 2 minutes (c) 4 minutes or less (iii) the first two customers arrive within 3 minutes (iv) the average number of customers arriving in 1 hour.


Poisson Process

Solution:

Let N(t) denote the number of customers who arrive at the counter in an interval of length 't'. We are given that N(t) follows a Poisson process with mean arrival rate λ=2 per minute.

Therefore we have,

P[N(t)=n] = e^(-λt) (λt)^n / n!, n = 0, 1, 2, .....


Poisson Process

(i) P[N(10)=5] = e^(-2(10)) (2x10)⁵/5! = e^(-20) (20)⁵/5!

(ii) Since N(t) is a Poisson process with mean arrival rate λ=2, the inter-arrival time T between 2 successive arrivals follows an exponential distribution with p.d.f.


Poisson Process

f(t) = 2e^(-2t), t ≥ 0

(a) P(T > 1) = ∫(1 to ∞) 2e^(-2t) dt = [−e^(-2t)] from 1 to ∞ = e^(-2) = 0.1353


Poisson Process

(b) P(1 ≤ T ≤ 2) = ∫(1 to 2) 2e^(-2t) dt = [−e^(-2t)] from 1 to 2 = e^(-2) − e^(-4) = 0.1353 − 0.0183 = 0.1170


Poisson Process

(c) P(T ≤ 4) = 1 − e^(-2(4)) = 1 − e^(-8) = 0.9996

(iii) Since N(t) is a Poisson process with mean arrival rate λ, we know that the inter-arrival time follows an exponential distribution with mean 1/λ.

If Ti denotes the inter-arrival time between the (i-1)th customer and the ith customer, then the time taken for the first


Poisson Process

2 customers to arrive is given by T1+T2.

Since T1 and T2 are independently and identically distributed exponential variates with parameter λ, T1+T2 follows a second order Erlang distribution with p.d.f.

f(t) = λ² t e^(-λt) = (2)² t e^(-2t) = 4t e^(-2t), t > 0


Poisson Process

P[T1 + T2 ≤ 3] = ∫(0 to 3) 4t e^(-2t) dt

= 4{[t(−e^(-2t)/2)] from 0 to 3 + (1/2) ∫(0 to 3) e^(-2t) dt}

= 4[−(3/2)e^(-6) − (1/4)e^(-6) + 1/4]

= 1 − 7e^(-6)

= 0.9826


Poisson Process

(iv) Since E[N(t)]= λt,

the average number of customers arriving in one hour is E[N(60)]= 2x60 = 120.

4. A radioactive source emits particles at the rate of 5 per minute in accordance with a Poisson process. Each particle has a probability of 0.6 of being recorded. Find the probability that 10 particles are recorded in a 4 minute period.


Poisson Process

Solution:

Given that the emission of particles follows a Poisson process with arrival rate λ=5 per minute. From the decomposition property of the Poisson process, the number of recorded particles N1(t) follows a Poisson process with mean arrival rate λ1 = λp = 5x0.6 per minute,

i.e. λ1=3 per minute


Poisson Process

Hence

P[N1(t)=n] = e^(-λ1t) (λ1t)^n / n!, n = 0, 1, 2, .....

P[N1(4)=10] = e^(-3(4)) (3x4)¹⁰/10! = e^(-12) (12)¹⁰/10!


Poisson Process

5. A machine goes out of order whenever a component fails. The failure of this part follows a Poisson process with a mean rate of 1 per week. Find the probability that 2 weeks have elapsed since last failure. If there are 5 spare parts of this component in an inventory and that the next supply is not due in 10 weeks, find the probability that the machine will not be out of order in the next 10 weeks.


Poisson Process

Solution:

Let X(t) denote the number of failures of the component in t units of time.

Then X(t) follows a Poisson process with

mean failure rate= mean number of failures in a week = λ = 1

P[2 weeks have elapsed since last failure] = P[ Zero failures in the 2 weeks since last failure] =P[X(2)=0]


Poisson Process

P[X(2)=0] = e^(-1(2)) (1x2)⁰/0! = e^(-2) = 0.135

There are only 5 spare parts and the machine should not go out of order in the next 10 weeks.

Hence

P[the machine will not be out of order in the next 10 weeks] = P[X(10) ≤ 5]


Poisson Process

P[X(10) ≤ 5] = e^(-10)[10⁰/0! + 10¹/1! + 10²/2! + 10³/3! + 10⁴/4! + 10⁵/5!]

= e^(-10)[1 + 10 + 100/2 + 1000/6 + 10000/24 + 100000/120]

P[X(10) ≤ 5] = 0.068


Stationary Processes

6. Determine the mean and variance of the random process {X(t)} given by

P[X(t)=n] = (at)^(n−1) / (1+at)^(n+1), n = 1, 2, 3, ..
          = at / (1+at), n = 0

Verify whether {X(t)} is stationary or not.


Stationary Processes

Solution:

Consider the random process {X(t)} defined by

P[X(t)=n] = (at)^(n−1) / (1+at)^(n+1), n = 1, 2, 3, ..
          = at / (1+at), n = 0

Since the first order p.m.f. of X(t) is a function of 't', the random process {X(t)}


Stationary Processes

is not a stationary process. It is an evolutionary process.

Now, the mean of the process is given by

E[X(t)] = Σ(n=0 to ∞) n P[X(t)=n] (by definition)

= 1.(1/(1+at)²) + 2.(at/(1+at)³) + 3.((at)²/(1+at)⁴) + ...

= (1/(1+at)²)[1 + 2(at/(1+at)) + 3(at/(1+at))² + ...]

= (1/(1+at)²)[1 − at/(1+at)]^(-2)

= (1/(1+at)²)(1+at)²

E[X(t)]=1

We now compute Var[X(t)].


Stationary Processes

Var[X(t)] = E[X²(t)] − {E[X(t)]}².

Consider

E[X²(t)] = Σ(n=0 to ∞) n² P[X(t)=n]

= Σ(n=0 to ∞) [n(n+1) − n] P[X(t)=n]

= Σ(n=0 to ∞) n(n+1) P[X(t)=n] − Σ(n=0 to ∞) n P[X(t)=n]

= [1.2.(1/(1+at)²) + 2.3.(at/(1+at)³) + 3.4.((at)²/(1+at)⁴) + ...] − 1

= (2/(1+at)²)[1 + 3(at/(1+at)) + 6(at/(1+at))² + 10(at/(1+at))³ + ...] − 1


Stationary Processes

E[X²(t)] = (2/(1+at)²)[1 − at/(1+at)]^(-3) − 1

= (2/(1+at)²)(1+at)³ − 1

= 2(1+at) − 1

E[X²(t)] = 2 + 2at − 1 = 2at + 1


Stationary Processes

Var[X(t)]=2at+1-1

Var[X(t)]=2at.
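The results E[X(t)] = 1 and Var[X(t)] = 2at can be verified by summing the series numerically (a sketch; a = 0.7 and t = 2 are arbitrary test values, and the pmf is rewritten as a ratio to avoid overflow for large n):

```python
a, t = 0.7, 2.0
at = a * t
r = at / (1 + at)  # common ratio of the pmf tail

def pmf(n):
    # (at)^(n-1)/(1+at)^(n+1) = r^(n-1) / (1+at)^2 for n >= 1.
    if n == 0:
        return at / (1 + at)
    return r ** (n - 1) / (1 + at) ** 2

N = 2000  # truncation point; the tail is geometrically small
total = sum(pmf(n) for n in range(N))
mean = sum(n * pmf(n) for n in range(N))
var = sum(n * n * pmf(n) for n in range(N)) - mean ** 2
print(total, mean, var)  # ≈ 1, 1 and 2at = 2.8
```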

7. Show that the random process {X(t)} defined by X(t) = A cos(ωt + θ), where A and ω are constants and θ is a uniform random variable over (0,2π), is wide sense stationary. Further, determine whether {X(t)} is mean ergodic and correlation ergodic.


Stationary Processes

Solution :

A random process {X(t)} is said to be WSS if it satisfies the following conditions:

1. E[X(t)] is a constant with respect to time.

2. The auto-correlation function RXX(t1,t2) is a function of the length of the time difference. i.e. RXX(t1,t2)= RXX(t1-t2)

3. E[X2(t)]<∞.


Stationary Processes

Consider the random process given by X(t) = A cos(ωt + θ), where A and ω are constants and θ is uniformly distributed in (0,2π). Hence, the p.d.f. of θ is given by

f(θ) = 1/2π, 0 ≤ θ ≤ 2π
     = 0, elsewhere

Consider

E[X(t)] = E[A cos(ωt + θ)]


Stationary Processes

= A E[cos(ωt + θ)]

= ∫(0 to 2π) A cos(ωt + θ) (1/2π) dθ

= (A/2π)[sin(ωt + θ)] from 0 to 2π

= (A/2π)[sin(ωt + 2π) − sin(ωt)] = 0

E[X(t)]=0, a constant with respect to time. ----------(1)


Stationary Processes

We next consider the auto-correlation function RXX(t,t+τ) = E[X(t)X(t+τ)]

= E[A cos(ωt + θ) . A cos(ω{t+τ} + θ)]

= (A²/2) E[cos(ωt + θ + ω{t+τ} + θ) + cos(ω{t+τ} − ωt)]

= (A²/2) E[cos(2ωt + ωτ + 2θ) + cos(ωτ)]

= (A²/2) E[cos(2ωt + ωτ + 2θ)] + (A²/2) cos(ωτ)


Stationary Processes

Consider

E[cos(2ωt + ωτ + 2θ)] = ∫(0 to 2π) cos(2ωt + ωτ + 2θ) (1/2π) dθ

= (1/2π)[sin(2ωt + ωτ + 2θ)/2] from 0 to 2π

= (1/4π)[sin(2ωt + ωτ + 4π) − sin(2ωt + ωτ)] = 0 ------(2)

Substituting the above expression in RXX(t,t+τ), we have


Stationary Processes

RXX(t,t+τ) = (A²/2) cos(ωτ) -----(3)

Consider E[X²(t)] = RXX(t,t) = RXX(0) = A²/2 < ∞. Hence, the given random process is WSS.

Now, the random process {X(t)} is said to be mean ergodic if E[X(t)] = A[X(t)],

where

A[X(t)] = lim(T→∞) (1/2T) ∫(-T to T) x(t) dt


Stationary Processes

Consider

A[X(t)] = lim(T→∞) (1/2T) ∫(-T to T) A cos(ωt + θ) dt

= lim(T→∞) (A/2T)[sin(ωt + θ)/ω] from −T to T

= lim(T→∞) (A/2ωT)[sin(ωT + θ) − sin(−ωT + θ)]

= 0 (since │sinθ│ ≤ 1 for all θ) ----(4)


Stationary Processes

From equations (1) and (4), we see that E[X(t)] = A[X(t)].

Hence {X(t)} is mean ergodic.

The random process {X(t)} is said to be correlation ergodic if

E[X(t)X(t+τ)] = A[X(t)X(t+τ)], where

A[X(t)X(t+τ)] = lim(T→∞) (1/2T) ∫(-T to T) x(t) x(t+τ) dt


Stationary Processes

Consider

A[X(t)X(t+τ)] = lim(T→∞) (1/2T) ∫(-T to T) A cos(ωt + θ) . A cos(ω{t+τ} + θ) dt

= lim(T→∞) (A²/2)(1/2T) ∫(-T to T) [cos(2ωt + ωτ + 2θ) + cos(ωτ)] dt

= lim(T→∞) (A²/4T)[sin(2ωt + ωτ + 2θ)/2ω] from −T to T + lim(T→∞) (A²/4T) cos(ωτ) [t] from −T to T


Stationary Processes

= lim(T→∞) (A²/8ωT)[sin(2ωT + ωτ + 2θ) − sin(−2ωT + ωτ + 2θ)] + (A²/2) cos(ωτ)

= 0 + (A²/2) cos(ωτ) (since │sinθ│ ≤ 1 for all θ)

A[X(t)X(t+τ)] = (A²/2) cos(ωτ) ----(5)

From equations (3) and (5), we see that E[X(t)X(t+τ)] = A[X(t)X(t+τ)]


Stationary Processes

Hence the given random process {X(t)} is correlation ergodic.

Note: Since the above process is WSS, it is first order stationary.
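Mean ergodicity can be illustrated numerically (a sketch; A = 2, ω = 1.5 and the averaging window are arbitrary choices): the time average of a single sample function over [−T, T] approaches the ensemble mean E[X(t)] = 0 as T grows.

```python
import math
import random

random.seed(5)
A, omega = 2.0, 1.5                      # constants (illustrative values)
theta = random.uniform(0, 2 * math.pi)   # one realisation of the phase

# Midpoint-rule approximation of (1/2T) * integral of x(t) over [-T, T]
# for the sample function x(t) = A cos(omega*t + theta).
T, steps = 2000.0, 400000
h = 2 * T / steps
integral = sum(A * math.cos(omega * (-T + (i + 0.5) * h) + theta) * h
               for i in range(steps))
time_avg = integral / (2 * T)
print(time_avg)  # → close to 0, the ensemble mean
```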

8. Show that the process X(t) = A cos ωt + B sin ωt, where A and B are uncorrelated random variables, is wide sense stationary if


Stationary Processes

E[A] = E[B] = 0; E[A²] = E[B²]

Solution : HW

9. Determine the mean and the variance of the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is uniformly distributed in (0,2π).

Solution : HW


Stationary Processes

10. Two random processes X(t) and Y(t) are defined by X(t) = A cos ω0t + B sin ω0t and Y(t) = B cos ω0t − A sin ω0t. Show that X(t) and Y(t) are jointly wide sense stationary, if A and B are uncorrelated zero mean random variables having the same variance and ω0 is a constant.

Solution: HW


Stationary Processes

11. If X(t) is a WSS process with autocorrelation function RXX(τ) = A e^(-2│τ│), where A is any constant, obtain the second order moment of the random variable X(8) − X(5).

Solution : HW

12. Given a random variable Y with characteristic function φ(ω) = E[e^(iYω)] and a random process X(t) = cos(λt + Y). Show


Stationary Processes

that X(t) is WSS if φ(1)=φ(2)=0.

Solution :

A random process {X(t)} is said to be WSS if it satisfies the following conditions:

1. E[X(t)] is a constant with respect to time.

2. The auto-correlation function RXX(t1,t2) is a function of the length of the time difference. i.e. RXX(t1,t2)= RXX(t1-t2)

3. E[X2(t)]<∞.


Stationary Processes

Since φ(ω) is the characteristic function of the random variable Y, we have

φ(ω) = E[e^(iYω)] = E[cos(Yω) + i sin(Yω)]

Since φ(1)=0, we have

E[cos (Y) +i sin (Y)]=0

E[cos Y] + i E[sin Y]=0 (WHY??)

This implies that

E[cos Y] =0 and E[sin Y] =0 (WHY??)

-------(1)


Stationary Processes

Also, φ(2)=0 yields

E[cos (2Y) +i sin (2Y)]=0

E[cos 2Y] + i E[sin 2Y]=0

This implies that

E[cos 2Y] =0 and E[sin 2Y] =0

------(2)

Consider E[X(t)]=E[Cos(λt+Y)]

= E[cos λt cos Y – sin λt sin Y]

= cos λt E[cos Y] – sin λt E[sin Y]

= 0 (from (1)), a constant.


Stationary Processes

Now consider RXX(t, t+τ) = E[X(t)X(t+τ)]

Therefore,

RXX(t, t+τ) = E[cos(λt+Y) cos(λ[t+τ]+Y)]

= E[{cos({λt+Y} + {λ[t+τ]+Y}) + cos({λt+Y} − {λ[t+τ]+Y})}/2]

= E[cos(2λt + λτ + 2Y)]/2 + E[cos(−λτ)]/2 (why ??)

= E[cos(2λt + λτ) cos 2Y − sin(2λt + λτ) sin 2Y]/2 + E[cos(λτ)]/2 (why???)

= cos(2λt + λτ) E[cos 2Y]/2 − sin(2λt + λτ) E[sin 2Y]/2 + E[cos(λτ)]/2


Stationary Processes

Hence RXX(t, t+τ) = E[cos(λτ)]/2 (from (2)), which is exclusively a function of the length of the time interval and not the end points.

Further, E[X²(t)] = RXX(t,t) = E[cos(λ.0)]/2 = 1/2 < ∞.

Hence the given random process {X(t)} is WSS.


Stationary Processes

HW

13. Verify whether the random process X(t)=ACos(ωt+θ) is WSS given that A and ω are constants and θ is uniformly distributed in (i) (-π, π) (ii) (0, π) (iii) (0, π/2).


Stationary Processes

14. If X(t) = Y cos ωt + Z sin ωt, where Y and Z are two independent normal RVs with E[Y] = E[Z] = 0, E[Y²] = E[Z²] = σ² and ω is a constant, prove that {X(t)} is a stationary process of order 2.

Solution: HW


Stationary Processes

15. Verify whether the random process X(t)=Ycos ωt where ω is a constant and Y is a uniformly distributed random variable in (0,1) is a strict sense stationary process.

Solution : HW


Sine Wave Processes

Definition: A random process {X(t)} of the form X(t) = A cos (ωt+θ) or X(t) = A sin(ωt+θ) where any non-empty combination of A, ω, θ are random variables is called a sine wave process.

A is called the amplitude

ω is called the frequency

θ is called the phase.


Random Processes

Orthogonal Processes:

Two random processes {X(t)} and {Y(t)} are said to be orthogonal if their cross correlation function RXY(t1,t2)=0.

Uncorrelated Processes:

Two random processes {X(t)} and {Y(t)} are said to be uncorrelated if their cross covariance function CXY(t1,t2)=0.

i.e. RXY(t1,t2)=E[X(t1)]E[Y(t2)]


Random Processes

Note: Two independent random processes are uncorrelated but the converse need not be true. (Can you give a Counter-example???)

Remark: If two random processes {X(t)} and {Y(t)} are statistically independent, then their cross-correlation function is given by RXY(t1,t2)=E[X(t1)]E[Y(t2)]


Random Processes

Remark:

If two random processes {X(t)} and {Y(t)} are independent and at least WSS, then RXY(t1,t2) = µX µY, a constant with respect to time.


Normal/ Gaussian Process

Definition:

A random process {X(t)} is called a Normal/ Gaussian process if its nth order joint density function is given by

f(x1, x2, .., xn; t1, t2, .., tn) = [1 / ((2π)^(n/2) │CX│^(1/2))] exp{−(1/2)[X − X̄]ᵀ CX⁻¹ [X − X̄]}


Normal/ Gaussian Process

where C_X is the covariance matrix given by

C_X = | C_11  C_12  …  C_1n |
      | C_21  C_22  …  C_2n |
      |  ⋮     ⋮          ⋮  |
      | C_n1  C_n2  …  C_nn |

with C_ij = C_XX(t_i, t_j) the auto-covariance function of X(t). Here X = [x_1, x_2, …, x_n]^T, μ_X = [μ_X(t_1), …, μ_X(t_n)]^T with μ_X(t_i) = E[X(t_i)], and [X − μ_X]^T is the transpose of the column vector [X − μ_X].

Remarks:

1. A Gaussian process is completely determined by its mean and auto-covariance functions. (WHY???)


2. If a Gaussian process is WSS, then it is strictly stationary. (WHY???)

(Hint: If {X(t)} is WSS, then its mean is a constant with respect to time and its auto-covariance function is only a function of the length of the time interval and not on the end points.)

16. If {X(t)} is a Gaussian process with mean μ(t) = 10 and auto-covariance function C_XX(t_1, t_2) = 16 e^(−|t_1 − t_2|), find the probability that X(10) ≤ 8 and the probability that |X(10) − X(8)| ≤ 4.

Solution: Consider the Gaussian process with mean μ(t) = 10, i.e. E[X(t)] = 10, and auto-covariance function C_XX(t_1, t_2) = 16 e^(−|t_1 − t_2|).

Since {X(t)} is Gaussian, the standardized random variable

Z = (X(t_i) − E[X(t_i)]) / √C_XX(t_i, t_i),

where t_i is any fixed time point, follows a standard normal distribution. Here E[X(10)] = 10 and C_XX(10, 10) = 16, so

P[X(10) ≤ 8] = P[ Z ≤ (8 − 10)/√16 ]
= P[Z ≤ −0.5]
= 1 − P[Z ≤ 0.5]
= 1 − 0.6915 = 0.3085


Consider the random variable X(10)-X(8) .

Then E[X(10)-X(8)]=E[X(10)]-E[X(8)]=0.

Var[X(10)-X(8)]

=Var[X(10)]+Var[X(8)]-2Cov[X(8),X(10)]

= 16 + 16 − 2(16) e^(−|8 − 10|)

= 32 − 32 e^(−2)

= 32(0.8646) = 27.6692

P[|X(10) − X(8)| ≤ 4] = P[ |Z| ≤ 4/√27.6692 ]
= P[|Z| ≤ 0.76]
= 2 P[0 ≤ Z ≤ 0.76]
= 2 [P(Z < 0.76) − 0.5]
= 2 [0.7764 − 0.5]
= 0.5528
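The two probabilities above can be cross-checked numerically with the standard normal CDF. The sketch below (standard library only) recomputes them from the mean and variances used in the worked solution; it is an illustrative check, not part of the derivation.

```python
import math

def phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# P[X(10) <= 8] with E[X(10)] = 10 and Var[X(10)] = C_XX(10,10) = 16
p1 = phi((8.0 - 10.0) / math.sqrt(16.0))      # = P[Z <= -0.5]

# P[|X(10) - X(8)| <= 4] with Var[X(10) - X(8)] = 32(1 - e^-2)
var_diff = 32.0 * (1.0 - math.exp(-2.0))
z = 4.0 / math.sqrt(var_diff)
p2 = phi(z) - phi(-z)

print(round(p1, 4), round(p2, 4))
```

The small difference from the tabulated 0.5528 comes only from rounding 0.76 and the normal-table entries.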


Random Telegraph Process

Let {X(t)} be a random process satisfying the following conditions:

1. {X(t)} assumes only one of the 2 possible levels +1 or -1 at any time.

2. X(t) switches back and forth between its two levels randomly with time.

3. The number of level transitions in any time interval of length τ is a Poisson random variable, i.e. the probability of exactly k transitions in an interval of length τ, with average rate of transitions λ, is given by

4. Transitions occurring in any time interval are statistically independent of the transitions in any other interval.

5. The levels at the start of any interval are equally probable.

X(t) is called a semi-random telegraph signal process.

P[exactly k transitions in an interval of length τ] = e^(−λτ) (λτ)^k / k!,  k = 0, 1, 2, …

Alternately, if N(t) represents the number of occurrences of a specified event in (0,t), N(t) is Poisson with parameter λt, and Y(t) = (−1)^N(t), then the random process Y(t) is called a semi-random telegraph signal process.

If Y(t) is a semi-random telegraph signal process, A is a random variable which is

independent of Y(t) and assumes the values +1 and −1 with equal probability, then the random process {X(t)} defined by X(t) = AY(t) is called a random telegraph signal process.

We now show that the random telegraph process defined above is a WSS process.

To show that X(t) is WSS, we need to prove the following:


1. E[X(t)] is a constant with respect to time.

2. The auto-correlation function RXX(t1,t2) is exclusively a function of the length of the time difference. i.e. RXX(t1,t2)= RXX(t1-t2)

3. E[X²(t)] < ∞.

Consider E[X(t)] = E[AY(t)]


E[X(t)] = E[A]E[Y(t)] (A and Y(t) are independent).

Consider E[A] = (1) P(A=1) + (-1) P(A=-1)

= (1)(½) + (-1)(½)

= 0 -------------(1)

Now, E[A2] = (1)2 P(A=1) + (-1)2 P(A=-1)

= (1)(½) + (1)(½)

= 1 --------------(2)

By definition of Y(t), Y(t) takes the value +1 whenever the number of level transitions N(t) in the interval (0,t) is even, and Y(t) assumes the value −1 whenever N(t) is odd.

Hence P[Y(t) = 1] = P[N(t) = even]

= P[N(t) = 0] + P[N(t) = 2] + P[N(t) = 4] + …

P[Y(t) = 1] = e^(−λt) [1 + (λt)²/2! + (λt)⁴/4! + …]

= e^(−λt) cosh(λt)  -------(3)

Also, P[Y(t) = −1] = P[N(t) = odd]

= P[N(t) = 1] + P[N(t) = 3] + P[N(t) = 5] + …

= e^(−λt) [(λt)/1! + (λt)³/3! + …]

P[Y(t) = −1] = e^(−λt) sinh(λt)  -------(4)

E[Y(t)] = (1) e^(−λt) cosh(λt) + (−1) e^(−λt) sinh(λt)

= e^(−λt) [cosh(λt) − sinh(λt)]

= e^(−λt) e^(−λt)

E[Y(t)] = e^(−2λt)  ------(5)

Substituting expressions (1) and (5) in E[X(t)] = E[A] E[Y(t)], we obtain E[X(t)] = 0, which is a constant with respect to time.


Hence E[X(t)] is a constant with respect to time.

We next consider the auto-correlation function of X(t), viz,

R_XX(t, t+τ) = E[X(t) X(t+τ)]

= E[A Y(t) · A Y(t+τ)]

= E[A² Y(t) Y(t+τ)]

R_XX(t, t+τ) = E[A²] E[Y(t) Y(t+τ)]  (∵ A is independent of Y(t))

= (1) R_YY(t, t+τ)

We now compute R_YY(t, t+τ) = E[Y(t) Y(t+τ)].

If Y(t) = 1, then Y(t+τ) = 1 if the number of level transitions in the interval (t, t+τ) is even (taking τ > 0).

Hence

P[Y(t+τ) = 1 │ Y(t) = 1] = P[number of level transitions in (t, t+τ) is even]

= e^(−λτ) cosh(λτ)

P[Y(t) = 1, Y(t+τ) = 1]

= P[Y(t+τ) = 1 │ Y(t) = 1] P[Y(t) = 1] = e^(−λτ) cosh(λτ) e^(−λt) cosh(λt)  -----(7a)

Similarly,

P[Y(t) = 1, Y(t+τ) = −1]

= e^(−λτ) sinh(λτ) e^(−λt) cosh(λt)  -----(7b)

P[Y(t) = −1, Y(t+τ) = 1]

= e^(−λτ) sinh(λτ) e^(−λt) sinh(λt)  -----(7c)

P[Y(t) = −1, Y(t+τ) = −1]

= e^(−λτ) cosh(λτ) e^(−λt) sinh(λt)  -----(7d)

Hence,

R_YY(t, t+τ) = (1)(1) e^(−λτ) cosh(λτ) e^(−λt) cosh(λt)

+ (1)(−1) e^(−λτ) sinh(λτ) e^(−λt) cosh(λt)

+ (−1)(1) e^(−λτ) sinh(λτ) e^(−λt) sinh(λt)

+ (−1)(−1) e^(−λτ) cosh(λτ) e^(−λt) sinh(λt)

= e^(−λτ) e^(−λt) {cosh(λτ) cosh(λt) − sinh(λτ) cosh(λt) − sinh(λτ) sinh(λt) + cosh(λτ) sinh(λt)}

R_YY(t, t+τ) = e^(−λτ) e^(−λt) {cosh(λτ) [cosh(λt) + sinh(λt)] − sinh(λτ) [cosh(λt) + sinh(λt)]}

= e^(−λτ) e^(−λt) [cosh(λτ) − sinh(λτ)] [cosh(λt) + sinh(λt)]

= e^(−λτ) e^(−λt) e^(−λτ) e^(λt)

R_YY(t, t+τ) = e^(−2λτ)  -------(8)

Substituting equation (8) in R_XX(t, t+τ) = R_YY(t, t+τ), we have

R_XX(t, t+τ) = e^(−2λτ), which is exclusively a function of τ.

Further, E[X²(t)] = R_XX(t, t) = 1 < ∞. Hence, the random telegraph process X(t) is a WSS process.
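Equation (8) can be checked by simulation: Y(t)Y(t+τ) = (−1)^M with M = N(t+τ) − N(t) ~ Poisson(λτ), so R_XX(t, t+τ) = E[(−1)^M]. The sketch below estimates this expectation by Monte Carlo; the values λ = 1, τ = 0.5 are arbitrary illustrative choices.

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Sample a Poisson(lam) count by Knuth's multiplication method."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

lam, tau, n = 1.0, 0.5, 200_000
# X(t)X(t+tau) = A^2 Y(t)Y(t+tau) = (-1)^M, M ~ Poisson(lam*tau)
est = sum((-1) ** poisson(lam * tau) for _ in range(n)) / n
print(est, math.exp(-2 * lam * tau))  # estimate vs. e^(-2*lambda*tau)
```

With 200,000 samples the Monte Carlo error is of order 0.002, so the estimate should land close to e^(−1) ≈ 0.3679.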


Remark:

From equations (3), (4), (5) and (8) we see that even though the auto-correlation function of the semi-random telegraph signal {Y(t)} is exclusively a function of τ, its first-order p.m.f. is itself a function of t, so {Y(t)} is not even a first-order stationary process. It is an evolutionary process.


Markov Chains

Discrete-Time Markov Chains:

Without loss of generality, assume that the parameter space is T = {0, 1, 2, …}. Hence the state of the system is observed at the time points 0, 1, 2, …; these observations are denoted by X0, X1, X2, …

If Xn=j, then the state of the system at time step ‘n’ is said to be ‘j’.

Let pj(n)=P[Xn=j] denote the probability that Xn is in state j.


In other words, pj(n) denotes the p.m.f. of the random variable Xn.

The conditional p.m.f.

pjk(m,n)= P[Xn=k│Xm=j] is called the transition p.m.f.


Discrete-Time Markov Chains:

Let {Xn} be a discrete-time integer-valued Markov chain (starting at n = 0) with initial PMF

p_j(0) = P[X_0 = j],  j = 0, 1, 2, …

The joint PMF for the first n+1 values of the process is

P[X_n = i_n, X_{n−1} = i_{n−1}, …, X_0 = i_0]
= P[X_n = i_n │ X_{n−1} = i_{n−1}] ⋯ P[X_1 = i_1 │ X_0 = i_0] P[X_0 = i_0]


Discrete-Time Markov Chain

If the one-step state transition probabilities are fixed and do not change with time, i.e.

{Xn} is said to have homogeneous transition probabilities.

A Markov chain is said to be a homogeneous Markov chain if the transition p.m.f. pjk(m,n) depends only on the difference n-m.

P[X_{n+1} = j │ X_n = i] = p_ij  for all n,


The homogeneous state transition probability satisfies the following conditions:

0 ≤ p_ij ≤ 1

Σ_j p_ij = 1 for each i,

since the states are mutually exclusive and collectively exhaustive.

The transition probability matrix P: the one-step transition probabilities of a discrete-parameter Markov chain are completely specified in the form of a transition probability matrix (t.p.m.) given by P = [p_ij].

The row sum is 1 for each row of P – Stochastic matrix

All the entries lie in [0,1]

The stochastic matrix is said to be doubly stochastic iff all the column entries add up to 1 for every column.

A stochastic matrix P is said to be regular if, for some power n, all the entries of Pⁿ are positive.

P = | p_00  p_01  p_02  … |
    | p_10  p_11  p_12  … |
    |  ⋮     ⋮     ⋮      |
    | p_i0  p_i1  p_i2  … |
    |  ⋮     ⋮     ⋮      |

The joint PMF of Xn , Xn-1 ,…, X0 is given by

P[X_n = i_n, X_{n−1} = i_{n−1}, …, X_0 = i_0] = p_{i_{n−1}, i_n} p_{i_{n−2}, i_{n−1}} ⋯ p_{i_0, i_1} p_{i_0}(0)

Thus {Xn} is completely specified by the initial PMF and the matrix P of the one-step transition probabilities.

Discrete-Time MC: Computer Repair Example

• Two aging computers are used for word processing.

• When both are working in the morning, there is a 30% chance that one will fail by the evening and a 10% chance that both will fail.

• If only one computer is working at the beginning of the day, there is a 20% chance that it will fail by the close of business.

• If neither is working in the morning, the office sends all work to a typing service.

• Computers that fail during the day are picked up the following morning, repaired, and then returned the next morning.

• The system is observed after the repaired computers have been returned and before any new failures occur.

States for Computer Repair Example

Index  State    State definition
0      s = (0)  No computer has failed. The office starts the day with both computers functioning properly.
1      s = (1)  One computer has failed. The office starts the day with one working computer and the other in the shop until the next morning.
2      s = (2)  Both computers have failed. All work must be sent out for the day.

Events and Probabilities for Computer Repair

Current state  Event                                                      Probability  Next state
0, s0 = (0)    Neither computer fails.                                    0.6          s' = (0)
               One computer fails.                                        0.3          s' = (1)
               Both computers fail.                                       0.1          s' = (2)
1, s1 = (1)    Remaining computer does not fail and the other is returned. 0.8         s' = (0)
               Remaining computer fails and the other is returned.          0.2         s' = (1)
2, s2 = (2)    Both computers are returned.                                1.0         s' = (0)

State-Transition Matrix and Network

The events associated with a Markov chain can be described by the m × m matrix P = (p_ij).

For the computer repair example, we have:

P = | 0.6  0.3  0.1 |
    | 0.8  0.2  0   |
    | 1    0    0   |
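As a quick sanity check on the matrix above, each row should sum to 1 (P is stochastic), and squaring P gives the two-day transition probabilities. A minimal sketch:

```python
# t.p.m. of the computer-repair chain, states 0, 1, 2 as defined above
P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]

def matmul(A, B):
    """Product of two square matrices stored as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

row_sums = [sum(row) for row in P]   # each should be 1 (stochastic matrix)
P2 = matmul(P, P)                    # two-step transition probabilities
print(row_sums)
print(P2[0])                         # two-step probabilities out of state 0
```

For instance, P2[0][0] = 0.6·0.6 + 0.3·0.8 + 0.1·1 = 0.70: the office most likely starts the day after tomorrow with both machines working.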


State-Transition Network

– Node for each state

– Arc from node i to node j if pij > 0.

For the computer repair example, the network has nodes 0, 1 and 2, with arcs 0→0 (0.6), 0→1 (0.3), 0→2 (0.1), 1→0 (0.8), 1→1 (0.2) and 2→0 (1).

The n-step transition probabilities :

Let P(n)=[pij(n)] be the matrix of n-step transition probabilities, where pij(n)=P[Xn+k=j/ Xk=i] = P[Xn=j/X0=i] for all n≥0 and k≥0, since the transition probabilities do not depend on time.

Then P(n)=Pn


Consider P(2):

p_ij(2) = P[X_2 = j │ X_0 = i]

= Σ_k P[X_2 = j, X_1 = k │ X_0 = i]

= Σ_k P[X_2 = j, X_1 = k, X_0 = i] / P[X_0 = i]

= Σ_k P[X_2 = j │ X_1 = k] P[X_1 = k │ X_0 = i] P[X_0 = i] / P[X_0 = i]

= Σ_k p_ik(1) p_kj(1)  for all i, j

P(2) = P(1) P(1) = P²

P(n) = P(n−1) P

= P(n−2) P P

= P(n−2) P²

= … = Pⁿ

Hence the n-step transition probability matrix is the nth power of the one-step transition probability matrix.


Chapman- Kolmogorov Equations:

p_ij(n) = Σ_k p_ik(r) p_kj(n − r),  0 < r < n

Interpretation: the RHS is the probability of going from i to k in r steps and then going from k to j in the remaining n − r steps, summed over all possible intermediate states k.
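The Chapman-Kolmogorov equations can be verified numerically on any stochastic matrix. The sketch below reuses the computer-repair t.p.m. (chosen purely for illustration) and checks p_ij(5) = Σ_k p_ik(2) p_kj(3) for one pair (i, j):

```python
P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]

def matmul(A, B):
    """Product of two square matrices stored as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(A, n):
    """n-th matrix power by repeated multiplication (n >= 1)."""
    R = A
    for _ in range(n - 1):
        R = matmul(R, A)
    return R

P2, P3, P5 = matpow(P, 2), matpow(P, 3), matpow(P, 5)
lhs = P5[0][1]                                    # p_01(5)
rhs = sum(P2[0][k] * P3[k][1] for k in range(3))  # sum_k p_0k(2) p_k1(3)
print(lhs, rhs)
```

The two numbers agree because both are the (0,1) entry of P⁵ = P²P³, which is exactly what the equations assert.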


State Probabilities:

The state probabilities at time n are given by the row vector p(n) = {p_j(n)}. Now,

p_j(n) = Σ_i P[X_n = j │ X_{n−1} = i] P[X_{n−1} = i] = Σ_i p_ij p_i(n−1)

Therefore p(n) = p(n−1) P.

Similarly, p(n) = p(0) P(n) = p(0) Pⁿ,  n = 1, 2, …

Hence the state PMF at time n is obtained by multiplying the initial PMF by Pⁿ.
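The recursion p(n) = p(n−1)P can be sketched directly. Here the chain is assumed, for illustration only, to start in state 0 of the computer-repair example with certainty:

```python
P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]

def step(p, P):
    """One transition: row vector p times matrix P."""
    n = len(P)
    return [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

p = [1.0, 0.0, 0.0]     # p(0): start in state 0 with certainty
for _ in range(3):      # three applications give p(3) = p(0) P^3
    p = step(p, P)
print([round(x, 4) for x in p])
```

Note that p(n) remains a probability vector at every step: its components stay nonnegative and sum to 1.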


Limiting State/ Steady-state Probabilities:

Let {Xn} be a discrete-time Markov chain with N states, and let P[X_n = j] denote the probability that the process is in state j at the end of the first n transitions, j = 1, 2, …, N. Then

P[X_n = j] = p_j(n) = Σ_{i=1}^{N} p_ij(n) P[X_0 = i]

Discrete-Time Markov Chain - Steady-state Probabilities

As n→∞, the n-step transition probability p_ij(n) does not depend on i, which means that P[X_n = j] approaches a constant as n→∞. The limiting-state probabilities are defined as

π_j = lim_{n→∞} P[X_n = j],  j = 1, 2, …, N

Since p_ij(n) = Σ_k p_ik(n−1) p_kj, letting n→∞ on both sides gives

π_j = lim_{n→∞} p_ij(n) = lim_{n→∞} Σ_k p_ik(n−1) p_kj = Σ_k π_k p_kj

Defining the steady-state/limiting-state probability vector π = (π_1, π_2, …, π_N), we have

π_j = Σ_k π_k p_kj,  i.e.  π = πP

Σ_j π_j = 1

The last equation is due to the law of total probability.

The probability π_j is interpreted as the long-run proportion of time that the MC spends in state j.


Classification of States:

A state j is said to be accessible from state i (j can be reached from i) if, starting from state i, it is possible that the process will ever enter state j.

pij(n)>0 for some n>0.

Two states that are accessible from each other are said to communicate with each other.
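Accessibility and communication can be computed mechanically from the positivity pattern of P by transitive closure (Warshall's algorithm). The sketch below, applied to the computer-repair matrix as an example, confirms that every pair of its states communicates:

```python
P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]
n = len(P)

# reach[i][j]: is j accessible from i in one or more steps?
reach = [[P[i][j] > 0 for j in range(n)] for i in range(n)]
for k in range(n):                       # Warshall transitive closure
    for i in range(n):
        for j in range(n):
            reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

# i and j communicate when each is accessible from the other
communicate = [[reach[i][j] and reach[j][i] for j in range(n)] for i in range(n)]
print(all(all(row) for row in communicate))  # True: the chain is irreducible
```

Because all states communicate, this particular chain forms a single class, i.e. it is irreducible in the sense defined below.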

Discrete-Time Markov Chain - Classification of States

Communication induces a partition of states

States that communicate belong to the same class

All members of a class communicate with each other.

If a class is not accessible from any state outside the class, the class is said to be a closed communicating class.


A Markov chain in which all the states communicate is called an irreducible Markov Chain.

In an irreducible Markov chain, there is only one class.


The states divide into those that the process enters infinitely often and those that it enters only finitely often; in the long run, the process will be found in the states that it enters infinitely often.


Probability of first passage from state i to state j in n transitions - fij(n)

The conditional probability that given that the process is in state i, the first time the process enters state j occurs in exactly n transitions.


Probability of first passage from state i to state j – f_ij: the conditional probability that the process will ever enter state j, given that it was initially in state i:

f_ij = Σ_{n=1}^{∞} f_ij(n)


Clearly f_ij(1) = p_ij and

f_ij(n) = Σ_{l≠j} p_il f_lj(n−1)

f_ii denotes the probability that a process that starts at state i will ever return to state i.

If f_ii = 1, then state i is called a recurrent state.

If f_ii < 1, then state i is called a transient state.


State j is called

• transient (non-recurrent) if there is a positive probability that the process will never return to j again if it leaves j

• recurrent (persistent) if with probability 1, the process will eventually return to j after it leaves j

A set of recurrent states forms a single chain if every member of the set communicates

with all the members of the set.


Recurrent state j is called a • periodic state if there exists an integer d,

d>1, such that pjj(n) is zero for all values of n other than d, 2d, 3d,…; d is called the period. If d=1, j is called aperiodic.

• positive recurrent state if, starting at state j the expected time until the process returns to state j is finite; otherwise it is called a null-recurrent state.


Positive recurrent states are called ergodic states

A chain consisting of ergodic states is called an ergodic chain.

A state j is called an absorbing (trapping) state if p_jj = 1. Thus, once the process enters an absorbing/trapping state, it never leaves the state.

Summary of the classification: a state is either recurrent (persistent) or transient (non-recurrent); a recurrent state is either positive recurrent or null recurrent, and either periodic or aperiodic; aperiodic positive recurrent states are ergodic.


• If a Markov chain is irreducible, all its states are of the same type. They are either all transient or all null persistent or all non-null persistent.

• Further, all the states are either aperiodic or periodic with the same period.

• If a Markov chain is finite and irreducible, then all its states are non-null persistent.


Continuous-Time Markov Chain

A random process {X(t)/t≥0} is a continuous-time Markov chain if, for all s,t≥0 and nonnegative integers i,j,k,

P[X(t+s) = j │ X(s) = i, X(u) = k, 0 ≤ u < s] = P[X(t+s) = j │ X(s) = i]

In a continuous-time Markov chain, the conditional probability of the future state at time t+s, given the present state at s and all past states, depends only on the present state and not on the past.


If in addition, P[X(t+s)=j/X(s)=i] is independent of s, then the process {X(t),t≥0} is said to be time-homogeneous or have the time-homogeneity property. Time-homogeneous Markov chains have stationary (or homogeneous) transition probabilities.

Let

p_j(t) = P[X(t) = j]

p_ij(t) = P[X(t+s) = j │ X(s) = i]


In other words, pij(t) is the probability that a MC presently in state i will be in state j after an additional time t and pj(t) is the probability that a MC is in state j at time t.

The transition probabilities satisfy

0 ≤ p_ij(t) ≤ 1,  Σ_j p_ij(t) = 1

Further,

Σ_j p_j(t) = 1


Chapman-Kolmogorov equation

p_ij(t+s) = Σ_k p_ik(t) p_kj(s)

Markov Chains

17. A man either drives a car or catches a train to go to the office each day. He never goes two days in a row by train, but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Suppose that on the first day of the week the man tossed a fair die and drove to work if and only if a 6 appeared. Find (i) the probability that he takes the train on the third day and (ii) the probability that he drives to work in the long run.


The travel pattern forms a Markov chain, with

state space = (train,car)

The TPM of the chain is given by

P = |  0    1  |
    | 1/2  1/2 |

with the states ordered (train, car).

The initial state probability distribution is p(0) = (5/6, 1/6), since P(travelling by car) = P(getting a 6 in the toss of the die) = 1/6, and P(travelling by train) = 5/6.

Now,

p(1) = p(0) P = (5/6, 1/6) P = (1/12, 11/12)

p(2) = p(1) P = (1/12, 11/12) P = (11/24, 13/24)

Counting the first day as p(0), the third day corresponds to p(2). Therefore,

P(the man travels by train on the third day) = 11/24

Let π = (π1, π2) be the limiting form of the state probability distribution or stationary


state distribution of the Markov chain.

By the property of π, πP = π:

(π1, π2) |  0    1  |
         | 1/2  1/2 | = (π1, π2)

i.e.

(1/2) π2 = π1  -----(1)

π1 + (1/2) π2 = π2  ------(2)

Equations (1) and (2) are the same. Along with the equation π1 + π2 = 1, ---(3)


since π is a probability distribution, we obtain

(1/2) π2 + π2 = 1, i.e. (3/2) π2 = 1, so π2 = 2/3 and π1 = 1/3.

Hence π1 = 1/3, π2 = 2/3.

Therefore P[the man travels by car in the long run] = 2/3
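Both answers in Example 17 can be confirmed by direct computation. A sketch with the states ordered (train, car):

```python
P = [[0.0, 1.0],      # from train: never train twice in a row
     [0.5, 0.5]]      # from car: equally likely to drive or take the train

def step(p, P):
    """One transition: row vector p times matrix P."""
    n = len(P)
    return [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

p = [5/6, 1/6]        # day 1: drives only if the die shows a 6
p = step(p, P)        # day 2
p = step(p, P)        # day 3
print(p[0])           # P(train on the third day) = 11/24

pi = [0.5, 0.5]
for _ in range(100):  # long run: converges to the stationary distribution
    pi = step(pi, P)
print(pi[1])          # P(drives in the long run) = 2/3
```

The iteration converges quickly here because the second eigenvalue of P has magnitude 1/2.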


18. Three boys A,B and C are throwing a ball to each other. A always throws the ball to B and B always throws the ball to C, but C is just as likely to throw the ball to B as to A. Show that the process is Markovian. Find the transition probability matrix and classify the states.

Solution: Let A,B,C denote the states of the Markov chain.


The transition probability matrix of {Xn} is given by

          A     B     C
     A [  0     1     0  ]
P =  B [  0     0     1  ]
     C [ 1/2   1/2    0  ]

Since the state of Xn depends only on Xn-1 and not on Xn-2, Xn-3, …, the process {Xn} is a Markov chain.

We first observe that the chain is finite.

(Draw the tpm/ network)

Page 220:

Markov Chains

We observe that all the states communicate with each other. Hence, the MC is irreducible.

Since the MC is finite, all the states are positive recurrent.

Further, since state A is aperiodic (WHY??? Hint: the chain can return to A in 3 or in 5 steps, and gcd(3, 5) = 1), all the states are aperiodic. Being positive recurrent and aperiodic, all the states are ergodic.
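Ergodicity can also be seen numerically: for an irreducible, aperiodic, positive recurrent chain, every row of P^n converges to the same long-run distribution, here (1/5, 2/5, 2/5), which the reader can confirm solves πP = π. The check below is an illustration added here, not part of the slides.

```python
def matmul(A, B):
    """Multiply two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.0, 1.0, 0.0],    # A always throws to B
     [0.0, 0.0, 1.0],    # B always throws to C
     [0.5, 0.5, 0.0]]    # C throws to A or B with probability 1/2 each

Pn = P
for _ in range(50):      # compute P^51
    Pn = matmul(Pn, P)

# every row has converged close to (0.2, 0.4, 0.4)
print([round(x, 4) for x in Pn[0]])   # [0.2, 0.4, 0.4]
```

Convergence is fast because the two non-unit eigenvalues of P have modulus about 0.707, so the error shrinks geometrically with each step.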

Page 221:

Markov Chains

19. The transition probability matrix of a Markov chain {Xn} with three states 1, 2 and 3 is

        [ 0.1  0.5  0.4 ]
    P = [ 0.6  0.2  0.2 ]
        [ 0.3  0.4  0.3 ]

and the initial distribution is p(0) = (0.7, 0.2, 0.1). Find

(i) P[X2 = 3] (ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2].

Solution: Consider

Page 222:

Markov Chains

P(2) = P^2 = [ 0.1  0.5  0.4 ] [ 0.1  0.5  0.4 ]   [ 0.43  0.31  0.26 ]
             [ 0.6  0.2  0.2 ] [ 0.6  0.2  0.2 ] = [ 0.24  0.42  0.34 ]
             [ 0.3  0.4  0.3 ] [ 0.3  0.4  0.3 ]   [ 0.36  0.35  0.29 ]

(i) P[X2 = 3] = Σ_i P[X2 = 3 | X0 = i] P[X0 = i], the sum running over i = 1, 2, 3

= P[X2 = 3 | X0 = 1] P[X0 = 1] + P[X2 = 3 | X0 = 2] P[X0 = 2] + P[X2 = 3 | X0 = 3] P[X0 = 3]

= p13^(2) P[X0 = 1] + p23^(2) P[X0 = 2] + p33^(2) P[X0 = 3]

Page 223:

Markov Chains

P[X2 = 3] = 0.26 x 0.7 + 0.34 x 0.2 + 0.29 x 0.1

= 0.182 + 0.068 + 0.029 = 0.279

(ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2]

= P[X3 = 2 | X2 = 3, X1 = 3, X0 = 2] P[X2 = 3, X1 = 3, X0 = 2]   (conditional probability)

= P[X3 = 2 | X2 = 3] P[X2 = 3 | X1 = 3] P[X1 = 3 | X0 = 2] P[X0 = 2]   (Markov property)

= p32^(1) p33^(1) p23^(1) P[X0 = 2]

= 0.4 x 0.3 x 0.2 x 0.2 = 0.0048
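Both answers can be reproduced in a few lines; the following is a verification sketch added here, not part of the slides.

```python
def matmul(A, B):
    """Multiply two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.1, 0.5, 0.4],
     [0.6, 0.2, 0.2],
     [0.3, 0.4, 0.3]]
p0 = [0.7, 0.2, 0.1]      # initial distribution over states 1, 2, 3

P2 = matmul(P, P)

# (i) P[X2 = 3]: weight the third column of P^2 by the initial distribution
ans1 = sum(p0[i] * P2[i][2] for i in range(3))

# (ii) P[X0 = 2] * p23 * p33 * p32, by the chain rule and the Markov property
ans2 = p0[1] * P[1][2] * P[2][2] * P[2][1]

print(round(ans1, 3), round(ans2, 4))   # 0.279 0.0048
```

Note the zero-based indexing: state 3 of the chain is column index 2 of the Python matrix.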

Page 224:

Markov Chains

20. A gambler has Rs. 2. He bets Re. 1 at a time and wins Re. 1 with probability 0.5. He stops playing if he loses Rs. 2 or wins Rs. 4. What is the transition probability matrix of the related Markov chain?

Solution : HW
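One way to set up this homework (a sketch of the state space only, under the natural choice of taking the gambler's capital in rupees as the state): he starts at 2, and the game stops at 0 (down Rs. 2) or at 6 (up Rs. 4), so those two states are absorbing.

```python
# States are the gambler's capital in rupees: 0, 1, ..., 6.
# 0 (lost Rs. 2) and 6 (won Rs. 4) are absorbing; from every
# interior state he moves one rupee up or down with probability 0.5.
N = 7
P = [[0.0] * N for _ in range(N)]
P[0][0] = 1.0          # ruined: stays at 0
P[N - 1][N - 1] = 1.0  # goal reached: stays at 6
for i in range(1, N - 1):
    P[i][i - 1] = 0.5  # loses Re. 1
    P[i][i + 1] = 0.5  # wins Re. 1

for row in P:
    print(row)
```

Each row sums to 1, as a stochastic matrix must; writing out these rows by hand gives the required tpm.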

Page 225:

Markov Chains

21. There are 2 white marbles in urn A and 3 red marbles in urn B. At each step of the process, a marble is selected from each urn and the 2 marbles selected are interchanged. Let the state ai of the system be the number of red marbles in A after i changes. What is the probability that there are 2 red marbles in A after 3 steps? In the long run, what is the probability that there are 2 red marbles in urn A?

Solution : HW

Page 226:

Markov Chains

22. Find the nature of the states of the Markov chain with the TPM

        [  0    1    0  ]
    P = [ 1/2   0   1/2 ]
        [  0    1    0  ]

Solution : HW

Page 227:

Thank You