14. Stochastic Processes

Ali Aghagolzadeh

Introduction

Let ξ denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ξ) is assigned. The collection of such waveforms forms a stochastic process. The set of outcomes {ξ_k} and the time index t can each be continuous or discrete (countably infinite or finite). For fixed ξ_i ∈ S (the set of all experimental outcomes), X(t, ξ_i) is a specific time function. For fixed t = t₁, X₁ = X(t₁, ξ) is a random variable. The ensemble of all such realizations X(t, ξ) over time represents the stochastic process X(t) (see Fig. 14.1).

Fig. 14.1: An ensemble of realizations X(t, ξ₁), X(t, ξ₂), …, X(t, ξ_k), …, X(t, ξ_n) of the process X(t, ξ), with two sampling instants t₁ and t₂ marked.


For example, X(t) = a cos(ω₀t + φ), where φ is a uniformly distributed random variable in (0, 2π), represents a stochastic process. Stochastic processes are everywhere: Brownian motion, stock market fluctuations, and various queuing systems all represent stochastic phenomena.

If X(t) is a stochastic process, then for fixed t, X(t) represents a random variable. Its distribution function is given by

    F_X(x, t) = P\{ X(t) \le x \}.                                              (14-1)

Notice that F_X(x, t) depends on t, since for a different t we obtain a different random variable. Further,

    f_X(x, t) = \frac{\partial F_X(x, t)}{\partial x}                           (14-2)

represents the first-order probability density function of the process X(t).


For t = t₁ and t = t₂, X(t) represents two different random variables X₁ = X(t₁) and X₂ = X(t₂), respectively. Their joint distribution is given by

    F_X(x_1, x_2, t_1, t_2) = P\{ X(t_1) \le x_1, \; X(t_2) \le x_2 \}          (14-3)

and

    f_X(x_1, x_2, t_1, t_2) = \frac{\partial^2 F_X(x_1, x_2, t_1, t_2)}{\partial x_1 \, \partial x_2}        (14-4)

represents the second-order density function of the process X(t). Similarly f_X(x_1, x_2, \ldots, x_n, t_1, t_2, \ldots, t_n) represents the nth-order density function of the process X(t). Complete specification of the stochastic process X(t) requires the knowledge of f_X(x_1, x_2, \ldots, x_n, t_1, t_2, \ldots, t_n) for all t_i, i = 1, 2, \ldots, n and for all n (an almost impossible task in reality).


Mean of a Stochastic Process:

    \mu_X(t) = E\{X(t)\} = \int_{-\infty}^{\infty} x \, f_X(x, t) \, dx         (14-5)

represents the mean value of the process X(t). In general, the mean of a process can depend on the time index t.

The autocorrelation function of a process X(t) is defined as

    R_{XX}(t_1, t_2) = E\{ X(t_1)\, X^*(t_2) \} = \int\!\!\int x_1 x_2^* \, f_X(x_1, x_2, t_1, t_2)\, dx_1\, dx_2        (14-6)

and it represents the interrelationship between the random variables X₁ = X(t₁) and X₂ = X(t₂) generated from the process X(t).

Properties:

1.  R_{XX}(t_1, t_2) = R_{XX}^*(t_2, t_1) = [\, E\{ X(t_2)\, X^*(t_1) \} \,]^*        (14-7)

2.  R_{XX}(t, t) = E\{ |X(t)|^2 \} \ge 0.   (Average instantaneous power)


3.  R_{XX}(t_1, t_2) represents a nonnegative definite function, i.e., for any set of constants \{a_i\}_{i=1}^{n},

    \sum_{i=1}^{n} \sum_{j=1}^{n} a_i a_j^* \, R_{XX}(t_i, t_j) \ge 0.          (14-8)

Eq. (14-8) follows by noticing that E\{|Y|^2\} \ge 0 for Y = \sum_{i=1}^{n} a_i X(t_i).

The function

    C_{XX}(t_1, t_2) = R_{XX}(t_1, t_2) - \mu_X(t_1)\, \mu_X^*(t_2)             (14-9)

represents the autocovariance function of the process X(t).

Example 14.1

Let

    z = \int_{-T}^{T} X(t)\, dt.

Then

    E[\,|z|^2\,] = \int_{-T}^{T}\!\int_{-T}^{T} E\{ X(t_1)\, X^*(t_2) \}\, dt_1\, dt_2
                 = \int_{-T}^{T}\!\int_{-T}^{T} R_{XX}(t_1, t_2)\, dt_1\, dt_2.        (14-10)


Example 14.2

    X(t) = a \cos(\omega_0 t + \varphi), \qquad \varphi \sim U(0, 2\pi).        (14-11)

This gives

    \mu_X(t) = E\{X(t)\} = a\, E\{\cos(\omega_0 t + \varphi)\}
             = a \cos\omega_0 t \; E\{\cos\varphi\} - a \sin\omega_0 t \; E\{\sin\varphi\} = 0,        (14-12)

since E\{\cos\varphi\} = \frac{1}{2\pi}\int_0^{2\pi} \cos\varphi \, d\varphi = 0 = E\{\sin\varphi\}.

Similarly,

    R_{XX}(t_1, t_2) = a^2 E\{ \cos(\omega_0 t_1 + \varphi)\, \cos(\omega_0 t_2 + \varphi) \}
                     = \frac{a^2}{2} E\{ \cos\omega_0(t_1 - t_2) + \cos(\omega_0(t_1 + t_2) + 2\varphi) \}
                     = \frac{a^2}{2} \cos\omega_0(t_1 - t_2).                    (14-13)
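As a quick illustration of (14-12)-(14-13), the following minimal Monte Carlo sketch (Python with NumPy; the values a = 2, ω₀ = 2π and the two time instants are arbitrary choices, not from the slides) draws many realizations of the random-phase cosine and checks that the sample mean is near zero and that the ensemble correlation is near (a²/2) cos ω₀(t₁ − t₂).

    import numpy as np

    rng = np.random.default_rng(0)

    a, w0 = 2.0, 2 * np.pi          # illustrative amplitude and angular frequency
    t1, t2 = 0.3, 0.7               # two fixed time instants
    M = 200_000                     # number of realizations (outcomes xi)

    phi = rng.uniform(0, 2 * np.pi, size=M)      # phase ~ U(0, 2*pi)
    x1 = a * np.cos(w0 * t1 + phi)               # X(t1) across the ensemble
    x2 = a * np.cos(w0 * t2 + phi)               # X(t2) across the ensemble

    print("sample mean E{X(t1)}        :", x1.mean())                  # ~ 0, cf. (14-12)
    print("sample R_XX(t1,t2)          :", np.mean(x1 * x2))           # ensemble average
    print("theory (a^2/2)cos w0(t1-t2) :", 0.5 * a**2 * np.cos(w0 * (t1 - t2)))  # (14-13)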


Stationary Stochastic Processes

Stationary processes exhibit statistical properties that are invariant to shifts in the time index. Thus, for example, second-order stationarity implies that the statistical properties of the pairs {X(t₁), X(t₂)} and {X(t₁+c), X(t₂+c)} are the same for any c. Similarly, first-order stationarity implies that the statistical properties of X(tᵢ) and X(tᵢ+c) are the same for any c.

In strict terms, the statistical properties are governed by the joint probability density function. Hence a process is nth-order Strict-Sense Stationary (S.S.S.) if

    f_X(x_1, x_2, \ldots, x_n, t_1, t_2, \ldots, t_n) \equiv f_X(x_1, x_2, \ldots, x_n, t_1 + c, t_2 + c, \ldots, t_n + c)        (14-14)

for any c, where the left side represents the joint density function of the random variables X₁ = X(t₁), X₂ = X(t₂), …, Xₙ = X(tₙ), and the right side corresponds to the joint density function of the random variables X′₁ = X(t₁+c), X′₂ = X(t₂+c), …, X′ₙ = X(tₙ+c).

A process X(t) is said to be strict-sense stationary if (14-14) is true for all tᵢ, i = 1, 2, …, n, for all n = 1, 2, …, and for any c.


For a first-order strict-sense stationary process, from (14-14) we have

    f_X(x, t) \equiv f_X(x, t + c)                                              (14-15)

for any c. In particular, c = -t gives

    f_X(x, t) = f_X(x),                                                         (14-16)

i.e., the first-order density of X(t) is independent of t. In that case

    E[X(t)] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx = \mu, \ \text{a constant.}        (14-17)

Similarly, for a second-order strict-sense stationary process we have from (14-14)

    f_X(x_1, x_2, t_1, t_2) \equiv f_X(x_1, x_2, t_1 + c, t_2 + c)              (14-18)

for any c. For c = -t₂ we get


    f_X(x_1, x_2, t_1, t_2) \equiv f_X(x_1, x_2, t_1 - t_2),

i.e., the second-order density function of a strict-sense stationary process depends only on the difference of the time indices t₁ - t₂. In that case the autocorrelation function is given by

    R_{XX}(t_1, t_2) = E\{ X(t_1)\, X^*(t_2) \}
                     = \int\!\!\int x_1 x_2^* \, f_X(x_1, x_2, \tau = t_1 - t_2)\, dx_1\, dx_2
                     = R_{XX}(t_1 - t_2) \triangleq R_{XX}(\tau) = R_{XX}^*(-\tau),        (14-19)

i.e., the autocorrelation function of a second-order strict-sense stationary process depends only on the difference of the time indices τ = t₁ - t₂.

Notice that (14-17) and (14-19) are consequences of the stochastic process being first- and second-order strict-sense stationary. On the other hand, the basic conditions for first- and second-order stationarity, Eqs. (14-16) and (14-18), are usually difficult to verify. In that case, we often resort to a looser definition of stationarity, known as Wide-Sense Stationarity (W.S.S.), by making use of


(14-17) and (14-19) as the necessary conditions. Thus, a process X(t) is said to be Wide-Sense Stationary if

(i)   E\{X(t)\} = \mu                                                           (14-20)

and

(ii)  E\{ X(t_1)\, X^*(t_2) \} = R_{XX}(t_1 - t_2),                             (14-21)

i.e., for wide-sense stationary processes, the mean is a constant and the autocorrelation function depends only on the difference between the time indices. Notice that (14-20)-(14-21) do not say anything about the nature of the probability density functions; they deal only with the average behavior of the process. Since (14-20)-(14-21) follow from (14-16) and (14-18), strict-sense stationarity always implies wide-sense stationarity. However, the converse is not true in general, the only exception being the Gaussian process.

This follows since, if X(t) is a Gaussian process, then by definition X₁ = X(t₁), X₂ = X(t₂), …, Xₙ = X(tₙ) are jointly Gaussian random variables for any t₁, t₂, …, tₙ, whose joint characteristic function is given by


    \phi_X(\omega_1, \omega_2, \ldots, \omega_n) = e^{\, j \sum_{k} \mu(t_k)\,\omega_k \; - \; \frac{1}{2}\sum_{i,k} C_{XX}(t_i, t_k)\, \omega_i \omega_k },        (14-22)

where C_{XX}(t_i, t_k) is as defined in (14-9). If X(t) is wide-sense stationary, then using (14-20)-(14-21) in (14-22) we get

    \phi_X(\omega_1, \omega_2, \ldots, \omega_n) = e^{\, j \mu \sum_{k} \omega_k \; - \; \frac{1}{2}\sum_{i=1}^{n}\sum_{k=1}^{n} C_{XX}(t_i - t_k)\, \omega_i \omega_k },        (14-23)

and hence if the set of time indices is shifted by a constant c to generate a new set of jointly Gaussian random variables X′₁ = X(t₁+c), X′₂ = X(t₂+c), …, X′ₙ = X(tₙ+c), then their joint characteristic function is identical to (14-23). Thus the sets of random variables {Xᵢ}ᵢ₌₁ⁿ and {X′ᵢ}ᵢ₌₁ⁿ have the same joint probability distribution for all n and all c, establishing the strict-sense stationarity of Gaussian processes from their wide-sense stationarity.

To summarize, if X(t) is a Gaussian process, then

    wide-sense stationarity (w.s.s.)  ⇒  strict-sense stationarity (s.s.s.).

Notice that since the joint p.d.f. of Gaussian random variables depends only on their second-order statistics, which is also the basis


for wide-sense stationarity, we obtain strict-sense stationarity as well.

From (14-12)-(14-13) (refer to Example 14.2), the process X(t) = a cos(ω₀t + φ) in (14-11) is wide-sense stationary, but not strict-sense stationary.

Similarly, if X(t) is a zero-mean wide-sense stationary process in Example 14.1, then σ_z² = E{|z|²} in (14-10) reduces to

    \sigma_z^2 = E\{|z|^2\} = \int_{-T}^{T}\!\int_{-T}^{T} R_{XX}(t_1 - t_2)\, dt_1\, dt_2.

As t₁ and t₂ vary from -T to +T, τ = t₁ - t₂ varies from -2T to +2T. Moreover, R_{XX}(τ) is constant over the shaded strip τ = t₁ - t₂ in Fig. 14.2, whose area is given by

    \tfrac{1}{2}(2T - \tau)^2 - \tfrac{1}{2}(2T - \tau - d\tau)^2 \simeq (2T - |\tau|)\, d\tau,

and hence the above integral reduces to

    \sigma_z^2 = \int_{-2T}^{2T} R_{XX}(\tau)\,(2T - |\tau|)\, d\tau = 2T \int_{-2T}^{2T} R_{XX}(\tau)\left(1 - \frac{|\tau|}{2T}\right) d\tau.        (14-24)

Fig. 14.2: The (t₁, t₂) integration square [-T, T] × [-T, T] with the strip t₁ - t₂ = τ shaded.


Systems with Stochastic Inputs

A deterministic system¹ transforms each input waveform X(t, ξᵢ) into an output waveform Y(t, ξᵢ) = T[X(t, ξᵢ)] by operating only on the time variable t. Thus a set of realizations at the input corresponding to a process X(t) generates a new set of realizations {Y(t, ξ)} at the output, associated with a new process Y(t).

Fig. 14.3: A realization X(t, ξᵢ) applied to a system T[·] produces the output realization Y(t, ξᵢ).

Our goal is to study the output process statistics in terms of the input process statistics and the system function.

¹ A stochastic system, on the other hand, operates on both the variables t and ξ.


Deterministic systems fall into two classes:

- Memoryless systems: Y(t) = g[X(t)].
- Systems with memory: Y(t) = L[X(t)]. These include linear systems, which may be time-varying or time-invariant. Linear time-invariant (LTI) systems are described by the convolution

    Y(t) = \int_{-\infty}^{\infty} h(t - \tau)\, X(\tau)\, d\tau = \int_{-\infty}^{\infty} h(\tau)\, X(t - \tau)\, d\tau,

  where h(t) is the impulse response of the LTI system.


Memoryless Systems: The output Y(t) in this case depends only on the present value of the input X(t), i.e.,

    Y(t) = g\{X(t)\}.                                                           (14-25)

Fig. 14.4 summarizes how stationarity propagates through a memoryless system:

- Strict-sense stationary input  →  strict-sense stationary output.
- Wide-sense stationary input  →  output need not be stationary in any sense.
- X(t) stationary Gaussian with autocorrelation R_{XX}(τ)  →  Y(t) stationary, but not Gaussian, with R_{XY}(τ) = η R_{XX}(τ) (see (14-26)).

(See (9-76), Text, for a proof.)


Theorem: If X(t) is a zero-mean stationary Gaussian process and Y(t) = g[X(t)], where g(·) represents a nonlinear memoryless device, then

    R_{XY}(\tau) = \eta\, R_{XX}(\tau), \qquad \eta = E\{ g'(X) \}.             (14-26)

Proof:

    R_{XY}(\tau) = E\{ X(t_1)\, Y(t_2) \} = E\{ X(t_1)\, g[X(t_2)] \}
                 = \int\!\!\int x_1\, g(x_2)\, f_{X_1 X_2}(x_1, x_2)\, dx_1\, dx_2,        (14-27)

where X₁ = X(t₁) and X₂ = X(t₂) are jointly Gaussian random variables, and hence

    f_{X_1 X_2}(x_1, x_2) = \frac{1}{2\pi |A|^{1/2}}\, e^{-\underline{x}^* A^{-1} \underline{x}/2},

    \underline{x} = (x_1, x_2)^T, \quad \underline{X} = (X_1, X_2)^T,
    \quad A = E\{ \underline{X}\, \underline{X}^* \} = \begin{pmatrix} R_{XX}(0) & R_{XX}(\tau) \\ R_{XX}(\tau) & R_{XX}(0) \end{pmatrix} = L L^*,


where L is an upper triangular factor matrix with positive diagonal entries, i.e.,

    L = \begin{pmatrix} l_{11} & l_{12} \\ 0 & l_{22} \end{pmatrix}.

Consider the transformation

    \underline{Z} = L^{-1} \underline{X} = (Z_1, Z_2)^T, \qquad \underline{z} = L^{-1} \underline{x} = (z_1, z_2)^T,

so that

    E\{ \underline{Z}\, \underline{Z}^* \} = L^{-1} E\{ \underline{X}\, \underline{X}^* \}\, L^{*-1} = L^{-1} A\, L^{*-1} = I,

and hence Z₁, Z₂ are zero-mean independent Gaussian random variables. Also

    \underline{x} = L \underline{z} \;\Rightarrow\; x_1 = l_{11} z_1 + l_{12} z_2, \quad x_2 = l_{22} z_2,

and hence

    \underline{x}^* A^{-1} \underline{x} = \underline{z}^* L^* A^{-1} L\, \underline{z} = \underline{z}^* \underline{z} = z_1^2 + z_2^2.

The Jacobian of the transformation is given by


    |J| = |L^{-1}| = |A|^{-1/2}.

Hence, substituting these into (14-27), we obtain

    R_{XY}(\tau) = \int\!\!\int (l_{11} z_1 + l_{12} z_2)\, g(l_{22} z_2)\, \frac{1}{2\pi}\, e^{-z_1^2/2}\, e^{-z_2^2/2}\, dz_1\, dz_2
                 = \int\!\!\int (l_{11} z_1 + l_{12} z_2)\, g(l_{22} z_2)\, f_{z_1}(z_1)\, f_{z_2}(z_2)\, dz_1\, dz_2
                 = l_{11} \int z_1 f_{z_1}(z_1)\, dz_1 \int g(l_{22} z_2)\, f_{z_2}(z_2)\, dz_2
                   + l_{12} \int z_2\, g(l_{22} z_2)\, f_{z_2}(z_2)\, dz_2
                 = l_{12} \int z_2\, g(l_{22} z_2)\, \frac{1}{\sqrt{2\pi}}\, e^{-z_2^2/2}\, dz_2
                 = \frac{l_{12}}{l_{22}^2} \int u\, g(u)\, \frac{1}{\sqrt{2\pi}}\, e^{-u^2/2 l_{22}^2}\, du,

where u = l_{22} z_2 (the first term vanishes since E\{Z_1\} = 0).


Since the density of U = l_{22} Z_2 is f_U(u) = \frac{1}{\sqrt{2\pi}\, l_{22}}\, e^{-u^2/2 l_{22}^2}, we have \frac{d f_U(u)}{du} = -\frac{u}{l_{22}^2} f_U(u), so that, integrating by parts,

    R_{XY}(\tau) = l_{12}\, l_{22} \int g(u)\left( -\frac{d f_U(u)}{du} \right) du
                 = l_{12}\, l_{22} \int g'(u)\, f_U(u)\, du
                 = R_{XX}(\tau) \int g'(u)\, f_U(u)\, du = \eta\, R_{XX}(\tau),

since A = L L^* gives l_{12}\, l_{22} = R_{XX}(\tau), and since U is zero-mean Gaussian with variance l_{22}^2 = R_{XX}(0), i.e., it has the same distribution as X(t), so that the remaining integral equals E[g'(X)] = \eta. Hence we obtain the desired result. Thus if the input to a memoryless device is stationary Gaussian, the cross-correlation function between the input and the output is proportional to the input autocorrelation function.
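A small simulation makes (14-26) concrete. The sketch below (Python with NumPy/SciPy; the choice of a Gaussian AR(1) sample path for X and of g(x) = x³ as the memoryless nonlinearity are illustrative assumptions, not from the slides) estimates the cross-correlation and checks that it is proportional to the input autocorrelation with slope η = E{g′(X)} = 3E{X²}.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(1)

    # Stationary zero-mean Gaussian input: an AR(1) sample path (illustrative choice).
    N, a = 400_000, 0.9
    x = lfilter([1.0], [1.0, -a], rng.standard_normal(N))

    g = lambda u: u**3                    # memoryless nonlinearity, g'(u) = 3 u^2
    y = g(x)

    def xcorr(u, v, lag):
        """Sample cross-correlation E{u(t) v(t - lag)} for lag >= 0."""
        return np.mean(u[lag:] * v[:len(v) - lag])

    eta = 3 * np.mean(x**2)               # E{g'(X)} = 3 E{X^2}
    for lag in (0, 1, 2, 5, 10):
        rxx = xcorr(x, x, lag)
        rxy = xcorr(x, y, lag)
        print(f"lag {lag:2d}:  R_XY = {rxy:8.3f}   eta*R_XX = {eta * rxx:8.3f}")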


Linear Systems: L[·] represents a linear system if

    L\{ a_1 X(t_1) + a_2 X(t_2) \} = a_1 L\{ X(t_1) \} + a_2 L\{ X(t_2) \}.     (14-28)

Let

    Y(t) = L\{ X(t) \}                                                          (14-29)

represent the output of a linear system.

Time-Invariant System: L[·] represents a time-invariant system if

    Y(t) = L\{ X(t) \} \;\Rightarrow\; L\{ X(t - t_0) \} = Y(t - t_0),          (14-30)

i.e., a shift in the input results in the same shift in the output. If L[·] satisfies both (14-28) and (14-30), then it corresponds to a linear time-invariant (LTI) system.

LTI systems can be uniquely represented in terms of their output to a delta function: the impulse response h(t) (Fig. 14.5).


For an arbitrary input X(t), the LTI output is (Fig. 14.6)

    Y(t) = \int_{-\infty}^{\infty} h(t - \tau)\, X(\tau)\, d\tau = \int_{-\infty}^{\infty} h(\tau)\, X(t - \tau)\, d\tau.        (14-31)

Eq. (14-31) follows by expressing X(t) as

    X(t) = \int_{-\infty}^{\infty} X(\tau)\, \delta(t - \tau)\, d\tau            (14-32)

and applying (14-28) and (14-30) to Y(t) = L{X(t)}. Thus

    Y(t) = L\{ X(t) \} = L\left\{ \int_{-\infty}^{\infty} X(\tau)\, \delta(t - \tau)\, d\tau \right\}
         = \int_{-\infty}^{\infty} X(\tau)\, L\{ \delta(t - \tau) \}\, d\tau                 (by linearity)
         = \int_{-\infty}^{\infty} X(\tau)\, h(t - \tau)\, d\tau = \int_{-\infty}^{\infty} h(\tau)\, X(t - \tau)\, d\tau.        (by time-invariance)   (14-33)


Output Statistics: Using (14-33), the mean of the output process is given by

    \mu_Y(t) = E\{ Y(t) \} = \int_{-\infty}^{\infty} E\{ X(t - \tau) \}\, h(\tau)\, d\tau
             = \int_{-\infty}^{\infty} \mu_X(t - \tau)\, h(\tau)\, d\tau = \mu_X(t) * h(t).        (14-34)

Similarly, the cross-correlation function between the input and output processes is given by

    R_{XY}(t_1, t_2) = E\{ X(t_1)\, Y^*(t_2) \}
                     = E\left\{ X(t_1) \int X^*(t_2 - \alpha)\, h^*(\alpha)\, d\alpha \right\}
                     = \int E\{ X(t_1)\, X^*(t_2 - \alpha) \}\, h^*(\alpha)\, d\alpha
                     = \int R_{XX}(t_1, t_2 - \alpha)\, h^*(\alpha)\, d\alpha
                     = R_{XX}(t_1, t_2) * h^*(t_2).                              (14-35)

Finally, the output autocorrelation function is given by


    R_{YY}(t_1, t_2) = E\{ Y(t_1)\, Y^*(t_2) \}
                     = E\left\{ \int X(t_1 - \beta)\, h(\beta)\, d\beta \; Y^*(t_2) \right\}
                     = \int E\{ X(t_1 - \beta)\, Y^*(t_2) \}\, h(\beta)\, d\beta
                     = \int R_{XY}(t_1 - \beta, t_2)\, h(\beta)\, d\beta
                     = R_{XY}(t_1, t_2) * h(t_1),                                (14-36)

or

    R_{YY}(t_1, t_2) = R_{XX}(t_1, t_2) * h^*(t_2) * h(t_1).                     (14-37)

Fig. 14.7: (a) X(t) → h(t) → Y(t); (b) R_{XX}(t₁, t₂) → h*(t₂) → R_{XY}(t₁, t₂) → h(t₁) → R_{YY}(t₁, t₂).


In particular, if X(t) is wide-sense stationary, then we have \mu_X(t) = \mu_X, so that from (14-34)

    \mu_Y(t) = \mu_X \int_{-\infty}^{\infty} h(\tau)\, d\tau = \mu_X\, c, \ \text{a constant.}        (14-38)

Also R_{XX}(t_1, t_2) = R_{XX}(t_1 - t_2), so that (14-35) reduces to

    R_{XY}(t_1, t_2) = \int R_{XX}(t_1 - t_2 + \alpha)\, h^*(\alpha)\, d\alpha
                     = R_{XX}(\tau) * h^*(-\tau) = R_{XY}(\tau), \qquad \tau = t_1 - t_2.        (14-39)

Thus X(t) and Y(t) are jointly w.s.s. Further, from (14-36), the output autocorrelation simplifies to

    R_{YY}(t_1, t_2) = \int R_{XY}(t_1 - t_2 - \beta)\, h(\beta)\, d\beta
                     = R_{XY}(\tau) * h(\tau) = R_{YY}(\tau), \qquad \tau = t_1 - t_2.           (14-40)

From (14-37), we obtain

    R_{YY}(\tau) = R_{XX}(\tau) * h^*(-\tau) * h(\tau).                          (14-41)
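The relation (14-41) is easy to check numerically on a discrete grid. The sketch below (Python with NumPy/SciPy; the AR(1)-type input autocorrelation with a = 0.8 and the short FIR impulse response are arbitrary illustrative choices) forms R_YY by two discrete convolutions and compares it with a sample estimate obtained by filtering a long realization of a process having that autocorrelation.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(2)

    a, sigma_w = 0.8, 1.0                      # AR(1) parameters of the input (illustrative)
    h = np.array([1.0, 0.5, 0.25, 0.125])      # impulse response h[n] (illustrative)

    # Input autocorrelation R_XX[n] = sigma_w^2 * a^|n| / (1 - a^2) on a symmetric lag grid.
    L = 40
    lags = np.arange(-L, L + 1)
    Rxx = sigma_w**2 * a**np.abs(lags) / (1 - a**2)

    # Discrete form of (14-41): R_YY = R_XX * h*(-n) * h(n)  (two convolutions).
    Ryy_theory = np.convolve(np.convolve(Rxx, np.conj(h[::-1])), h)
    center = L + len(h) - 1                    # index of lag 0 after the two convolutions

    # Sample check: filter an AR(1) path with that autocorrelation through h.
    N = 500_000
    x = lfilter([1.0], [1.0, -a], sigma_w * rng.standard_normal(N))
    y = np.convolve(x, h)[:N]                  # output sample path

    for lag in (0, 1, 5, 10):
        emp = np.mean(y[lag:] * y[:N - lag])
        print(f"lag {lag:2d}: R_YY via (14-41) {Ryy_theory[center + lag]:8.3f}   sample {emp:8.3f}")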


From (14-38)-(14-40), the output process is also wide-sense stationary. This gives rise to the following representation (Fig. 14.8):

(a) A wide-sense stationary process X(t) through an LTI system h(t) yields a wide-sense stationary process Y(t).
(b) A strict-sense stationary process X(t) through an LTI system h(t) yields a strict-sense stationary process Y(t) (see Text for proof).
(c) A Gaussian process X(t) (also stationary) through a linear system yields a Gaussian process Y(t) (also stationary).


White Noise Process: W(t) is said to be a white noise process if

    R_{WW}(t_1, t_2) = q(t_1)\, \delta(t_1 - t_2),                              (14-42)

i.e., E[W(t₁) W*(t₂)] = 0 unless t₁ = t₂.

W(t) is said to be wide-sense stationary (w.s.s.) white noise if E[W(t)] is constant and

    R_{WW}(t_1, t_2) = q\, \delta(t_1 - t_2) = q\, \delta(\tau).                (14-43)

If W(t) is also a Gaussian process (white Gaussian noise), then all of its samples are independent random variables (why?).

Fig. 14.9: White noise W(t) through an LTI system h(t) produces colored noise N(t) = h(t) * W(t).

For a w.s.s. white noise input W(t), we have


    E[N(t)] = \mu_W \int_{-\infty}^{\infty} h(\tau)\, d\tau, \ \text{a constant,}        (14-44)

and

    R_{NN}(\tau) = q\, \delta(\tau) * h^*(-\tau) * h(\tau) = q\, \rho(\tau),              (14-45)

where

    \rho(\tau) = h(\tau) * h^*(-\tau) = \int_{-\infty}^{\infty} h(\alpha)\, h^*(\alpha - \tau)\, d\alpha.        (14-46)

Thus the output of a white noise process through an LTI system represents a (colored) noise process.

Note: White noise need not be Gaussian. "White" and "Gaussian" are two different concepts!
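The sketch below (Python with NumPy/SciPy; the 3-tap filter and unit noise power q = 1 are illustrative assumptions) passes discrete white Gaussian noise through an LTI filter and compares the estimated output autocorrelation with q·ρ(n) from (14-45)-(14-46), in their discrete form.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(3)

    q = 1.0                                   # white-noise power (illustrative)
    h = np.array([0.5, 1.0, -0.3])            # FIR impulse response h[n] (illustrative)

    N = 1_000_000
    w = np.sqrt(q) * rng.standard_normal(N)   # discrete white Gaussian noise, R_WW[n] = q*delta[n]
    ncol = lfilter(h, [1.0], w)               # colored noise N = h * W

    def rho(n):
        """Discrete version of (14-46): rho[n] = sum_a h[a] h*[a - n]."""
        return sum(h[a] * np.conj(h[a - n]) for a in range(len(h)) if 0 <= a - n < len(h))

    for lag in range(-3, 4):
        emp = np.mean(ncol[abs(lag):] * ncol[:N - abs(lag)])
        print(f"lag {lag:+d}:  q*rho = {q * rho(lag):+7.3f}   sample R_NN = {emp:+7.3f}")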


Upcrossings and Downcrossings of a Stationary Gaussian Process:

Consider a zero-mean stationary Gaussian process X(t) with autocorrelation function R_{XX}(τ). An upcrossing over the mean value occurs whenever the realization X(t) passes through zero with positive slope (Fig. 14.10). Let ρ Δt represent the probability of such an upcrossing in the interval (t, t + Δt). We wish to determine ρ.

Fig. 14.10: A realization X(t) with its upcrossings and a downcrossing of the zero level marked.

Since X(t) is a stationary Gaussian process, its derivative process X′(t) is also zero-mean stationary Gaussian, with autocorrelation function R_{X'X'}(\tau) = -R_{XX}''(\tau) (see (9-101)-(9-106), Text). Further, X(t) and X′(t) are jointly Gaussian stationary processes, and since (see (9-106), Text) the derivative of an even function is an odd function (and vice versa),


we have

    R_{XX'}(\tau) = -\frac{d R_{XX}(\tau)}{d\tau} = -R_{XX}'(\tau),             (14-47)

which for τ = 0 gives

    R_{XX'}(0) = 0 \;\Rightarrow\; E[X(t)\, X'(t)] = 0,                         (14-48)

i.e., the jointly Gaussian zero-mean random variables

    X_1 = X(t) \quad \text{and} \quad X_2 = X'(t)                               (14-49)

are uncorrelated and hence independent, with variances

    \sigma_1^2 = R_{XX}(0) \quad \text{and} \quad \sigma_2^2 = R_{X'X'}(0) = -R_{XX}''(0) > 0        (14-50)

respectively. Thus

    f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2}(x_2)
                          = \frac{1}{2\pi \sigma_1 \sigma_2}\, e^{-\left( \frac{x_1^2}{2\sigma_1^2} + \frac{x_2^2}{2\sigma_2^2} \right)}.        (14-51)

To determine ρ, the probability of an upcrossing per unit time,


we argue as follows: in an interval (t, t + Δt), the realization moves from X(t) = X₁ to

    X(t + \Delta t) \simeq X(t) + X'(t)\, \Delta t = X_1 + X_2\, \Delta t,

and hence the realization intersects the zero level somewhere in that interval if

    X_1 < 0, \quad X_2 > 0, \quad \text{and} \quad X(t + \Delta t) \simeq X_1 + X_2\, \Delta t > 0,
    \quad \text{i.e.,} \quad X_1 > -X_2\, \Delta t.                              (14-52)

Fig. 14.11: A realization rising from X(t) < 0 to X(t + Δt) > 0 within the interval (t, t + Δt).

Hence the probability of an upcrossing in (t, t + Δt) is given by

    \rho\, \Delta t = \int_{x_2 = 0}^{\infty} \int_{x_1 = -x_2 \Delta t}^{0} f_{X_1 X_2}(x_1, x_2)\, dx_1\, dx_2
                    = \int_{0}^{\infty} f_{X_2}(x_2)\, dx_2 \int_{-x_2 \Delta t}^{0} f_{X_1}(x_1)\, dx_1.        (14-53)

Differentiating both sides of (14-53) with respect to Δt, we get

    \rho = \int_{0}^{\infty} f_{X_2}(x_2)\, x_2\, f_{X_1}(-x_2 \Delta t)\, dx_2,        (14-54)

and letting Δt → 0, Eq. (14-54) reduces to


    \rho = \int_{0}^{\infty} x_2\, f_{X_2}(x_2)\, f_{X_1}(0)\, dx_2
         = \frac{1}{\sqrt{2\pi}\, \sigma_1} \cdot \frac{\sigma_2}{\sqrt{2\pi}}
         = \frac{1}{2\pi} \sqrt{ \frac{-R_{XX}''(0)}{R_{XX}(0)} }               (14-55)

[where we have made use of (5-78), Text]. There is an equal probability for downcrossings, and hence the total probability of crossing the zero line in an interval (t, t + Δt) equals ρ₀ Δt, where

    \rho_0 = \frac{1}{\pi} \sqrt{ \frac{-R_{XX}''(0)}{R_{XX}(0)} } > 0.          (14-56)

It follows that in a long interval T, there will be approximately ρ₀T crossings of the mean value. If -R_{XX}''(0) is large, then the autocorrelation function R_{XX}(τ) decays more rapidly as τ moves away from zero, implying a large random variation around the origin (mean value) for X(t), and the likelihood of zero crossings should increase with an increase in -R_{XX}''(0), agreeing with (14-56).
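A simulation can be set against (14-56). The sketch below (Python with NumPy/SciPy; the Gaussian-shaped smoothing kernel, its width and the grid spacing are illustrative assumptions) builds a smooth stationary Gaussian process by filtering white noise, computes R_XX(0) and -R″_XX(0) from the kernel, and compares the observed number of zero crossings per unit time with ρ₀.

    import numpy as np
    from scipy.signal import fftconvolve

    rng = np.random.default_rng(4)

    dt = 0.01                                       # sampling step of the fine grid
    tk = np.arange(-4, 4 + dt, dt)                  # kernel support
    h = np.exp(-tk**2 / (2 * 0.5**2))               # Gaussian-shaped impulse response (width 0.5)
    h /= np.sqrt(np.sum(h**2) * dt)                 # normalize so that R_XX(0) = 1

    # X = h * W (unit-intensity white noise): R_XX(tau) = int h(a) h(a - tau) da.
    Rxx = np.convolve(h, h[::-1]) * dt              # autocorrelation on the lag grid
    i0 = len(h) - 1                                 # index of lag 0
    R2 = (Rxx[i0 + 1] - 2 * Rxx[i0] + Rxx[i0 - 1]) / dt**2   # R''_XX(0), central difference
    rho0 = np.sqrt(-R2 / Rxx[i0]) / np.pi           # predicted crossings per unit time, (14-56)

    # One long realization: count sign changes of the sampled path.
    N = 1_000_000
    w = rng.standard_normal(N) * np.sqrt(1 / dt)    # discrete approximation of unit-intensity white noise
    x = fftconvolve(w, h)[:N] * dt
    crossings = np.count_nonzero(np.diff(np.sign(x)) != 0)

    print("predicted crossing rate rho_0 :", rho0)
    print("observed crossing rate        :", crossings / (N * dt))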


Discrete-Time Stochastic Processes:

A discrete-time stochastic process X_n = X(nT) is a sequence of random variables. The mean, autocorrelation and autocovariance functions of a discrete-time process are given by

    \mu_n = E\{ X(nT) \},                                                       (14-57)

    R(n_1, n_2) = E\{ X(n_1 T)\, X^*(n_2 T) \},                                 (14-58)

and

    C(n_1, n_2) = R(n_1, n_2) - \mu_{n_1}\, \mu_{n_2}^*                         (14-59)

respectively. As before, the strict-sense stationarity and wide-sense stationarity definitions apply here also. For example, X(nT) is wide-sense stationary if

    E\{ X(nT) \} = \mu, \ \text{a constant,}                                    (14-60)

and

    E[\, X\{(k+n)T\}\, X^*\{kT\} \,] = R(n) = r_n = r_{-n}^*.                   (14-61)


i.e., R(n₁, n₂) = R(n₁ - n₂) = R*(n₂ - n₁). The positive-definite property of the autocorrelation sequence in (14-8) can be expressed in terms of certain Hermitian-Toeplitz matrices as follows:

Theorem: A sequence {r_n} forms an autocorrelation sequence of a wide-sense stationary stochastic process if and only if every Hermitian-Toeplitz matrix T_n given by

    T_n = \begin{pmatrix} r_0 & r_1 & r_2 & \cdots & r_n \\ r_1^* & r_0 & r_1 & \cdots & r_{n-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ r_n^* & r_{n-1}^* & r_{n-2}^* & \cdots & r_0 \end{pmatrix} = T_n^*        (14-62)

is non-negative (positive) definite for n = 0, 1, 2, \ldots.

Proof: Let \underline{a} = [a_0, a_1, \ldots, a_n]^T represent an arbitrary constant vector. Then from (14-62),

    \underline{a}^* T_n \underline{a} = \sum_{i=0}^{n} \sum_{k=0}^{n} a_i^* a_k\, r_{k-i},        (14-63)

since the Toeplitz character gives (T_n)_{i,k} = r_{k-i}. Using (14-61), Eq. (14-63) reduces to


    \underline{a}^* T_n \underline{a} = \sum_{i=0}^{n} \sum_{k=0}^{n} a_i^* a_k\, E\{ X(kT)\, X^*(iT) \}
                                      = E\left\{ \left| \sum_{k=0}^{n} a_k X(kT) \right|^2 \right\} \ge 0.        (14-64)

From (14-64), if X(nT) is a wide-sense stationary stochastic process, then T_n is a non-negative definite matrix for every n = 0, 1, 2, \ldots. Similarly, the converse also follows from (14-64) (see Section 9.4, Text).

If X(nT) represents a wide-sense stationary input to a discrete-time system {h(nT)}, and Y(nT) the system output, then as before the cross-correlation function satisfies

    R_{XY}(n) = R_{XX}(n) * h^*(-n)                                             (14-65)

and the output autocorrelation function is given by

    R_{YY}(n) = R_{XY}(n) * h(n),                                               (14-66)

or

    R_{YY}(n) = R_{XX}(n) * h^*(-n) * h(n).                                     (14-67)

Thus wide-sense stationarity from input to output is preserved for discrete-time systems also.
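The non-negative definite test in (14-62) is easy to run numerically. The sketch below (Python with NumPy/SciPy; both test sequences are illustrative) builds T_n with scipy.linalg.toeplitz and checks its smallest eigenvalue: the AR(1)-type sequence r_n = 0.8^{|n|} passes, while the sequence (1, 0.9, 0.3), whose lag-2 value is inconsistent with its lag-1 value, fails.

    import numpy as np
    from scipy.linalg import toeplitz

    def is_valid_autocorrelation(r):
        """Check (14-62): the Hermitian-Toeplitz matrix built from
        r = (r_0, r_1, ..., r_n) must be non-negative definite."""
        T = toeplitz(np.conj(r), r)          # first column conj(r), first row r
        lam_min = np.linalg.eigvalsh(T).min()
        return lam_min >= -1e-12, lam_min

    good = 0.8 ** np.arange(6)               # r_n = 0.8^n, a genuine WSS autocorrelation
    bad = np.array([1.0, 0.9, 0.3])          # lag-2 value drops too fast relative to lag 1

    for name, r in (("0.8^|n|", good), ("(1, 0.9, 0.3)", bad)):
        ok, lam_min = is_valid_autocorrelation(r)
        verdict = "valid" if ok else "not an autocorrelation"
        print(f"{name:15s} smallest eigenvalue = {lam_min:+.4f}  ->  {verdict}")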


Auto Regressive Moving Average (ARMA) Processes

Consider an input-output representation

    X(n) = -\sum_{k=1}^{p} a_k\, X(n-k) + \sum_{k=0}^{q} b_k\, W(n-k),          (14-68)

where X(n) may be considered as the output of a system {h(n)} driven by the input W(n) (Fig. 14.12: W(n) → h(n) → X(n)).

The Z-transform of (14-68) gives

    X(z) \sum_{k=0}^{p} a_k z^{-k} = W(z) \sum_{k=0}^{q} b_k z^{-k}, \qquad a_0 \equiv 1,        (14-69)

or

    H(z) = \sum_{k=0}^{\infty} h(k)\, z^{-k} = \frac{X(z)}{W(z)} = \frac{B(z)}{A(z)}
         = \frac{ b_0 + b_1 z^{-1} + b_2 z^{-2} + \cdots + b_q z^{-q} }{ 1 + a_1 z^{-1} + a_2 z^{-2} + \cdots + a_p z^{-p} }.        (14-70)


H(z) represents the transfer function of the associated system response {h(n)} in Fig. 14.12, so that

    X(n) = \sum_{k=0}^{\infty} h(n-k)\, W(k).                                    (14-71)

Notice that the transfer function H(z) in (14-70) is rational, with p poles and q zeros that determine the model order of the underlying system. From (14-68), the output undergoes regression over p of its previous values, and at the same time a moving average based on W(n), W(n-1), …, W(n-q), i.e., on (q + 1) values of the input, is added to it, thus generating an Auto Regressive Moving Average (ARMA(p, q)) process X(n). Generally the input {W(n)} represents a sequence of uncorrelated random variables of zero mean and constant variance \sigma_W^2, so that

    R_{WW}(n) = \sigma_W^2\, \delta(n).                                          (14-72)

If, in addition, {W(n)} is normally distributed, then the output {X(n)} also represents a strict-sense stationary normal process.

If q = 0, then (14-68) represents an AR(p) process (all-pole process), and if p = 0, then (14-68) represents an MA(q) process (all-zero process).
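As a concrete illustration of (14-68)-(14-70), the sketch below (Python with NumPy/SciPy; the particular ARMA(2, 1) coefficients are arbitrary illustrative choices) generates an ARMA sample path with scipy.signal.lfilter, whose (b, a) arguments are exactly the numerator and denominator coefficient lists of H(z) in (14-70), and checks it against the recursion (14-68) written out explicitly.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(5)

    # H(z) = B(z)/A(z) with A(z) = 1 + a1 z^-1 + a2 z^-2, B(z) = b0 + b1 z^-1  (illustrative ARMA(2,1))
    a = [1.0, -1.2, 0.5]          # (1, a1, a2); poles inside the unit circle
    b = [1.0, 0.4]                # (b0, b1)

    N = 100_000
    w = rng.standard_normal(N)                  # white input W(n)
    x = lfilter(b, a, w)                        # ARMA(2,1) output X(n), per (14-68)

    # The same recursion explicitly: X(n) = -a1 X(n-1) - a2 X(n-2) + b0 W(n) + b1 W(n-1).
    x_check = np.zeros(N)
    for n in range(N):
        xm1 = x_check[n - 1] if n >= 1 else 0.0
        xm2 = x_check[n - 2] if n >= 2 else 0.0
        wm1 = w[n - 1] if n >= 1 else 0.0
        x_check[n] = -a[1] * xm1 - a[2] * xm2 + b[0] * w[n] + b[1] * wm1

    print("max |lfilter - recursion| :", np.max(np.abs(x - x_check)))   # ~ 1e-12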


Next, we shall discuss AR(1) and AR(2) processes through explicit calculations.

AR(1) process: An AR(1) process has the form (see (14-68))

    X(n) = a\, X(n-1) + W(n),                                                    (14-73)

and from (14-70) the corresponding system transfer function is

    H(z) = \frac{1}{1 - a z^{-1}} = \sum_{n=0}^{\infty} a^n z^{-n},              (14-74)

provided |a| < 1. Thus

    h(n) = a^n, \qquad |a| < 1,                                                  (14-75)

represents the impulse response of a stable AR(1) system. Using (14-67) together with (14-72) and (14-75), we get the output autocorrelation sequence of an AR(1) process to be

    R_{XX}(n) = \sigma_W^2\, \delta(n) * \{ a^{-n} \} * \{ a^{n} \}
              = \sigma_W^2 \sum_{k=0}^{\infty} a^{|n|+k}\, a^{k} = \frac{\sigma_W^2\, a^{|n|}}{1 - a^2},        (14-76)

where we have made use of the discrete version of (14-46).
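The closed form (14-76) is easy to confirm by simulation. The sketch below (Python with NumPy/SciPy; a = 0.7 and σ_W = 1 are arbitrary choices) generates a long AR(1) path and compares sample autocorrelations with σ_W² a^{|n|}/(1 - a²).

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(6)

    a, sigma_w = 0.7, 1.0
    N = 1_000_000
    w = sigma_w * rng.standard_normal(N)
    x = lfilter([1.0], [1.0, -a], w)            # X(n) = a X(n-1) + W(n), cf. (14-73)

    for n in (0, 1, 2, 5, 10):
        emp = np.mean(x[n:] * x[:N - n])                       # sample E{X(m+n) X(m)}
        theory = sigma_w**2 * a**n / (1 - a**2)                # (14-76)
        print(f"n = {n:2d}:  sample {emp:7.4f}   theory {theory:7.4f}")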


The normalized (in terms of R_{XX}(0)) output autocorrelation sequence is given by

    \rho_X(n) = \frac{R_{XX}(n)}{R_{XX}(0)} = a^{|n|}, \qquad |n| \ge 0.        (14-77)

It is instructive to compare the AR(1) model discussed above with the result of superimposing a random component on it, which may be an error term associated with observing a first-order AR process X(n). Thus

    Y(n) = X(n) + V(n),                                                          (14-78)

where X(n) ~ AR(1) as in (14-73), and V(n) is an uncorrelated random sequence with zero mean and variance \sigma_V^2 that is also uncorrelated with {W(n)}. From (14-73) and (14-78) we obtain the autocorrelation of the observed process Y(n) to be

    R_{YY}(n) = R_{XX}(n) + R_{VV}(n) = \frac{\sigma_W^2\, a^{|n|}}{1 - a^2} + \sigma_V^2\, \delta(n),        (14-79)


so that its normalized version is given by

    \rho_Y(n) = \frac{R_{YY}(n)}{R_{YY}(0)} =
    \begin{cases} 1, & n = 0 \\ c\, a^{|n|}, & n = \pm 1, \pm 2, \ldots \end{cases}        (14-80)

where

    c = \frac{\sigma_W^2}{\sigma_W^2 + \sigma_V^2 (1 - a^2)} < 1.               (14-81)

Eqs. (14-77) and (14-80) demonstrate the effect of superimposing an error sequence on an AR(1) model. For non-zero lags n ≥ 1, the autocorrelation of the observed sequence {Y(n)} is reduced by the constant factor c compared to that of the original process {X(n)} (Fig. 14.13: ρ_X(0) = ρ_Y(0) = 1, while ρ_Y(k) < ρ_X(k) for k ≥ 1).

From (14-78), the superimposed error sequence V(n) only affects the corresponding term in Y(n) (term by term). However, a particular term in the "input sequence" W(n) affects X(n) and Y(n) as well as all subsequent observations.
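A short calculation makes the attenuation factor (14-81) visible. The sketch below (Python with NumPy/SciPy; a = 0.7, σ_W = 1, σ_V = 0.8 are arbitrary choices) simulates Y(n) = X(n) + V(n) and compares the sample ρ_Y(n) with c·a^{|n|}.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(7)

    a, sigma_w, sigma_v = 0.7, 1.0, 0.8
    N = 1_000_000
    x = lfilter([1.0], [1.0, -a], sigma_w * rng.standard_normal(N))   # AR(1) process X(n)
    y = x + sigma_v * rng.standard_normal(N)                          # observed process (14-78)

    c = sigma_w**2 / (sigma_w**2 + sigma_v**2 * (1 - a**2))           # attenuation factor (14-81)
    ry0 = np.mean(y * y)
    for n in (1, 2, 5):
        rho_emp = np.mean(y[n:] * y[:N - n]) / ry0
        print(f"n = {n}:  sample rho_Y = {rho_emp:.4f}   c*a^n = {c * a**n:.4f}")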


AR(2) Process: An AR(2) process has the form

    X(n) = a_1 X(n-1) + a_2 X(n-2) + W(n),                                      (14-82)

and from (14-70) the corresponding transfer function is given by

    H(z) = \sum_{n=0}^{\infty} h(n)\, z^{-n} = \frac{1}{1 - a_1 z^{-1} - a_2 z^{-2}}
         = \frac{b_1}{1 - \lambda_1 z^{-1}} + \frac{b_2}{1 - \lambda_2 z^{-1}},        (14-83)

so that

    h(0) = 1, \quad h(1) = a_1, \quad h(n) = a_1 h(n-1) + a_2 h(n-2), \; n \ge 2,        (14-84)

and in terms of the poles \lambda_1 and \lambda_2 of the transfer function, from (14-83) we have

    h(n) = b_1 \lambda_1^n + b_2 \lambda_2^n, \qquad n \ge 0,                    (14-85)

which represents the impulse response of the system. From (14-84)-(14-85), we also have

    b_1 + b_2 = 1, \qquad b_1 \lambda_1 + b_2 \lambda_2 = a_1.

From (14-83),

    \lambda_1 + \lambda_2 = a_1, \qquad \lambda_1 \lambda_2 = -a_2,              (14-86)

1/1/2021 Stochastic Processes - Ali Aghagolzadeh 41

and H(z) stable implies

Further, using (14-82) the output autocorrelations satisfy the recursion

and hence their normalized version is given by

By direct calculation using (14-67), the output autocorrelations are

given by

(14-88)

(14-87)

.1|| ,1|| 21

)2()1(

)}()({

)}()]2()1({[

)}()({)(

21

*

*

21

*

nRanRa

mXmnWE

mXmnXamnXaE

mXmnXEnR

XXXX

XX

0

2

2

*

2

2

2

*

21

*

2

*

21

2

*

1

*

12

*

1

2

1

*

1

2

12

0

*2

*2*

||1

)(||

1

)(

1

)(

||1

)(||

)()(

)()()()()()(

nnnn

k

bbbbbb

khknh

nhnhnhnhnRnR

W

W

WWWXX

(14-89)

1 2

( )( ) ( 1) ( 2).

(0)

XX

X X X

XX

R nn a n a n

R


where we have made use of (14-85). From (14-89), the normalized output autocorrelations may be expressed as

    \rho_X(n) = \frac{R_{XX}(n)}{R_{XX}(0)} = c_1 \lambda_1^{|n|} + c_2 \lambda_2^{|n|},        (14-90)

where c₁ and c₂ are appropriate constants.

Damped Exponentials: When the second-order system in (14-83)-(14-85) is real and corresponds to a damped exponential response, the poles are complex conjugates, which requires a_1^2 + 4 a_2 < 0 in (14-83). Thus

    \lambda_1 = r e^{j\theta}, \qquad \lambda_2 = \lambda_1^* = r e^{-j\theta}, \qquad r < 1.        (14-91)

In that case c_1 = c_2^* = c\, e^{j\varphi} in (14-90), so that the normalized correlations there reduce to

    \rho_X(n) = 2\, \mathrm{Re}\{ c_1 \lambda_1^{|n|} \} = 2 c\, r^{|n|} \cos(n\theta + \varphi).        (14-92)

But from (14-86),

    \lambda_1 + \lambda_2 = 2 r \cos\theta = a_1, \qquad r^2 = \lambda_1 \lambda_2 = -a_2 < 1,           (14-93)


and hence 2 r \sin\theta = \sqrt{ -(a_1^2 + 4 a_2) } > 0, which gives

    \tan\theta = \frac{ \sqrt{ -(a_1^2 + 4 a_2) } }{ a_1 }.                      (14-94)

Also, from (14-88), \rho_X(1) = a_1 \rho_X(0) + a_2 \rho_X(-1) = a_1 + a_2 \rho_X(1), so that

    \rho_X(1) = \frac{a_1}{1 - a_2} = 2 c\, r \cos(\theta + \varphi),            (14-95)

where the latter form is obtained from (14-92) with n = 1. But \rho_X(0) = 1 in (14-92) gives

    2 c \cos\varphi = 1, \qquad \text{or} \qquad c = \frac{1}{2\cos\varphi}.     (14-96)


Substituting (14-96) into (14-92) and (14-95), we obtain the normalized output autocorrelations as

    \rho_X(n) = (-a_2)^{n/2}\, \frac{ \cos(n\theta + \varphi) }{ \cos\varphi }, \qquad n \ge 0, \; |a_2| < 1,        (14-97)

where \varphi satisfies

    \frac{ \cos(\theta + \varphi) }{ \cos\varphi } = \frac{ a_1 }{ (1 - a_2)\sqrt{-a_2} }.        (14-98)

Thus the normalized autocorrelations of a damped second-order system with real coefficients, subject to random uncorrelated impulses, satisfy (14-97).
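To see that (14-88) and (14-97) agree, the sketch below (Python/NumPy; the coefficients a₁ = 1.0, a₂ = -0.64, which give complex poles, are an arbitrary illustrative choice) runs the autocorrelation recursion and evaluates the closed damped-cosine form with r = √(-a₂), θ from (14-94) and φ from (14-98).

    import numpy as np

    a1, a2 = 1.0, -0.64                     # AR(2) coefficients with a1^2 + 4 a2 < 0 (complex poles)
    assert a1**2 + 4 * a2 < 0

    r = np.sqrt(-a2)                                        # pole radius, r^2 = -a2, cf. (14-93)
    theta = np.arctan2(np.sqrt(-(a1**2 + 4 * a2)), a1)      # cf. (14-94)
    K = a1 / ((1 - a2) * np.sqrt(-a2))                      # right-hand side of (14-98)
    phi = np.arctan2(np.cos(theta) - K, np.sin(theta))      # solves cos(theta+phi)/cos(phi) = K

    # Recursion (14-88): rho(n) = a1 rho(n-1) + a2 rho(n-2), rho(0) = 1, rho(1) = a1/(1 - a2).
    Nlags = 8
    rho = np.empty(Nlags)
    rho[0], rho[1] = 1.0, a1 / (1 - a2)
    for n in range(2, Nlags):
        rho[n] = a1 * rho[n - 1] + a2 * rho[n - 2]

    # Closed form (14-97): rho(n) = (-a2)^(n/2) cos(n theta + phi) / cos(phi).
    n = np.arange(Nlags)
    rho_closed = (-a2) ** (n / 2) * np.cos(n * theta + phi) / np.cos(phi)

    print(np.max(np.abs(rho - rho_closed)))      # ~ 1e-15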


More on ARMA Processes

From (14-70), an ARMA(p, q) system has only p + q + 1 independent coefficients (a_k, 1 \le k \le p; \; b_i, 0 \le i \le q), and hence its impulse response sequence {h_k} must also exhibit a similar dependence among its terms. In fact, according to P. Dienes (The Taylor Series, 1931), an old result due to Kronecker¹ (1881) states that the necessary and sufficient condition for H(z) = \sum_{k=0}^{\infty} h_k z^{-k} to represent a rational system (ARMA) is that

    \det H_n = 0 \quad \text{for all sufficiently large } n \; (n \ge N),        (14-99)

where

    H_n = \begin{pmatrix} h_0 & h_1 & h_2 & \cdots & h_n \\ h_1 & h_2 & h_3 & \cdots & h_{n+1} \\ \vdots & \vdots & \vdots & & \vdots \\ h_n & h_{n+1} & h_{n+2} & \cdots & h_{2n} \end{pmatrix},        (14-100)

i.e., in the case of rational systems, for all sufficiently large n the Hankel matrices H_n in (14-100) all have the same rank.

¹ Among other things, "God created the integers and the rest is the work of man." (Leopold Kronecker)
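The sketch below (Python with NumPy/SciPy; the same illustrative ARMA(2, 1) coefficients as above) computes the impulse response of a rational H(z), builds the Hankel matrices H_n of (14-100) with scipy.linalg.hankel, and shows that their rank saturates (here at 2) while det H_n collapses to zero, in line with (14-99).

    import numpy as np
    from scipy.linalg import hankel
    from scipy.signal import lfilter

    # Rational system H(z) = B(z)/A(z): the illustrative ARMA(2,1) used earlier.
    b = [1.0, 0.4]
    a = [1.0, -1.2, 0.5]
    p = len(a) - 1

    # Impulse response h_0, h_1, ... obtained by driving the filter with a unit impulse.
    nmax = 6
    imp = np.zeros(2 * nmax + 1); imp[0] = 1.0
    h = lfilter(b, a, imp)

    for n in range(1, nmax + 1):
        Hn = hankel(h[:n + 1], h[n:2 * n + 1])   # (n+1) x (n+1) Hankel matrix of (14-100)
        rank = np.linalg.matrix_rank(Hn)
        print(f"n = {n}:  rank H_n = {rank}   det H_n = {np.linalg.det(Hn):+.2e}")
    # Rank settles at p = 2 and det H_n ~ 0 for n >= p.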


The necessary part easily follows from (14-70) by cross-multiplying and equating coefficients of like powers of z^{-k}, k = 0, 1, 2, \ldots. This gives

    b_0 = h_0
    b_1 = h_1 + a_1 h_0
    b_2 = h_2 + a_1 h_1 + a_2 h_0
    \;\;\vdots
    b_q = h_q + a_1 h_{q-1} + \cdots + a_q h_0,                                  (14-101)

and

    h_{q+i} + a_1 h_{q+i-1} + \cdots + a_p h_{q+i-p} = 0, \qquad i \ge 1.        (14-102)

For systems with q \le p - 1, letting i = p - q, p - q + 1, \ldots, 2p - q in (14-102), we get

    h_p + a_1 h_{p-1} + \cdots + a_p h_0 = 0
    h_{p+1} + a_1 h_p + \cdots + a_p h_1 = 0
    \;\;\vdots
    h_{2p} + a_1 h_{2p-1} + \cdots + a_p h_p = 0,                                (14-103)

which gives det H_p = 0. Similarly, i = p - q + 1, \ldots gives


    h_{p+1} + a_1 h_p + \cdots + a_p h_1 = 0
    h_{p+2} + a_1 h_{p+1} + \cdots + a_p h_2 = 0
    \;\;\vdots
    h_{2p+2} + a_1 h_{2p+1} + \cdots + a_p h_{p+2} = 0,                          (14-104)

and that gives det H_{p+1} = 0, etc. (Notice that a_{p+k} = 0 for k \ge 1.) (For the sufficiency proof, see Dienes.)

It is possible to obtain similar determinantal conditions for ARMA systems in terms of Hankel matrices generated from their output autocorrelation sequence.

Referring back to the ARMA(p, q) model in (14-68), the input white noise process w(n) there is uncorrelated with its own past sample values as well as with the past values of the system output. This gives

    E\{ w(n)\, w^*(n-k) \} = 0, \qquad k \ge 1,                                  (14-105)

    E\{ w(n)\, x^*(n-k) \} = 0, \qquad k \ge 1.                                  (14-106)


Together with (14-68), we obtain

    r_i = E\{ x(n)\, x^*(n-i) \}
        = E\left\{ \left[ -\sum_{k=1}^{p} a_k\, x(n-k) + \sum_{k=0}^{q} b_k\, w(n-k) \right] x^*(n-i) \right\}
        = -\sum_{k=1}^{p} a_k\, r_{i-k} + \sum_{k=0}^{q} b_k\, E\{ w(n-k)\, x^*(n-i) \},        (14-107)

and hence in general

    r_i + \sum_{k=1}^{p} a_k\, r_{i-k} \ne 0 \quad \text{for } i \le q,           (14-108)

and

    r_i + \sum_{k=1}^{p} a_k\, r_{i-k} = 0 \quad \text{for } i \ge q + 1.         (14-109)
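Eq. (14-109) is the basis of the modified Yule-Walker idea: for i = q+1, …, q+p it gives p linear equations in the AR coefficients a₁, …, a_p using only output autocorrelations. The sketch below (Python with NumPy/SciPy; the same illustrative ARMA(2, 1) model as before) estimates r_i from a simulated path and recovers a₁, a₂ by solving that linear system; this is a sketch of the idea, not a production estimator.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(8)

    # Illustrative ARMA(2,1) in the (14-68) convention: coefficients (1, a1, a2) and (b0, b1).
    a_true = [1.0, -1.2, 0.5]
    b_true = [1.0, 0.4]
    p, q = 2, 1

    N = 2_000_000
    x = lfilter(b_true, a_true, rng.standard_normal(N))

    def r(i):
        """Sample autocorrelation r_i = E{x(n) x*(n-i)}."""
        i = abs(i)
        return np.mean(x[i:] * x[:N - i])

    # (14-109) for i = q+1, ..., q+p:  sum_k a_k r_{i-k} = -r_i.
    R = np.array([[r(i - k) for k in range(1, p + 1)] for i in range(q + 1, q + p + 1)])
    rhs = -np.array([r(i) for i in range(q + 1, q + p + 1)])
    a_hat = np.linalg.solve(R, rhs)

    print("true  (a1, a2):", a_true[1:])
    print("estimated     :", a_hat)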


Notice that (14-109) is the same as (14-102) with {h_k} replaced by {r_k}, and hence the Kronecker conditions for rational systems can be expressed in terms of the output autocorrelations as well.

Thus if X(n) ~ ARMA(p, q) represents a wide-sense stationary stochastic process, then its output autocorrelation sequence {r_k} satisfies

    \operatorname{rank} D_p = \operatorname{rank} D_{p+k} = p, \qquad k \ge 0,   (14-110)

where

    D_k = \begin{pmatrix} r_0 & r_1 & r_2 & \cdots & r_k \\ r_1 & r_2 & r_3 & \cdots & r_{k+1} \\ \vdots & \vdots & \vdots & & \vdots \\ r_k & r_{k+1} & r_{k+2} & \cdots & r_{2k} \end{pmatrix}        (14-111)

represents the (k+1) \times (k+1) Hankel matrix generated from r_0, r_1, \ldots, r_k, \ldots, r_{2k}. It follows that for ARMA(p, q) systems, we have

    \det D_n = 0 \quad \text{for all sufficiently large } n.                      (14-112)