
Random Process

Jan 05, 2016


Random Process (PowerPoint PPT presentation)
Transcript
Page 1: Random Process

Random Process

The concept of random variable was defined previously as mapping from the Sample Space S to the real line as shown below

[Figure: each sample point s_n in the Sample Space S maps to a value x_n on the real line]

Page 2: Random Process

The concept of random process can be extended to include time and the outcome will be random functions of time as shown below

The functions x_{n-1}(t), x_n(t), x_{n+1}(t), x_{n+2}(t) are a few of the many realizations of the random process X(t)

A random process also represents a random variable when time is fixed

X(t_1) is a random variable
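As an illustration (the slides do not fix a particular process, so the random-phase sinusoid below is an assumed example), the sketch generates many realizations of a random process; fixing the time index picks one value per realization, which is exactly the random variable X(t_1):

```python
import numpy as np

# Illustrative process (an assumption, not from the slides):
# X(t) = cos(2*pi*f0*t + Theta), with Theta ~ Uniform[0, 2*pi)
rng = np.random.default_rng(0)
f0 = 5.0
t = np.linspace(0.0, 1.0, 200)                      # time axis
thetas = rng.uniform(0.0, 2.0 * np.pi, size=1000)   # one outcome per realization

# Each row is one realization x_n(t) of the random process X(t)
realizations = np.cos(2.0 * np.pi * f0 * t[None, :] + thetas[:, None])

# Fixing the time t1 picks one column: X(t1) is a random variable,
# with one sample per realization
X_t1 = realizations[:, 50]
print(X_t1.shape)   # (1000,)
```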

Page 3: Random Process

The random process X(t) can be classified as follows:

Stationary and Independence

First-order stationary

A random process is classified as first-order stationary if its first-order probability density function does not change under any shift of the time origin.

If we let X_{t1} represent the value of the process at a given time t1, then we define a first-order stationary process as one that satisfies the following equation:

f_X(x_{t1}) = f_X(x_{t1+τ})

The physical significance of this equation is that the density function f_X(x_{t1}) is completely independent of t1, and thus of any time shift τ.

For a first-order stationary process, the mean is therefore a constant, independent of any time shift.
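A quick numerical sanity check of this idea, using an assumed first-order stationary process (a random-phase sinusoid; the process and the shift τ = 1.7 are my choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
thetas = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

def sample_at(t):
    # Values of X(t) = cos(2*pi*t + theta) across all outcomes theta
    return np.cos(2.0 * np.pi * t + thetas)

# First-order stationarity: the distribution of X(t1) matches X(t1 + tau)
x_a = sample_at(0.3)
x_b = sample_at(0.3 + 1.7)   # arbitrary shift tau = 1.7

# Compare empirical means and first-order densities (histograms)
print(x_a.mean(), x_b.mean())           # both near 0
hist_a, edges = np.histogram(x_a, bins=20, range=(-1, 1), density=True)
hist_b, _     = np.histogram(x_b, bins=20, range=(-1, 1), density=True)
print(np.max(np.abs(hist_a - hist_b)))  # small: same first-order density
```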

Page 4: Random Process

Second-order stationary

A random process is classified as second-order stationary if its second-order probability density function does not vary under any time shift applied to both time variables.

In other words, for the values X_{t1} and X_{t2}, the second-order density must be equal for an arbitrary time shift τ:

f_X(x_{t1}, x_{t2}) = f_X(x_{t1+τ}, x_{t2+τ})

From this equation we see that the absolute time does not affect the density; it depends only on the time difference t2 - t1 between the two variables.

Page 5: Random Process

For a second-order stationary process, we need to look at the autocorrelation function (presented later) to see its most important property.

Since a second-order stationary process depends only on the time difference, all such processes have the following property:

R_XX(t, t + τ) = E[X(t)X(t + τ)] = R_XX(τ)

Page 6: Random Process

Wide-Sense Stationary (WSS)

A process that satisfies the following:

E[X(t)] = X̄ = constant

E[X(t)X(t + τ)] = R_XX(τ)

is Wide-Sense Stationary (WSS).

Second-order stationary ⇒ Wide-Sense Stationary

The converse is not true in general
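The two WSS conditions can be checked empirically for an assumed process (again the random-phase sinusoid, which is WSS with R_XX(τ) = 0.5 cos(2πτ); this example is my choice, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
thetas = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

def X(t):
    # Ensemble of values of X(t) = cos(2*pi*t + theta)
    return np.cos(2.0 * np.pi * t + thetas)

tau = 0.2
# E[X(t)X(t+tau)] estimated at two different absolute times t:
# for a WSS process both must give the same number, R_XX(tau)
r1 = np.mean(X(0.1) * X(0.1 + tau))
r2 = np.mean(X(0.9) * X(0.9 + tau))

# Theory for the random-phase sinusoid: R_XX(tau) = 0.5*cos(2*pi*tau)
print(r1, r2, 0.5 * np.cos(2.0 * np.pi * tau))
```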

Page 7: Random Process

Time Average and Ergodicity

An attribute of stochastic systems; generally, a system that tends in probability to a limiting form that is independent of the initial conditions.

Ergodicity

The time average of a quantity [·] is defined as

A[·] = lim_{T→∞} (1/2T) ∫_{-T}^{T} [·] dt

Here A is used to denote time average in a manner analogous to E for the statistical average.

The time average is taken over all time because, as applied to random processes, sample functions of processes are presumed to exist for all time.

Page 8: Random Process

Let x(t) be a sample of the random process X(t), where the lowercase letter implies a sample function (not a random function).

X(t): the random process

x(t): a sample function of the random process

Page 9: Random Process

Let x(t) be a sample of the random process X(t), where the lowercase letter implies a sample function.

We define the mean value x̄ = A[x(t)] (a lowercase letter is used to imply a sample function) and the time autocorrelation function ℛ_xx(τ) as follows:

x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) dt

ℛ_xx(τ) = A[x(t)x(t + τ)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t)x(t + τ) dt

For any one sample function x(t) of the random process X(t), the last two integrals simply produce two numbers: a number x̄ for the average, and a number ℛ_xx(τ) for each specific value of τ.
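Both time averages can be approximated from one long record; a minimal sketch, assuming a random-phase sinusoid as the underlying process (my illustrative choice) and approximating the integrals by sums:

```python
import numpy as np

# One sample function x(t) of an assumed process, on a long record
dt = 0.001
t = np.arange(0.0, 200.0, dt)
theta = 1.234                        # one fixed outcome -> one sample function
x = np.cos(2.0 * np.pi * t + theta)

# Time average  x_bar = A[x(t)]: a plain mean over the record
x_bar = x.mean()

# Time autocorrelation  A[x(t)x(t+tau)]  for one value of tau
tau = 0.2
lag = int(round(tau / dt))
r_tau = np.mean(x[:-lag] * x[lag:])

print(x_bar)   # near 0
print(r_tau)   # near 0.5*cos(2*pi*tau)
```

Note that both results are just numbers, as the text says: one record gives one x̄ and one ℛ_xx(τ) per τ.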

Page 10: Random Process

Since the sample function x(t) is only one of the many sample functions of the random process X(t),

the average x̄ and the time autocorrelation ℛ_xx(τ) are actually random variables.

By taking the expected value of x̄ and ℛ_xx(τ), we obtain:

E[x̄] = E[A[x(t)]] = E[ lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) dt ]

= lim_{T→∞} (1/2T) ∫_{-T}^{T} E[x(t)] dt

= lim_{T→∞} (1/2T) ∫_{-T}^{T} X̄ dt

= lim_{T→∞} X̄ (2T/2T)

= X̄

E[ℛ_xx(τ)] = E[A[x(t)x(t + τ)]] = E[ lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t)x(t + τ) dt ]

= lim_{T→∞} (1/2T) ∫_{-T}^{T} E[x(t)x(t + τ)] dt = lim_{T→∞} (1/2T) ∫_{-T}^{T} R_XX(τ) dt = R_XX(τ)
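For an ergodic process the time average of one sample function should itself converge to the ensemble average. A sketch with an assumed ergodic process (random-phase sinusoid, mean X̄ = 0; the example is mine):

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 0.001
t = np.arange(0.0, 100.0, dt)

# Ensemble average at a fixed time over many sample functions
thetas = rng.uniform(0.0, 2.0 * np.pi, size=2000)
ensemble_mean = np.mean(np.cos(2.0 * np.pi * 0.5 + thetas))

# Time average over one single sample function
theta0 = thetas[0]
time_mean = np.mean(np.cos(2.0 * np.pi * t + theta0))

print(ensemble_mean, time_mean)   # both near the true mean X_bar = 0
```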

Page 11: Random Process

Correlation Function

Autocorrelation Function and Its Properties

The autocorrelation function of a random process X(t) is the correlation E[X_1 X_2] of the two random variables X_1 = X(t_1) and X_2 = X(t_2) defined by the process at times t_1 and t_2:

R_XX(t_1, t_2) = E[X(t_1)X(t_2)]

Assuming a second-order stationary process:

R_XX(t, t + τ) = E[X(t)X(t + τ)] = R_XX(τ)

Page 12: Random Process

Properties of R_XX(τ):

(1) |R_XX(τ)| ≤ R_XX(0)   the autocorrelation function is bounded by its value at the origin

(2) R_XX(-τ) = R_XX(τ)   the autocorrelation function is an even function

(3) R_XX(0) = E[X²(t)]   the average power in the process

(4) If E[X(t)] = X̄ ≠ 0 and X(t) is ergodic with no periodic components, then lim_{|τ|→∞} R_XX(τ) = X̄²
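These properties can be verified numerically for a concrete autocorrelation. The sketch below uses a reading of the example that follows, reconstructed as R_XX(τ) = 25 + 4/(1 + 6τ²):

```python
import numpy as np

# Reconstructed autocorrelation from the example (an assumption about
# the garbled original): R_XX(tau) = 25 + 4/(1 + 6*tau^2)
def R(tau):
    tau = np.asarray(tau, dtype=float)
    return 25.0 + 4.0 / (1.0 + 6.0 * tau**2)

taus = np.linspace(-10.0, 10.0, 2001)

print(R(0.0))                              # 29.0 = E[X^2(t)], the maximum
print(np.all(np.abs(R(taus)) <= R(0.0)))   # property (1): bounded by R(0)
print(np.allclose(R(-taus), R(taus)))      # property (2): even function
print(R(1e6))                              # property (4): tends to X_bar^2 = 25
```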

Page 13: Random Process

Example 6.3-1

Let X(t) be a stationary ergodic process with autocorrelation function R_XX(τ) given as

R_XX(τ) = 25 + 4/(1 + 6τ²)

Page 14: Random Process

Covariance Function

The covariance of two random variables X and Y was defined as

C_XY = E[(X - X̄)(Y - Ȳ)] = ∫∫ (x - X̄)(y - Ȳ) f_XY(x, y) dx dy

We extend the concept of covariance to the random process:

C_XX(t, t + τ) = E[ (X(t) - E[X(t)]) (X(t + τ) - E[X(t + τ)]) ]

which can be put in the form

C_XX(t, t + τ) = R_XX(t, t + τ) - E[X(t)] E[X(t + τ)]

which is similar to the two-random-variables case.

For a WSS process, we get C_XX(τ) = R_XX(τ) - X̄²

C_XX(0) = R_XX(0) - X̄² = E[X²(t)] - X̄² = σ_X², which is the variance of X(t).

Page 15: Random Process

Example 6.3-1

Let X(t) be a stationary ergodic process with autocorrelation function R_XX(τ) given as

R_XX(τ) = 25 + 4/(1 + 6τ²)
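The slide states the process but the question is lost in extraction; a standard task for such an autocorrelation is to extract the mean and variance using property (4) and the covariance relation above. A sketch, assuming the reconstructed R_XX(τ) = 25 + 4/(1 + 6τ²):

```python
# Reconstructed autocorrelation (an assumption about the garbled original)
def R(tau):
    return 25.0 + 4.0 / (1.0 + 6.0 * tau**2)

# Property (4): lim_{tau->inf} R_XX(tau) = X_bar^2 (no periodic components)
X_bar_sq = R(1e9)             # numerically ~ 25, so X_bar = +/- 5

# C_XX(0) = R_XX(0) - X_bar^2 is the variance
variance = R(0.0) - X_bar_sq  # 29 - 25 = 4

print(X_bar_sq, variance)
```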

Page 16: Random Process

6.4  Measurement of Correlation Functions

In the real world, we can never measure the true correlation function of a random process, because we never have all of its sample functions.

Page 17: Random Process

6.5  Gaussian Random Processes

Let X(t) be a Gaussian random process as shown.

Define N random variables X_1 = X(t_1), …, X_i = X(t_i), …, X_N = X(t_N).

We call the process Gaussian if, for any N = 1, 2, …, the joint density function is given as

f_X(x_1, …, x_N) = exp{ -(1/2) (x - X̄)^t C_X⁻¹ (x - X̄) } / [ (2π)^{N/2} |C_X|^{1/2} ]

where (x - X̄) and C_X are defined next.

The Gaussian random process is the most important one used in describing many physical processes.

Page 18: Random Process

Joint Gaussian density:

f_X(x_1, …, x_N) = exp{ -(1/2) (x - X̄)^t C_X⁻¹ (x - X̄) } / [ (2π)^{N/2} |C_X|^{1/2} ]

Here the superscript t denotes the transpose and C_X⁻¹ the matrix inverse.

The elements of the covariance matrix are

C_ij = E[(X_i - X̄_i)(X_j - X̄_j)] = σ_i²  for i = j,  C_{X_i X_j}  for i ≠ j

(x - X̄) is the random vector minus the mean vector:

(x - X̄) = [x_1 - X̄_1, x_2 - X̄_2, …, x_N - X̄_N]^t

C_X is the N×N covariance matrix:

C_X = [ C_11 C_12 … C_1N ; C_21 C_22 … C_2N ; … ; C_N1 C_N2 … C_NN ]

Joint Gaussian density
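The joint Gaussian density above can be evaluated numerically. A minimal sketch (the function name is mine) that also checks the N = 1 reduction discussed on the next slide:

```python
import numpy as np

def gaussian_density(x, mean, C):
    """Joint Gaussian density
    f(x) = exp(-0.5*(x - m)^t C^-1 (x - m)) / ((2*pi)^(N/2) * |C|^(1/2))."""
    x = np.atleast_1d(x).astype(float)
    mean = np.atleast_1d(mean).astype(float)
    C = np.atleast_2d(C).astype(float)
    N = x.size
    d = x - mean
    quad = d @ np.linalg.solve(C, d)          # (x - m)^t C^-1 (x - m)
    norm = (2.0 * np.pi) ** (N / 2.0) * np.sqrt(np.linalg.det(C))
    return np.exp(-0.5 * quad) / norm

# N = 1 sanity check: must reduce to the familiar 1-D Gaussian
sigma2 = 2.0
val = gaussian_density([0.5], [0.0], [[sigma2]])
ref = np.exp(-0.5 * 0.25 / sigma2) / np.sqrt(2.0 * np.pi * sigma2)
print(np.isclose(val, ref))   # True
```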

Page 19: Random Process

f_X(x_1, …, x_N) = exp{ -(1/2) (x - X̄)^t C_X⁻¹ (x - X̄) } / [ (2π)^{N/2} |C_X|^{1/2} ]

For N = 1 (one random variable): C_X = E[(X - X̄)²] = σ_X², and the density reduces to the familiar one-dimensional Gaussian

f_X(x) = (1/√(2π σ_X²)) exp( -(x - X̄)²/(2σ_X²) )

For N = 2 (two random variables) we have the joint density function with

(x - X̄) = [x_1 - X̄_1, x_2 - X̄_2]^t,  C_X = [ C_11 C_12 ; C_21 C_22 ]

Page 20: Random Process

Example

Find P[X(3) > 1].

Solution

P[X(3) > 1] = 1 - F(1) = 1 - 0.8413 = 0.1587

Page 21: Random Process

Example

Find P[X(3) > 1].

Solution

F(1) = 0.8413

Page 22: Random Process
Page 23: Random Process

Linear System with Random Input

In applications of random processes, the input-output relation through a linear system can be described as follows:

X(t) → [ h(t) ] → Y(t) = X(t)*h(t)

Linear System

Here X(t) is a random process and h(t) (a deterministic function) is the impulse response of the linear system (a filter or any other linear system).

Page 24: Random Process

Linear System with Random Input

X(t) → [ h(t) ] → Y(t) = X(t)*h(t)

Linear System

Now we can look at input output relation as follows:

The Time Domain

The output in the time domain is the convolution of the Input random process X(t) and the impulse response h(t),

Y(t) = ∫ X(ξ) h(t - ξ) dξ = ∫ h(ξ) X(t - ξ) dξ

Question: Can you evaluate this convolution integral?

Answer: We cannot evaluate this convolution integral in general, because X(t) is random and there is no mathematical expression for X(t).
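What we can do is evaluate the convolution for any one given sample function x(t), since a sample function is just an ordinary signal. A discrete-time sketch (the noise input and the first-order lowpass h(t) are assumed examples):

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 0.01

# One sample function of the input process (here: white Gaussian noise)
x = rng.standard_normal(2000)

# Impulse response of a simple linear system (truncated first-order lowpass)
t_h = np.arange(0.0, 1.0, dt)
h = np.exp(-5.0 * t_h) * dt          # dt factor approximates the integral

# Y(t) = integral x(xi) h(t - xi) d(xi)  ->  discrete convolution
y = np.convolve(x, h, mode="full")[:x.size]

print(x.size, y.size)                # same length after truncation
```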

Page 25: Random Process

The Frequency Domain

The output in the frequency domain is the product of the Fourier transform F_X(f) of the input random process X(t) and the Fourier transform H(f) of the impulse response h(t):

F_X(f) → [ H(f) ] → F_Y(f) = F_X(f)H(f)

Linear System

F_X(f) = ∫ X(t) e^{-j2πft} dt   the Fourier transform of the input random process X(t); it is a random process

H(f) = ∫ h(t) e^{-j2πft} dt   the Fourier transform of the deterministic impulse response

F_Y(f) = F_X(f)H(f)   the Fourier transform of the output random process Y(t); it is a random process

Question: Can you evaluate the Fourier transform F_X(f) of the input random process X(t)?

Answer: In general no, since the function X(t) is random and has no mathematical expression.

Page 26: Random Process

Question: How can we describe then the behavior of the input random process and the output random process through a linear time-invariant system?

X(t) → [ h(t) ] → Y(t) = X(t)*h(t)

Linear System

Page 27: Random Process

We defined previously the autocorrelation R_XX(τ) as

R_XX(τ) = E[X(t)X(t + τ)]

The autocorrelation tells us how the random process varies: is it a slowly varying process or a rapidly varying one?

Next we will define another function that will help us look at the behavior of the random process.

Page 28: Random Process

Let S_XX(f) (or S_XX(ω)) be the Fourier transform of R_XX(τ):

R_XX(τ) ↔ S_XX(f)

S_XX(f) = ∫ R_XX(τ) e^{-j2πfτ} dτ   or   S_XX(ω) = ∫ R_XX(τ) e^{-jωτ} dτ

R_XX(τ) = ∫ S_XX(f) e^{j2πfτ} df   or   R_XX(τ) = (1/2π) ∫ S_XX(ω) e^{jωτ} dω

Since R_XX(0) = ∫ S_XX(f) df = E[X(t)X(t)] = E[X²(t)] = average power,

S_XX(f) is the Power Spectral Density (PSD) of the random process X(t).
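The Fourier pair can be checked numerically. The sketch below uses only the fluctuating part of the earlier example's autocorrelation, 4/(1 + 6τ²) (my choice; the constant 25 would contribute an impulse at f = 0, which a discrete grid cannot represent cleanly):

```python
import numpy as np

# Autocorrelation to transform: R(tau) = 4/(1 + 6*tau^2)
dtau = 0.001
tau = np.arange(-50.0, 50.0, dtau)
R = 4.0 / (1.0 + 6.0 * tau**2)

# S(f) = integral R(tau) e^{-j 2 pi f tau} d tau, evaluated at f = 0:
# S(0) = integral of R(tau)  (the "DC power" property)
S0 = np.sum(R) * dtau

# Inverse direction: R(0) = integral S(f) df, the total power E[X^2(t)]
freqs = np.fft.fftfreq(tau.size, d=dtau)
S = np.real(np.fft.fft(np.fft.ifftshift(R))) * dtau
R0 = np.sum(S) * (freqs[1] - freqs[0])

print(S0)                       # S_XX(0), analytically 4*pi/sqrt(6)
print(R0, R[tau.size // 2])     # both ~ R(0) = 4
```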

Page 29: Random Process

Autocorrelation ↔ Power Spectral Density (PSD)

S_XX(f) = ∫ R_XX(τ) e^{-j2πfτ} dτ    R_XX(τ) = ∫ S_XX(f) e^{j2πfτ} df

Properties of the PSD:

(1) S_XX(f) ≥ 0 for all f   (power is never negative)

(2) S_XX(-f) = S_XX(f)   even function (X(t) real ⇒ R_XX(τ) is real)

(3) S_XX(f) is real, since R_XX(τ) is an even function

(4) R_XX(0) = ∫ S_XX(f) df = E[X²(t)]   total power

(5) S_XX(0) = ∫ R_XX(τ) dτ   DC power (the power at zero frequency)

Page 30: Random Process

Now let us look at the input-output linear system shown below in the time domain and the frequency domain, assuming the random process X(t) is WSS.

Time Domain

X(t) → [ h(t) ] → Y(t) = X(t)*h(t)

Linear system: X(t) and Y(t) are random functions with autocorrelations R_XX(τ) and R_YY(τ); h(t) is a nonrandom, deterministic function.

The mean of the output:

E[Y(t)] = E[ ∫ h(ξ) X(t - ξ) dξ ]

= ∫ h(ξ) E[X(t - ξ)] dξ

= X̄ ∫ h(ξ) dξ

= X̄ H(0) = Ȳ = constant

X̄ → Ȳ = X̄ H(0)
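The result Ȳ = X̄H(0) can be checked with a discrete system, where H(0) is simply the sum of the impulse-response taps (the input process and the 3-tap filter are assumed examples):

```python
import numpy as np

rng = np.random.default_rng(5)

# Input: WSS process with constant mean X_bar = 3 (white noise around 3)
X_bar = 3.0
x = X_bar + rng.standard_normal(200_000)

# Discrete impulse response; H(0) is the sum of the taps
h = np.array([0.5, 0.3, 0.2])        # H(0) = 1.0
y = np.convolve(x, h, mode="valid")

print(x.mean(), y.mean())            # y.mean() ~ X_bar * H(0) = 3.0
```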

Page 31: Random Process

X(t) → [ h(t) ] → Y(t) = X(t)*h(t)

Linear system: X(t) and Y(t) are random functions with autocorrelations R_XX(τ) and R_YY(τ); h(t) is a nonrandom, deterministic function.

X̄ → Ȳ = X̄ H(0)

The autocorrelation of the output:

R_YY(t, t + τ) = E[Y(t)Y(t + τ)]

= E[ ∫ h(ξ_1) X(t - ξ_1) dξ_1 ∫ h(ξ_2) X(t + τ - ξ_2) dξ_2 ]

= ∫∫ E[X(t - ξ_1) X(t + τ - ξ_2)] h(ξ_1) h(ξ_2) dξ_1 dξ_2

= ∫∫ R_XX(τ + ξ_1 - ξ_2) h(ξ_1) h(ξ_2) dξ_1 dξ_2

= R_XX(τ) * h(-τ) * h(τ)

R_XX(τ) → R_YY(τ) = R_XX(τ) * h(-τ) * h(τ)

Since the mean is constant and the autocorrelation of the output is a function of τ only, Y(t) is WSS.

Page 32: Random Process

X(t) → [ h(t) ] → Y(t) = X(t)*h(t)

Linear system: X(t) and Y(t) are random functions with autocorrelations R_XX(τ) and R_YY(τ); h(t) is a nonrandom, deterministic function.

X̄ → Ȳ = X̄ H(0)

R_XX(τ) → R_YY(τ) = R_XX(τ) * h(-τ) * h(τ)

S_XX(f) → S_YY(f) = S_XX(f) H*(f) H(f) = S_XX(f) |H(f)|²

Total power of the input:  E[X²(t)] = R_XX(0) = ∫ S_XX(f) df

Total power of the output:  E[Y²(t)] = R_YY(0) = ∫ S_YY(f) df = ∫ S_XX(f) |H(f)|² df
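The output-power relation can be checked numerically in the simplest case: for white noise with unit variance, S_XX(f) is flat and ∫ S_XX(f)|H(f)|² df reduces to the sum of the squared filter taps (the input and the 3-tap filter are assumed examples):

```python
import numpy as np

rng = np.random.default_rng(6)

# White noise input: S_XX(f) flat, total power E[X^2(t)] = sigma^2 = 1
x = rng.standard_normal(1_000_000)

# 3-tap FIR system
h = np.array([0.5, 0.3, 0.2])
y = np.convolve(x, h, mode="valid")

# For unit white noise: output power = sum(h^2) = integral S_XX |H|^2 df
predicted = np.sum(h**2)             # 0.38
measured = np.mean(y**2)

print(predicted, measured)
```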