Introduction to Analog And Digital Communications Second Edition Simon Haykin, Michael Moher

Source: bungae.kaist.ac.kr/courses/spring2010/ee321/Haykin_lecture_note_pdf...

Sep 01, 2018

Transcript
Page 1

Introduction to Analog And Digital Communications

Second Edition

Simon Haykin, Michael Moher

Page 2

Chapter 8 Random Signals and Noise

8.1 Probability and Random Variables
8.2 Expectation
8.3 Transformation of Random Variables
8.4 Gaussian Random Variables
8.5 The Central Limit Theorem
8.6 Random Processes
8.7 Correlation of Random Processes
8.8 Spectra of Random Signals
8.9 Gaussian Processes
8.10 Narrowband Noise
8.11 Summary and Discussion

Page 3

Random Signals and Noise
The term “random” is used to describe erratic and apparently unpredictable variations of an observed signal. This randomness or unpredictability is a fundamental property of information. If the information were predictable, there would be no need to communicate it, because the receiving end could predict the information before receiving it.

Noise may be defined as any unwanted signal that interferes with or distorts the signal being communicated.

It is the purpose of this chapter to introduce the tools that are necessary to analyze information and noise and that are required to understand the signal detection techniques described in the remainder of the book.

Page 4

Lesson 1 : Random events and signals can be modeled as the outcomes of random experiments.

Lesson 2 : Random variables provide a general representation for analyzing and processing the outcomes of random experiments. Statistical properties of random events can be obtained from the expectation of various functions of these random variables; the expectation should be viewed as an operator.

Lesson 3 : Gaussian random variables play a key role in the analysis of random signals because of the central limit theorem and their mathematical tractability.

Lesson 4 : A random process may be viewed as a family of random variables, indexed by a time parameter. We can then extend the analysis techniques for random variables to study the time variation of random processes.

Lesson 5 : White noise is one of the most important random processes in the study of communication systems, both from a practical viewpoint and for the mathematical tractability of its statistical properties.

Lesson 6 : Narrowband noise can be analyzed in terms of its in-phase and quadrature components, in a manner similar to narrowband communication signals.

Page 5

8.1 Probability and Random Variables

We ask that a random experiment have the following properties:
1. On any trial of the experiment, the outcome is unpredictable.
2. For a large number of trials of the experiment, the outcomes exhibit statistical regularity; that is, a definite average pattern of outcomes is observed if the experiment is repeated a large number of times.

Relative-Frequency Approach
The relative frequency is a nonnegative real number less than or equal to one.

The experiment exhibits statistical regularity if, for any sequence of n trials, the relative frequency converges to a limit as n becomes large.

The relative frequency n_A/n satisfies

0 ≤ n_A/n ≤ 1    (8.1)

and the probability of event A is defined as

P[A] = lim_{n→∞} (n_A/n)    (8.2)
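As a quick numerical illustration of Eqs. (8.1) and (8.2), a sketch (not part of the text; the event, seed, and trial count are arbitrary choices): for a fair coin, the relative frequency of heads should converge to P[A] = 1/2.

```python
# Sketch: estimating P[A] by relative frequency, Eq. (8.2).
# Event A = "heads" in a fair coin toss; n_A/n should approach 1/2 as n grows.
import random

random.seed(1)

def relative_frequency(n_trials: int) -> float:
    """Return n_A / n for the event 'coin shows heads' over n_trials tosses."""
    n_A = sum(1 for _ in range(n_trials) if random.random() < 0.5)
    return n_A / n_trials

estimate = relative_frequency(100_000)   # close to 0.5, and within [0, 1] per Eq. (8.1)
```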

Page 6

The Axioms of Probability
With each possible outcome of the experiment we associate a point called the sample point. The totality of all sample points is the sample space. The entire sample space S is the sure event; the null set φ is the null or impossible event; a single sample point is an elementary event.
A formal definition of probability: a probability system consists of the triple:
1. A sample space S of elementary events (outcomes).
2. A class ε of events that are subsets of S.
3. A probability measure P[A] assigned to each event A in the class ε, which has the following properties:

(i) P[S] = 1
(ii) 0 ≤ P[A] ≤ 1
(iii) If A ∪ B is the union of two mutually exclusive events in the class ε, then P[A ∪ B] = P[A] + P[B]

Page 7

The relative-frequency interpretation of probability has a mathematical form similar to that of axiom (iii). This abstract definition of a probability system is illustrated in Fig. 8.2.

P[A ∪ B] = P[A] + P[B]    (8.3)

n_{A∪B}/n = n_A/n + n_B/n    (8.4)

Page 8

Fig. 8.1

Page 9

Fig. 8.2

Page 10

Random Variables
A function whose domain is a sample space and whose range is a set of real numbers is called a random variable of the experiment. There may be more than one random variable associated with the same random experiment. The concept of a random variable is illustrated in Fig. 8.3.
For a discrete-valued random variable, the probability mass function describes the probability of each possible value of the random variable. For the coin-tossing experiment, the probability mass function of the associated random variable may be written as

P[X = x] = 1/2 for x = 0 and x = 1; 0 otherwise    (8.5)

Page 11

Fig. 8.3

Page 12

Fig. 8.4

Page 13

Page 14

Distribution Functions
Closely related to the probability mass function is the probability distribution function F_X(x).
The distribution function has two basic properties:
1. The distribution function is bounded between zero and one.
2. The distribution function is a monotone nondecreasing function of x.

F_X(x) = P[X ≤ x]    (8.7)

F_X(x_1) ≤ F_X(x_2) if x_1 ≤ x_2

Page 15

The probability density function, denoted by f_X(x), is the derivative of the distribution function.
A probability density function has three basic properties:
1. Since the distribution function is monotone nondecreasing, the density function is nonnegative for all values of x.
2. The distribution function may be recovered from the density function by integration, as shown by Eq. (8.9).
3. Property 2 implies that the total area under the curve of the density function is unity.

f_X(x) = ∂F_X(x)/∂x    (8.8)

F_X(x) = ∫_{−∞}^{x} f_X(s) ds    (8.9)
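The three properties above can be checked numerically. A sketch, using an assumed example density (the exponential f_X(x) = e^(−x) for x ≥ 0, with known distribution F_X(x) = 1 − e^(−x)); the integrator and step counts are illustrative choices:

```python
# Sketch: verifying Eqs. (8.8)-(8.9) numerically for an assumed example density,
# the exponential density f_X(x) = e^{-x} (x >= 0). Integrating f recovers F
# (property 2), and the total area under f is unity (property 3).
import math

def f(x: float) -> float:
    return math.exp(-x) if x >= 0.0 else 0.0

def integrate(g, a: float, b: float, steps: int = 100_000) -> float:
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h

F_at_1 = integrate(f, 0.0, 1.0)       # close to 1 - e^{-1}, per Eq. (8.9)
total_area = integrate(f, 0.0, 30.0)  # property 3: close to one
```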

Page 16

Page 17

Fig. 8.5

Page 18

Page 19

Several Random Variables
The joint distribution function F_{X,Y}(x, y) is the probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y. Its mixed partial derivative is the joint probability density function.

F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]    (8.11)

f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/(∂x ∂y)    (8.12)

F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(ξ, η) dη dξ    (8.13)

f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, η) dη    (8.14)

Page 20

The probability density function f_X(x) may be obtained from the joint probability density function by simply integrating over all possible values of the random variable Y. The densities f_X(x) and f_Y(y) obtained in this way are called marginal densities.
X and Y are statistically independent if the outcome of X does not affect the outcome of Y.

P[X ∈ A, Y ∈ B] = P[X ∈ A] P[Y ∈ B]    (8.15)

F_{X,Y}(x, y) = F_X(x) F_Y(y)    (8.16)

Page 21

Page 22

Fig. 8.6

Page 23

Page 24

Fig. 8.7

Page 25

Page 26

Conditional Probability
Let P[Y|X] denote the probability mass function of Y given that X has occurred; this is the conditional probability of Y given X. Combining the two factorizations of the joint probability yields Bayes' rule. Random variables X and Y that satisfy Eq. (8.26) are statistically independent.

P[Y|X] = P[X, Y]/P[X]    (8.22)

P[X, Y] = P[Y|X] P[X]    (8.23)

P[X, Y] = P[X|Y] P[Y]    (8.24)

P[Y|X] = P[X|Y] P[Y]/P[X]    (8.25)

P[Y|X] = P[Y]    (8.26)

P[X, Y] = P[X] P[Y]    (8.27)
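Bayes' rule, Eq. (8.25), can be exercised on a small numeric example. A sketch with assumed numbers (a binary channel with crossover probability 0.1 and equiprobable inputs; none of these values come from the text):

```python
# Sketch: Bayes' rule, Eq. (8.25), on an assumed binary-channel example.
# X is the transmitted bit, Y the received bit; the channel flips a bit
# with probability 0.1.
p_X = {0: 0.5, 1: 0.5}                                  # prior P[X]
p_Y_given_X = {(y, x): (0.9 if y == x else 0.1)         # channel P[Y | X]
               for x in (0, 1) for y in (0, 1)}

# Total probability: P[Y = y] = sum_x P[Y = y | X = x] P[X = x]
p_Y = {y: sum(p_Y_given_X[(y, x)] * p_X[x] for x in (0, 1)) for y in (0, 1)}

def posterior(x: int, y: int) -> float:
    """Bayes' rule: P[X = x | Y = y] = P[Y = y | X = x] P[X = x] / P[Y = y]."""
    return p_Y_given_X[(y, x)] * p_X[x] / p_Y[y]
```

Here posterior(1, 1) evaluates to 0.9: observing Y = 1 makes X = 1 nine times as likely as X = 0.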

Page 27

Page 28

Page 29

Fig. 8.8

Page 30

Page 31

8.2 Expectation

Mean
Statistical averages, or expectations, play a central role. E[g(X)] denotes the expected value of a function g(·) of the random variable X. For a discrete random variable X, the mean μ_X is the weighted sum of the possible outcomes, Eq. (8.32). For a continuous random variable, the analogous definition of the expected value is Eq. (8.33).

μ_X = E[X] = Σ_x x P[X = x]    (8.32)

E[X] = ∫_{−∞}^{∞} x f_X(x) dx    (8.33)

Page 32

Estimator
The mean value of a random variable may be estimated from N observations using Eq. (8.34). If we consider X as a random variable representing observations of the voltage of a random signal, then the mean value of X represents the average voltage, or dc offset, of the signal.

μ̂_X = (1/N) Σ_{n=1}^{N} x_n    (8.34)

If the N observations take one of M discrete values Z_1, ..., Z_M, with Z_i occurring n_i times, then

μ̂_X = (n_1 Z_1 + n_2 Z_2 + ⋯ + n_M Z_M)/N = Σ_{i=1}^{M} Z_i (n_i/N) ≈ Σ_{i=1}^{M} Z_i P[Z_i] = μ_X
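A sketch of the dc-offset interpretation of Eq. (8.34), using an assumed voltage model x_n = 2.0 V plus uniform noise (the offset, noise range, seed, and N are illustrative choices, not from the text):

```python
# Sketch: the sample-mean estimator of Eq. (8.34) applied to an assumed
# noisy-voltage model x_n = 2.0 + noise; the estimate recovers the 2.0 V
# dc offset as N grows.
import random

random.seed(2)
N = 50_000
samples = [2.0 + random.uniform(-1.0, 1.0) for _ in range(N)]
mu_hat = sum(samples) / N    # Eq. (8.34)
```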

Page 33

Variance
The variance σ_X² is the expectation of the squared distance of each outcome from the mean value of the distribution.

σ_X² = Var(X) = E[(X − μ_X)²] = Σ_x (x − μ_X)² P[X = x]    (8.35)

σ_X² = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx    (8.36)

σ̂_X² = (1/(N − 1)) Σ_{n=1}^{N} (x_n − μ̂_X)²    (8.37)
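The N − 1 divisor of Eq. (8.37) is what Python's standard library implements as the sample variance. A sketch with assumed data values:

```python
# Sketch: the unbiased variance estimator of Eq. (8.37) (divide by N - 1,
# using the estimated mean). statistics.variance implements exactly this
# estimator; the data below are an assumed example.
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mu_hat = sum(data) / len(data)                                     # Eq. (8.34)
var_hat = sum((x - mu_hat) ** 2 for x in data) / (len(data) - 1)   # Eq. (8.37)
```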

Page 34

Eq. (8.37) is an unbiased estimator of the variance. The mean-square value E[X²] of the random signal represents the total power of the signal.

For N observations taking one of M discrete values Z_1, ..., Z_M, with Z_i occurring n_i times,

σ̂_X² = (1/(N − 1)) [n_1(Z_1 − μ̂)² + n_2(Z_2 − μ̂)² + ⋯ + n_M(Z_M − μ̂)²]
     = (N/(N − 1)) Σ_{i=1}^{M} (Z_i − μ̂)² (n_i/N)
     ≈ Σ_{i=1}^{M} (Z_i − μ̂)² P[Z_i]    (8.38)

where E[X²] denotes the mean-square value.

Page 35

Page 36

Covariance
The covariance of two random variables X and Y is defined by Eq. (8.39). The expectation term E[XY] appearing in Eq. (8.40) is given by Eq. (8.41). If the two random variables happen to be independent, the joint density factors as in Eq. (8.42), and we find that the covariance of independent random variables is zero. Zero covariance does not, in general, imply independence.

Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)]    (8.39)

Cov(X, Y) = E[XY] − μ_X μ_Y    (8.40)

E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f_{X,Y}(x, y) dx dy    (8.41)

E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f_X(x) f_Y(y) dx dy
      = ∫_{−∞}^{∞} x f_X(x) dx ∫_{−∞}^{∞} y f_Y(y) dy
      = E[X] E[Y]    (8.42)
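The last remark (zero covariance does not imply independence) can be demonstrated by simulation. A sketch with an assumed pair: X uniform on (−1, 1) and Y = X², so Y is completely determined by X, yet Cov(X, Y) = E[X³] = 0 by symmetry:

```python
# Sketch: sample covariance via Eq. (8.40). For X uniform on (-1, 1) and
# Y = X**2, the covariance E[XY] - mu_X mu_Y is zero (E[X^3] = 0 by symmetry)
# even though Y depends deterministically on X: uncorrelated but not independent.
import random

random.seed(3)
N = 200_000
xs = [random.uniform(-1.0, 1.0) for _ in range(N)]
ys = [x * x for x in xs]

def mean(v):
    return sum(v) / len(v)

cov_xy = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)  # Eq. (8.40)
```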

Page 37

8.3 Transformation of Random Variables

What is the new distribution of the random variable after processing? Consider Y = aX + b with a > 0. If X ∈ A, it follows that Y ∈ B, where B is defined by B = aA + b:

P[X ∈ A] = P[Y ∈ B]    (8.43)

For B = (−∞, y], the set A is given by A = (B − b)/a = (−∞, (y − b)/a], so that

F_Y(y) = P[Y ∈ (−∞, y]] = P[X ∈ (−∞, (y − b)/a]] = F_X((y − b)/a)    (8.44)

Page 38

More generally, if Y = g(X) is a one-to-one transformation of the random variable X to the random variable Y, then

F_Y(y) = F_X(g⁻¹(y))    (8.45)

where g⁻¹(·) denotes the functional inverse of g.
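Eq. (8.44) can be checked empirically. A sketch with assumed parameters (a = 2, b = 1, and X uniform on (0, 1), so F_X is known in closed form; none of these values come from the text):

```python
# Sketch: checking Eq. (8.44) empirically for Y = aX + b with a > 0 and an
# assumed uniform X on (0, 1): the empirical F_Y(y) should match F_X((y - b)/a).
import random

random.seed(4)
a, b = 2.0, 1.0
N = 100_000
ys = [a * random.random() + b for _ in range(N)]

def F_Y_empirical(y: float) -> float:
    """Fraction of simulated Y samples not exceeding y."""
    return sum(1 for v in ys if v <= y) / N

def F_X(x: float) -> float:
    """Distribution function of a uniform (0, 1) random variable."""
    return min(max(x, 0.0), 1.0)

y0 = 2.0   # F_Y(2) should be close to F_X((2 - 1)/2) = 0.5
```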

Page 39

Fig. 8.9

Page 40

Page 41

Page 42

8.4 Gaussian Random Variables

A Gaussian random variable, whose density function is given in Eq. (8.48) below, has a number of properties that we will state without proof:

1. A Gaussian random variable is completely characterized by its mean and variance.

2. A Gaussian random variable plus a constant is another Gaussian random variable with the mean adjusted by the constant.

3. A Gaussian random variable multiplied by a constant is another Gaussian random variable where both the mean and variance are affected by the constant.

4. The sum of two independent Gaussian random variables is also a Gaussian random variable.

5. The weighted sum of N independent Gaussian random variables is a Gaussian random variable.

6. If two Gaussian random variables have zero covariance (are uncorrelated), they are also independent.

f_X(x) = (1/(√(2π) σ_X)) exp(−(x − μ_X)²/(2σ_X²))    (8.48)
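Properties 2 through 4 imply that means and variances combine additively under independent sums. A sketch verifying the moment bookkeeping by simulation, with assumed parameters X ~ N(1, 2²) and W ~ N(−1, 1²) (normality of the sum itself is not tested here):

```python
# Sketch: numerical check of the moment part of properties 2-4 above.
# For independent X ~ N(1, 2^2) and W ~ N(-1, 1), the sum X + W should have
# mean 1 + (-1) = 0 and variance 4 + 1 = 5.
import random

random.seed(5)
N = 200_000
sums = [random.gauss(1.0, 2.0) + random.gauss(-1.0, 1.0) for _ in range(N)]

mean_hat = sum(sums) / N
var_hat = sum((s - mean_hat) ** 2 for s in sums) / (N - 1)
```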

Page 43

For the special case of a Gaussian random variable with a mean of zero and unit variance, we obtain the normalized Gaussian random variable, Eq. (8.49).
The Q-function is the complement of the normalized Gaussian distribution function, Eq. (8.51).

With μ_X = 0:

f_X(x) = (1/√(2π)) exp(−x²/2), −∞ < x < ∞    (8.49)

F_X(x) = ∫_{−∞}^{x} f_X(s) ds = (1/√(2π)) ∫_{−∞}^{x} exp(−s²/2) ds    (8.50)

Q(x) = (1/√(2π)) ∫_{x}^{∞} exp(−s²/2) ds = 1 − F_X(x)    (8.51)

(See Fig. 8.10 and Fig. 8.11.)
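Eq. (8.51) has no closed form, but it can be computed through the standard-library complementary error function, since Q(x) = ½ erfc(x/√2). A sketch:

```python
# Sketch: the Q-function of Eq. (8.51) expressed through math.erfc:
# Q(x) = 0.5 * erfc(x / sqrt(2)).
import math

def Q(x: float) -> float:
    """Area under the normalized Gaussian tail from x to infinity, Eq. (8.51)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def F(x: float) -> float:
    """Normalized Gaussian distribution function, Eq. (8.50): F(x) = 1 - Q(x)."""
    return 1.0 - Q(x)
```

By symmetry Q(0) = 0.5, and Q(1) ≈ 0.1587, the familiar one-sigma tail probability.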

Page 44

Fig. 8.10

Page 45

Fig. 8.11

Page 46

Fig. 8.12

Page 47

Page 48

Page 49

8.5 The Central Limit Theorem

The central limit theorem explains the special place of the Gaussian distribution. Let X_k, k = 1, 2, 3, ..., n, be a set of random variables such that:
1. The X_k are statistically independent.
2. The X_k all have the same probability density function.
3. Both the mean and the variance exist for each X_k.
Consider the sum Y of Eq. (8.56) and the normalized random variable Z of Eq. (8.57); the distribution of Z approaches that of a zero-mean Gaussian random variable with unit variance, Eq. (8.58).

Y = Σ_{k=1}^{n} X_k    (8.56)

Z = (Y − E[Y])/σ_Y    (8.57)

F_Z(z) → (1/√(2π)) ∫_{−∞}^{z} exp(−s²/2) ds    (8.58)

Page 50

A mathematical statement of the central limit theorem: the normalized distribution of the sum of independent, identically distributed random variables approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions.

Computer Experiment: Sums of random variables
We consider the random variable Z defined below, where the X_i are independent, uniformly distributed random variables on the interval from −1 to +1. In the computer experiment, we compute 20,000 samples of Z for N = 5, and estimate the corresponding density function by forming a histogram of the results.

The results of this experiment indicate how powerful the central limit theorem is and explain why Gaussian models are ubiquitous in the analysis of random signals in communications and elsewhere.

Z = Σ_{i=1}^{N} X_i

(See Fig. 8.13.)
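The computer experiment can be reproduced in a few lines; a sketch (the histogram step is replaced here by a moment check: each X_i has mean 0 and variance 1/3, so Z should have mean 0 and variance N/3):

```python
# Sketch reproducing the computer experiment: 20,000 samples of Z = sum of
# N = 5 independent uniforms on (-1, 1). Each X_i has mean 0 and variance 1/3,
# so Z should have mean 0 and variance N/3 = 5/3.
import random

random.seed(6)
N, trials = 5, 20_000
zs = [sum(random.uniform(-1.0, 1.0) for _ in range(N)) for _ in range(trials)]

mean_z = sum(zs) / trials
var_z = sum((z - mean_z) ** 2 for z in zs) / (trials - 1)   # expect about 5/3
```

Plotting a histogram of `zs` reproduces the bell shape of Fig. 8.13.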

Page 51

Fig. 8.13

Page 52

8.6 Random Processes

We combine the concepts of time variation and random variables to introduce the concept of random processes.

Random processes have the following properties:
1. Random processes are functions of time.
2. Random processes are random in the sense that it is not possible to predict exactly what waveform will be observed in the future.
Suppose that we assign to each sample point s a function of time with the label shown in Eq. (8.59); this sample function is written as in Eq. (8.60).

X(t, s), −T < t < T    (8.59)

x_j(t) = X(t, s_j)    (8.60)

{x_1(t_k), x_2(t_k), ..., x_n(t_k)} = {X(t_k, s_1), X(t_k, s_2), ..., X(t_k, s_n)}

Page 53

To compare:
With a random variable, the outcome of a random experiment is mapped to a real number.
With a random process, the outcome of a random experiment is mapped into a waveform that is a function of time.
The family of all such random variables, indexed by the time variable t, forms the random process.
Stationary random processes
A random process is stationary if, when it is divided into a number of time intervals, the various sections of the process exhibit essentially the same statistical properties. Otherwise, it is said to be nonstationary. (See Fig. 8.14.)

Page 54

Suppose the same random process is observed at time t_1 + τ, and the corresponding distribution function is F_{X(t_1+τ)}(x).
A first-order stationary random process has a distribution function that is independent of time, Eq. (8.61). Suppose X(t_1) has the density f_{X(t_1)}(x). The mean value, Eq. (8.62), does not change with time, because the distribution function (and hence the density) is time invariant.

F_{X(t_1+τ)}(x) = F_{X(t_1)}(x)    (8.61)

μ_X = ∫_{−∞}^{∞} s f_{X(t_1+τ)}(s) ds = ∫_{−∞}^{∞} s f_{X(t_1)}(s) ds    (8.62)

Page 55

Suppose a second set of observations is made at times t_1 + τ and t_2 + τ, and the corresponding joint distribution is F_{X(t_1+τ),X(t_2+τ)}(x_1, x_2). A process satisfying Eq. (8.63) is stationary to the second order. Statistical quantities such as covariance and correlation, which we will discuss in the following, then do not depend upon absolute time. In other words, a random process X(t) is strictly stationary if the joint distribution of any set of random variables obtained by observing the random process X(t) is invariant with respect to the location of the origin t = 0.

F_{X(t_1+τ),X(t_2+τ)}(x_1, x_2) = F_{X(t_1),X(t_2)}(x_1, x_2)    (8.63)

Page 56

Fig. 8.14

Page 57

8.7 Correlation of Random Processes

The covariance of the two random variables X(t_1) and X(t_2) is given by Eq. (8.64), and the autocorrelation function is defined in Eq. (8.65). If a random process has the following two properties, we say it is wide-sense stationary, or weakly stationary:
1. The mean of the random process is a constant independent of time.
2. The autocorrelation of the random process depends only upon the time difference.

Cov(X(t_1), X(t_2)) = E[X(t_1) X(t_2)] − μ_{X(t_1)} μ_{X(t_2)}    (8.64)

R_X(t, s) = E[X(t) X*(s)]    (8.65)

R_X(t, s) = E[X(t) X*(s)] = R_X(t − s)    (8.66)

E[X(t)] = μ_X for all t

E[X(t) X*(t − τ)] = R_X(τ) for all t and τ

Page 58

Properties of the autocorrelation function
Property 1: Power of a wide-sense stationary process. The second moment, or mean-square value, of a real-valued random process is given by Eq. (8.67).
Property 2: Symmetry. The autocorrelation of a real-valued wide-sense stationary process has even symmetry, Eq. (8.68).
Property 3: Maximum value. The autocorrelation function of a wide-sense stationary random process is a maximum at the origin, as shown by Eq. (8.69).

R_X(0) = E[X(t) X(t)] = E[X²(t)]    (8.67)

R_X(−τ) = E[X(t) X(t + τ)] = E[X(t + τ) X(t)] = R_X(τ)    (8.68)

0 ≤ E[(X(t + τ) ± X(t))²] = E[X²(t + τ)] + E[X²(t)] ± 2E[X(t + τ) X(t)] = 2R_X(0) ± 2R_X(τ)    (8.69)

so that |R_X(τ)| ≤ R_X(0). (See Fig. 8.15.)

Page 59

Fig. 8.15

Page 60

Page 61

Page 62

Ergodicity
The ensemble averages (the expected value and second moment) of the random process at time t = t_k may be estimated by averaging over N sample functions, Eqs. (8.70) and (8.71). The time average of a continuous sample function x(t) drawn from a real-valued process is given by Eq. (8.72), and the time-autocorrelation of the sample function by Eq. (8.73).

At t = t_k:

E[X(t_k)] ≈ (1/N) Σ_{j=1}^{N} x_j(t_k)    (8.70)

E[X²(t_k)] ≈ (1/N) Σ_{j=1}^{N} x_j²(t_k)    (8.71)

ε[x] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt    (8.72)

R_x(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t − τ) dt    (8.73)

Page 63

When are the time averages of a sample function equal to the ensemble averages of the corresponding random process? If the statistics of the random process X(t) do not change with time, then we might expect the time averages and ensemble averages to be equivalent. Random processes for which this equivalence holds are said to be ergodic. Wide-sense stationary processes are assumed to be ergodic, in which case time averages and expectations can be used interchangeably.

Page 64

If we assume that the real-valued random process is ergodic, the autocorrelation may be written as the time average of Eq. (8.74). An estimate of the autocorrelation of a real-valued process at lag τ = τ_0 is given by Eq. (8.75) (see Fig. 2.29).

R_X(τ) = E[X(t) X(t − τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t − τ) dt    (8.74)

At lag τ = τ_0:

R̂_X(τ_0) = (1/N) Σ_{n=1}^{N} x(t_n) x(t_n − τ_0)    (8.75)
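A sketch of the estimator of Eq. (8.75) applied to an assumed white (uncorrelated) sample sequence, where the answer is known: R̂(0) should be near the variance (1/3 for uniforms on (−1, 1)) and R̂ at any nonzero lag near zero. The sequence, seed, and lags are illustrative choices:

```python
# Sketch: the autocorrelation estimator of Eq. (8.75) on an assumed white
# sequence of uniforms on (-1, 1): R_hat(0) ~ 1/3, R_hat(lag != 0) ~ 0.
import random

random.seed(7)
N = 100_000
x = [random.uniform(-1.0, 1.0) for _ in range(N)]

def R_hat(lag: int) -> float:
    """Time-average estimate of R_X(lag) per Eq. (8.75), over valid indices."""
    return sum(x[n] * x[n - lag] for n in range(lag, N)) / (N - lag)

r0, r3 = R_hat(0), R_hat(3)
```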

Page 65

Fig. 8.16

Page 66

Page 67

8.8 Spectra of Random Signals

Figure 8.17 shows a plot of the waveform of a windowed sample function x_T(t) on the interval −T < t < T. We write the Fourier transform of the sample function x_T(t) as in Eq. (8.78).
Effectively, the Fourier transform has converted a family of random variables X(t), indexed by parameter t, to a new family of random variables Ξ_T(f), indexed by parameter f. The ensemble-averaged value of the magnitude squared of the new random process is E[|Ξ_T(f)|²], and the corresponding power spectral density of the random process X(t) is defined in Eq. (8.79).

ξ_T(f) = ∫_{−∞}^{∞} x_T(t) exp(−j2πft) dt    (8.78)

S_X(f) = lim_{T→∞} (1/2T) E[|Ξ_T(f)|²]    (8.79)

(See Fig. 8.17.)

Page 68

Fig. 8.17

Page 69

The discrete Fourier transform is defined as in Eq. (8.80), where W = exp(−j2π/N). We may estimate the power spectral density of a random process by the following three steps:
1. Partition the sample function x(t) into M sections of length NT_s and sample at intervals T_s.
2. Perform a DFT on each section of length NT_s. Let {ξ_{k+mN}}, where m = 0, ..., M − 1, represent the M DFT outputs, one set for each section.
3. Average the magnitude squared of each DFT; the power spectral density estimate is then given by Eq. (8.81).

ξ_k = Σ_{n=0}^{N−1} x_n W^{kn}    (8.80)

Ŝ_X(k) = (T_s/(MN)) Σ_{m=0}^{M−1} |ξ_{k+mN}|² = (T_s/(MN)) Σ_{m=0}^{M−1} |Σ_{n=0}^{N−1} x_{n+mN} W^{kn}|², k = 0, ..., N − 1    (8.81)
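The three steps above can be sketched directly, with a naive O(N²) DFT in place of the FFT that would be used in practice. The section length N, section count M, sampling interval T_s, and white-noise input are assumed illustrative values, not taken from the text:

```python
# Sketch of the three-step PSD estimate (naive DFT, stdlib only).
# Input: white Gaussian noise of unit variance, so the averaged estimate
# of Eq. (8.81) should be roughly flat at T_s * sigma^2 = 1.
import cmath
import random

random.seed(8)
N, M, Ts = 16, 40, 1.0
x = [random.gauss(0.0, 1.0) for _ in range(M * N)]

def dft(section):
    """xi_k = sum_n x_n W^{kn} with W = exp(-j 2 pi / N), Eq. (8.80)."""
    Nn = len(section)
    W = cmath.exp(-2j * cmath.pi / Nn)
    return [sum(section[n] * W ** (k * n) for n in range(Nn)) for k in range(Nn)]

# Steps 1-3: partition into M sections, transform each, average |DFT|^2.
S_hat = [0.0] * N
for m in range(M):
    xi = dft(x[m * N:(m + 1) * N])
    for k in range(N):
        S_hat[k] += abs(xi[k]) ** 2
S_hat = [Ts * s / (M * N) for s in S_hat]   # Eq. (8.81)
```

Averaging over M sections trades frequency resolution for a lower-variance estimate, which is the point of step 3.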

Page 70

Properties of the power spectral density
Wiener-Khintchine relations: Eqs. (8.82) and (8.83). The Wiener-Khintchine relations show that if either the autocorrelation or the power spectral density of the random process is known, the other may be found exactly.
Property 1: Mean-square value. The mean-square value of a stationary process equals the total area under the graph of the power spectral density, Eq. (8.84).

∫∞

∞−−= )82.8()2exp()()( ττπτ dfjRfS XX

∫∞

∞−= )83.8()2exp()()( dffjfSR XX τπτ

∫∞

∞−= )84.8()(]|)([|E 2 dffStX X

]|)([|E)0( 2tXRX =
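The Wiener-Khintchine pair can be checked numerically. The sketch below uses the illustrative pair R_X(τ) = exp(-|τ|) ↔ S_X(f) = 2/(1 + (2πf)²), which is not from the text, and evaluates Eq. (8.82) and Eq. (8.84) by simple Riemann sums:

```python
import numpy as np

# Illustrative Wiener-Khintchine pair (an assumption for this check):
#   R_X(tau) = exp(-|tau|)  <->  S_X(f) = 2 / (1 + (2*pi*f)^2)
tau = np.linspace(-50, 50, 400_001)
dtau = tau[1] - tau[0]
R = np.exp(-np.abs(tau))

def S_numeric(f):
    # Eq. (8.82): S_X(f) = integral of R_X(tau) exp(-j 2 pi f tau) dtau
    return (R * np.exp(-2j * np.pi * f * tau)).sum().real * dtau

for f in (0.0, 0.1, 0.5):
    print(f, S_numeric(f), 2 / (1 + (2 * np.pi * f) ** 2))

# Property 1 (Eq. 8.84): the area under S_X(f) equals E[X^2] = R_X(0) = 1
fgrid = np.linspace(-20, 20, 200_001)
df = fgrid[1] - fgrid[0]
area = (2 / (1 + (2 * np.pi * fgrid) ** 2)).sum() * df
print(area)  # close to R_X(0) = 1 (small deficit from the truncated tails)
```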


Property 2: Nonnegativity
The power spectral density of a stationary random process is always nonnegative; that is,

 S_X(f) ≥ 0, for all f  (8.85)

Property 3: Symmetry
The power spectral density of a real random process is an even function of frequency; that is,

 S_X(-f) = S_X(f)  (8.86)

This follows by substituting -f into Eq. (8.82),

 S_X(-f) = ∫_{-∞}^{∞} R_X(τ) exp(j2πfτ) dτ

and then replacing τ with -τ and using R_X(-τ) = R_X(τ):

 S_X(-f) = ∫_{-∞}^{∞} R_X(τ) exp(-j2πfτ) dτ = S_X(f)


Property 4: Filtered random process
If a stationary random process X(t) with spectrum S_X(f) is passed through a linear filter with frequency response H(f), the spectrum of the stationary output random process Y(t) is given by

 S_Y(f) = |H(f)|² S_X(f)  (8.87)
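Property 4 can be verified by simulation. The sketch below (an illustrative setup, not from the text) passes unit-variance white noise, for which S_X is flat, through the first-order recursive filter y[n] = a·y[n-1] + x[n] and compares an averaged periodogram of the output against |H|²·S_X:

```python
import numpy as np

rng = np.random.default_rng(1)
a, N, M = 0.8, 512, 2000
x = rng.standard_normal(M * N)           # white input: S_X(f) = 1 (flat)

# first-order recursive filter, H(z) = 1 / (1 - a z^-1)
y = np.empty_like(x)
prev = 0.0
for n, xn in enumerate(x):
    prev = a * prev + xn
    y[n] = prev

# averaged periodogram of the output (normalized so white noise -> 1)
segs = y.reshape(M, N)
S_hat = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / N

# Eq. (8.87): S_Y = |H|^2 S_X with |H(e^{jw})|^2 = 1/(1 - 2 a cos w + a^2)
w = 2 * np.pi * np.arange(N) / N
S_theory = 1.0 / (1 - 2 * a * np.cos(w) + a * a)

print(np.max(np.abs(S_hat - S_theory) / S_theory))  # small relative error
```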


8.9 Gaussian Processes

The joint distribution of N Gaussian random variables may be written as

 f_X(x) = (2π)^{-N/2} |Λ|^{-1/2} exp{-(x − μ)ᵀ Λ⁻¹ (x − μ)/2}  (8.90)

where
 X = (X_1, X_2, …, X_N) represents an N-dimensional vector of Gaussian random variables,
 x = (x_1, x_2, …, x_N) is the corresponding vector of indeterminates,
 μ = (E[X_1], E[X_2], …, E[X_N]) is the N-dimensional vector of means, and
 Λ is the N-by-N covariance matrix with individual elements given by Λ_ij = Cov(X_i, X_j).
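Eq. (8.90) can be evaluated directly. A minimal sketch in NumPy; the mean vector and covariance matrix below are illustrative values only:

```python
import numpy as np

def gaussian_pdf(x, mu, Lam):
    """f_X(x) = (2*pi)^(-N/2) |Lam|^(-1/2) exp(-(x-mu)^T Lam^-1 (x-mu)/2)"""
    N = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(Lam, d)          # (x-mu)^T Lam^-1 (x-mu)
    norm = (2 * np.pi) ** (N / 2) * np.sqrt(np.linalg.det(Lam))
    return np.exp(-quad / 2) / norm

mu = np.array([0.0, 1.0])
Lam = np.array([[2.0, 0.5],
                [0.5, 1.0]])                    # Lam_ij = Cov(X_i, X_j)
print(gaussian_pdf(np.array([0.5, 0.5]), mu, Lam))
```

For a diagonal Λ the density factors into a product of one-dimensional Gaussian densities, which is a useful sanity check.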


A Gaussian process has the following properties:

1. If a Gaussian process is wide-sense stationary, then it is also stationary in the strict sense.
2. If a Gaussian process is applied to a stable linear filter, then the random process Y(t) produced at the output of the filter is also Gaussian.
3. If integration is defined in the mean-square sense, then we may interchange the order of the operations of integration and expectation with a Gaussian random process.

The first property comes from observing that if a Gaussian process is wide-sense stationary, then the mean vector and covariance matrix that completely specify the joint density of Eq. (8.90) do not depend on the choice of time origin, so all of its finite-order distributions are shift-invariant.

The second property comes from observing that the filtering operation can be written as

 Y(t) = ∫₀ᵗ h(t − s) X(s) ds  (8.91)


1. The integral of Eq. (8.91) is defined as the mean-square limit of the sums

 Y(t) = lim_{Δs→0} Σ_i h(t − iΔs) x(iΔs) Δs

 and we observe that the right-hand side is a weighted sum of the Gaussian random variables X(iΔs).
2. Recall from the properties of Gaussian random variables that a weighted sum of Gaussian random variables is another Gaussian random variable.
3. If a sequence of Gaussian random variables converges in the mean-square sense, then the result is a Gaussian random variable.

The third property of Gaussian processes implies that if Y(t) is given by Eq. (8.91), then the mean of the output is given by

 E[Y(t)] = E[∫₀ᵗ h(t − s) X(s) ds]
     = ∫₀ᵗ h(t − s) E[X(s)] ds
     = μ_Y(t)
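The interchange of expectation and integration in the last equation can be illustrated by Monte Carlo simulation. In the sketch below the impulse response, the input mean, and the discretization grid are all illustrative choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(2)
t_grid = np.linspace(0.0, 1.0, 101)      # discretize s on [0, t], t = 1
ds = t_grid[1] - t_grid[0]
h = np.exp(-5 * t_grid)                  # impulse response h(s)
mu_X = 2.0                               # constant input mean E[X(s)]

trials = 20_000
X = mu_X + rng.standard_normal((trials, t_grid.size))   # Gaussian input
# discretized Y(t) = sum_i h(t - s_i) X(s_i) ds  (h reversed since t = 1)
Y_t = (X * h[::-1]).sum(axis=1) * ds

lhs = Y_t.mean()                         # E[Y(t)] estimated by simulation
rhs = mu_X * h.sum() * ds                # integral of h(t - s) E[X(s)] ds
print(lhs, rhs)                          # the two agree to sampling error
```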


8.10 White Noise

The power spectral density of white noise is independent of frequency.

We denote the power spectral density of a white noise process W(t) as

 S_W(f) = N₀/2  (8.92)

The autocorrelation of white noise is given by

 R_W(τ) = (N₀/2) δ(τ)  (8.93)

White noise has infinite average power and, as such, it is not physically realizable.

White noise has convenient mathematical properties and is therefore useful in system analysis.

Fig. 8.18
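A discrete-time stand-in for white noise helps make Eqs. (8.92)-(8.93) concrete: i.i.d. Gaussian samples of variance σ² = N₀/(2T_s) approximate a white process of PSD N₀/2 band-limited to |f| < 1/(2T_s), and σ² grows without bound as T_s → 0, mirroring the infinite power of ideal white noise. The values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
N0, Ts = 2.0, 1e-3
sigma2 = N0 / (2 * Ts)                   # sample variance of the surrogate
w = np.sqrt(sigma2) * rng.standard_normal(100_000)

# autocorrelation estimate: a spike of height sigma2 at lag 0 and near
# zero elsewhere -- the sampled analogue of R_W(tau) = (N0/2) delta(tau)
R_hat = [np.mean(w[:w.size - k] * w[k:]) for k in range(4)]
print(R_hat)
```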


Fig. 8.19


Fig. 8.20


8.11 Narrowband Noise

Narrowband noise: a communication receiver typically includes a narrowband band-pass filter, and the noise process appearing at the output of such a filter is called narrowband noise.

Narrowband noise can be represented mathematically using in-phase and quadrature components:

 N(t) = N_I(t) cos(2πf_c t) − N_Q(t) sin(2πf_c t)  (8.97)

N_I(t) is called the in-phase component of N(t) and N_Q(t) is the quadrature component.

Fig. 8.21

Fig. 8.22
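The representation of Eq. (8.97) can be exercised in simulation: build narrowband noise from slowly varying in-phase and quadrature components, then recover N_I(t) by coherent downconversion (multiplying by 2 cos(2πf_c t)) followed by low-pass filtering. The sample rate, carrier, bandwidth, and ideal FFT-mask filter below are illustrative choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(4)
fs, fc, B, n = 8192.0, 1024.0, 32.0, 16_384
t = np.arange(n) / fs
f = np.fft.fftfreq(n, d=1 / fs)

def lowpass_gaussian(cutoff):
    """Gaussian noise band-limited to |f| <= cutoff via an ideal FFT mask."""
    spec = np.fft.fft(rng.standard_normal(n))
    return np.fft.ifft(np.where(np.abs(f) <= cutoff, spec, 0)).real

NI, NQ = lowpass_gaussian(B), lowpass_gaussian(B)
# Eq. (8.97): narrowband noise from in-phase/quadrature components
N = NI * np.cos(2 * np.pi * fc * t) - NQ * np.sin(2 * np.pi * fc * t)

# 2 N(t) cos(2 pi fc t) = N_I(t) + (terms centered at 2 fc); the low-pass
# filter rejects the double-frequency terms and leaves N_I(t)
mixed = 2 * N * np.cos(2 * np.pi * fc * t)
NI_rec = np.fft.ifft(np.where(np.abs(f) <= B, np.fft.fft(mixed), 0)).real

print(np.max(np.abs(NI_rec - NI)))  # ~ 0 (exact-bin carrier, ideal mask)
```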


The in-phase and quadrature components of narrowband noise have the following important properties:

1. The in-phase component N_I(t) and quadrature component N_Q(t) of narrowband noise N(t) have zero mean.
2. If the narrowband noise N(t) is Gaussian, then its in-phase and quadrature components are Gaussian.
3. If the narrowband noise N(t) is stationary, then its in-phase and quadrature components are stationary.
4. Both the in-phase component N_I(t) and the quadrature component N_Q(t) have the same power spectral density. This power spectral density is related to the power spectral density S_N(f) of the narrowband noise by

 S_{N_I}(f) = S_{N_Q}(f) = S_N(f − f_c) + S_N(f + f_c),  −B ≤ f ≤ B
 S_{N_I}(f) = S_{N_Q}(f) = 0,  otherwise  (8.98)

5. The in-phase component N_I(t) and quadrature component N_Q(t) have the same variance as the narrowband noise N(t).

Fig. 8.23


For the ideal band-pass noise of Fig. 8.23, the spectrum of the in-phase component of the narrowband noise is given by

 S_{N_I}(f) = N₀,  −B ≤ f ≤ B
 S_{N_I}(f) = 0,  otherwise

The in-phase component has the same variance (power) as the narrowband noise:

 ∫_{-∞}^{∞} S_{N_I}(f) df = ∫_{-∞}^{∞} S_N(f) df = 2N₀B


Noise-equivalent bandwidth
Suppose white noise of power spectral density S_W(f) = N₀/2 is applied to a filter with frequency response H(f). The average output noise power is

 P_N = ∫_{-∞}^{∞} |H(f)|² S_W(f) df = (N₀/2) ∫_{-∞}^{∞} |H(f)|² df = N₀ ∫₀^∞ |H(f)|² df  (8.102)

For an ideal low-pass filter of bandwidth B_N and midband gain |H(0)|, the average output noise power is

 P_N = N₀ B_N |H(0)|²  (8.103)

Equating the two expressions for P_N defines the noise-equivalent bandwidth for a low-pass filter:

 B_N = ∫₀^∞ |H(f)|² df / |H(0)|²  (8.104)
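As a worked example of Eq. (8.104), consider a first-order RC low-pass filter with H(f) = 1/(1 + j2πfRC); analytically B_N = 1/(4RC), which is π/2 times the 3-dB bandwidth 1/(2πRC). The sketch below checks this numerically; the value of RC and the truncation of the integral are illustrative choices:

```python
import numpy as np

RC = 1e-3
f = np.linspace(0.0, 1e6, 2_000_001)          # truncate the infinite integral
H2 = 1.0 / (1.0 + (2 * np.pi * f * RC) ** 2)  # |H(f)|^2, so |H(0)|^2 = 1
B_N = H2.sum() * (f[1] - f[0]) / H2[0]        # Eq. (8.104)
print(B_N, 1 / (4 * RC))                      # numerical vs analytic value
```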


The preceding definition gives the noise-equivalent bandwidth for a low-pass filter. The noise-equivalent bandwidth for a band-pass filter, illustrated in Fig. 8.25, may be defined analogously as

 B_N = ∫₀^∞ |H(f)|² df / |H(f_c)|²  (8.105)

The general result is

 P_N = N₀ |H(f_c)|² B_N  (8.106)

The effect of passing white noise through a filter may thus be separated into two parts:
 The center-frequency power gain |H(f_c)|².
 The noise-equivalent bandwidth B_N, representing the frequency selectivity of the filter.

Fig. 8.24  Fig. 8.25


8.12 Summary and Discussion

We introduced a random experiment as a model for unpredictable phenomena; and the relative frequency definition of probability as a means of assigning a likelihood to the outcome of a random experiment.

Random variables were introduced as functions whose domain is the sample space of the random experiment and whose range is the real numbers.

The concept of expectation was used to define the statistical moments and covariance of random variables.

Gaussian random variables were introduced as a particularly important type of random variable in the study of communication systems.

A random process was defined as a family of random variables indexed by time as a parameter.

Gaussian processes and white noise were introduced as important random processes in the analysis of communication systems.

Narrowband noise has in-phase and quadrature components, similar to deterministic band-pass signals.

The two subsequent chapters will illustrate the importance of the material presented in this chapter in designing receivers and evaluating communication system performance.
