Expected Value and Variance for Continuous Random Variables

Bernd Schröder
Louisiana Tech University, College of Engineering and Science



Introduction

1. The underlying ideas for expected value and variance are the same as for discrete distributions.

2. The expected value gives us the expected long-term average of measurements. (The Central Limit Theorem will formally confirm this statement.)

3. The variance is a measure of how spread out the distribution is.


From Discrete to Continuous Probability

1. In the discrete expected value, the outcome x contributes a summand x P(X = x).

2. In the continuous setting, P(X = x) = 0, but the probability to be in [x, x+dx] is approximately f_X(x) dx.

[Figure: the density f_X with the thin strip over [x, x+dx] shaded; the shaded area is approximately f_X(x) dx.]

3. So an interval [x, x+dx] should contribute about x f_X(x) dx.

4. The summation becomes an integral. “Integrals are continuous sums.”
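The step from sums to integrals can also be seen numerically. The following sketch is not part of the original slides; it assumes NumPy is available and uses the uniform density on [0, 1] purely for illustration. It adds up the contributions x · f_X(x) · dx over a fine grid, exactly as in item 3, and the sum lands near the exact expected value 1/2.

```python
import numpy as np

# Density of the uniform distribution on [0, 1]: f_X(x) = 1 there, 0 elsewhere.
def f_X(x):
    return np.where((x >= 0.0) & (x <= 1.0), 1.0, 0.0)

# Partition [0, 1] into small intervals [x, x + dx]; each contributes about x * f_X(x) * dx.
dx = 1e-4
x = np.arange(0.0, 1.0, dx)
approx_E = np.sum(x * f_X(x) * dx)   # "integrals are continuous sums"

print(approx_E)   # close to the exact expected value 0.5
```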


Definition. The expected value or mean of a continuous random variable X with probability density function f_X is

E(X) := \mu_X := \int_{-\infty}^{\infty} x f_X(x)\,dx.

This formula is exactly the same as the formula for the center of mass of a linear mass density \rho(x) of total mass 1,

C_x = \int_{-\infty}^{\infty} x \rho(x)\,dx.

Hence the analogy between probability and mass, and between probability density and mass density, persists.

As noted, the Central Limit Theorem will show that the expected value gives the long-term averages of sample values.
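As a sanity check of the definition, the integral can be evaluated numerically. This sketch is not from the slides; it assumes SciPy is installed, and the triangular density on [0, 2] is an arbitrary choice made only for illustration.

```python
from scipy.integrate import quad

# A triangular density on [0, 2]: f_X(x) = 1 - |x - 1| there, 0 elsewhere.
def f_X(x):
    return max(0.0, 1.0 - abs(x - 1.0))

# Check that f_X integrates to 1, then compute E(X) = integral of x * f_X(x) dx.
total, _ = quad(f_X, 0.0, 2.0)
mean, _ = quad(lambda x: x * f_X(x), 0.0, 2.0)

print(total)  # about 1.0
print(mean)   # about 1.0, the center of mass of the symmetric triangle
```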


The Expected Value Itself Need Not Be Very Likely

[Figure: a density with E(X) marked on the horizontal axis at a point where the density is small.]

So we should not necessarily expect measurements to give numbers near the expected value. Instead, long-term averages will be near the expected value. (So maybe “expected average” would be more accurate, but “expected value” is customary.)


Theorem. The probability density function of a random variable U_{A,B} that is uniformly distributed over the interval [A,B] is

f(x;A,B) = \begin{cases} \frac{1}{B-A}, & \text{for } A \le x \le B, \\ 0, & \text{otherwise.} \end{cases}

The expected value is E(U_{A,B}) = \frac{A+B}{2}.

Proof.

E(U_{A,B}) = \int_{-\infty}^{\infty} x f(x;A,B)\,dx
           = \int_{A}^{B} x \frac{1}{B-A}\,dx
           = \frac{1}{2(B-A)} x^{2} \Big|_{A}^{B}
           = \frac{B^{2}-A^{2}}{2(B-A)}
           = \frac{B+A}{2}.
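A quick numerical check of the theorem (an illustrative sketch, not part of the slides; A = 3 and B = 7 are arbitrary choices, and NumPy is assumed): the long-term average of uniform samples approaches (A+B)/2, and the closed form from the proof gives the same number.

```python
import numpy as np

A, B = 3.0, 7.0
rng = np.random.default_rng(0)

# Long-term average of samples drawn uniformly from [A, B].
samples = rng.uniform(A, B, size=1_000_000)
print(samples.mean())        # close to (A + B) / 2 = 5.0

# Direct evaluation of the expression from the proof: (B^2 - A^2) / (2 (B - A)).
print((B**2 - A**2) / (2 * (B - A)))   # exactly 5.0
```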


Visualization

[Figure: the uniform density on [A,B] with the expected value (A+B)/2 marked at the midpoint of the interval, where the distribution balances.]


Theorem. The probability density function of an exponentially distributed random variable W_\Theta is

f(x;\Theta) = \begin{cases} \frac{1}{\Theta} e^{-x/\Theta}, & \text{for } x \ge 0, \\ 0, & \text{otherwise.} \end{cases}

The expected value is E(W_\Theta) = \Theta.

Proof. Good exercise for integration by parts.

Warning. Exponential distributions are also often given using the parameter \lambda = \frac{1}{\Theta}.
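Both the result and the parameterization warning can be illustrated numerically. This is a sketch, not part of the slides; it assumes NumPy and SciPy are available, and Θ = 2.5 is an arbitrary value. Note that NumPy's exponential sampler takes the scale parameter Θ, whereas some texts and libraries use the rate λ = 1/Θ instead.

```python
import numpy as np
from scipy.integrate import quad

Theta = 2.5
rng = np.random.default_rng(0)

# E(W_Theta) as the integral of x * (1/Theta) * exp(-x/Theta) over [0, infinity).
mean, _ = quad(lambda x: x * (1.0 / Theta) * np.exp(-x / Theta), 0.0, np.inf)
print(mean)   # about 2.5 = Theta

# NumPy parameterizes by the scale Theta, not the rate lambda = 1/Theta.
samples = rng.exponential(scale=Theta, size=1_000_000)
print(samples.mean())   # also close to Theta
```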


Visualization

[Figure: the exponential density with the expected value Θ marked on the horizontal (t) axis.]


Theorem. The probability density function of a normally distributed random variable N_{\mu,\sigma} with parameters \mu and \sigma is

f(x;\mu,\sigma) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}.

The expected value is E(N_{\mu,\sigma}) = \mu.

Proof. The substitution z := \frac{x-\mu}{\sigma} leads to \frac{dz}{dx} = \frac{1}{\sigma}, or dx = \sigma\,dz.

E(N_{\mu,\sigma}) = \int_{-\infty}^{\infty} x \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx
                 = \int_{-\infty}^{\infty} x \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\,dx
                 = \int_{-\infty}^{\infty} (z\sigma+\mu) \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{z^2}{2}}\,\sigma\,dz
                 = \int_{-\infty}^{\infty} z\sigma \frac{1}{\sqrt{2\pi}} e^{-\frac{z^2}{2}}\,dz + \int_{-\infty}^{\infty} \mu \frac{1}{\sqrt{2\pi}} e^{-\frac{z^2}{2}}\,dz = 0 + \mu = \mu.

(The first integral vanishes because its integrand is odd; the second equals \mu because the standard normal density integrates to 1.)
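A numerical version of the same computation (a sketch, not from the slides; µ = −1 and σ = 2 are arbitrary choices, and SciPy is assumed): integrating x · f(x; µ, σ) recovers µ, matching the substitution argument above.

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = -1.0, 2.0

def f(x):
    # Normal density with parameters mu and sigma.
    return np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

mean, _ = quad(lambda x: x * f(x), -np.inf, np.inf)
print(mean)   # about -1.0 = mu
```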


Visualization

[Figure: the normal density with the expected value µ marked at its center of symmetry.]


Theorem. If X is a continuous random variable with density function f_X and g(·) is a function, then

E\big(g(X)\big) = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx.

Theorem. If X is a continuous random variable, g(·) and h(·) are functions, and a, b, c are numbers, then the expected value of ag(X) + bh(X) + c is

E\big(ag(X) + bh(X) + c\big) = a E\big(g(X)\big) + b E\big(h(X)\big) + c.
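Both theorems can be checked numerically in a concrete case. This sketch is not from the slides; the uniform density on [0, 1] and the choices g(x) = x², h(x) = x, a = 2, b = −3, c = 5 are illustrative only, and SciPy is assumed.

```python
from scipy.integrate import quad

# X uniform on [0, 1], so f_X(x) = 1 there.
f_X = lambda x: 1.0
g = lambda x: x**2
h = lambda x: x

# E(func(X)) = integral of func(x) * f_X(x) dx over [0, 1].
E = lambda func: quad(lambda x: func(x) * f_X(x), 0.0, 1.0)[0]

a, b, c = 2.0, -3.0, 5.0
lhs = E(lambda x: a * g(x) + b * h(x) + c)   # E(a g(X) + b h(X) + c)
rhs = a * E(g) + b * E(h) + c                # a E(g(X)) + b E(h(X)) + c

print(lhs, rhs)   # both about 2*(1/3) - 3*(1/2) + 5 = 4.1666...
```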


The “Spread” of Measurements

[Figure: two number lines of measurements, “hand measurement” and “electronic measurement,” each marked with its average; the two data sets have the same average but different spreads about it.]


Definition. The variance of a continuous random variable X with probability density function f_X is

V(X) := E((X − E(X))²) = ∫_{−∞}^{∞} (x − E(X))² f_X(x) dx.

The standard deviation of X is σ_X := √(V(X)).

Theorem. V(X) = E(X²) − (E(X))².

Proof.
V(X) = E((X − E(X))²)
     = E(X² − 2X·E(X) + (E(X))²)
     = E(X²) − 2·E(X)·E(X) + (E(X))²
     = E(X²) − (E(X))².
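As a quick illustration (not part of the slides), the defining formula and the shortcut formula can be compared numerically for a concrete density. The uniform density on [2, 5] below is an assumed example; scipy.integrate.quad carries out the integrals.

# Minimal sketch (assumed example): check numerically that E((X - E(X))^2)
# and E(X^2) - (E(X))^2 agree for a concrete density f_X, here the uniform
# density on [A, B].
from scipy.integrate import quad

A, B = 2.0, 5.0
f = lambda x: 1.0 / (B - A)                      # uniform density on [A, B]

EX  = quad(lambda x: x * f(x), A, B)[0]          # E(X)
EX2 = quad(lambda x: x**2 * f(x), A, B)[0]       # E(X^2)

var_def      = quad(lambda x: (x - EX)**2 * f(x), A, B)[0]   # E((X - E(X))^2)
var_shortcut = EX2 - EX**2                                    # E(X^2) - (E(X))^2

print(var_def, var_shortcut)                     # both ≈ 0.75 = (B - A)^2 / 12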

Theorem. Let U_{A,B} be a random variable that is uniformly distributed over the interval [A, B]. Then V(U_{A,B}) = (B − A)²/12.

Proof.
V(U_{A,B}) = E(U_{A,B}²) − (E(U_{A,B}))²
= ∫_{−∞}^{∞} x² f_{A,B}(x) dx − ((B + A)/2)²
= ∫_A^B x² · 1/(B − A) dx − (B + A)²/4
= x³/(3(B − A)) |_A^B − (B + A)²/4
= (B³ − A³)/(3(B − A)) − (B + A)²/4
= (B − A)(B² + AB + A²)/(3(B − A)) − (B + A)²/4
= (B² + AB + A²)/3 − (B² + 2AB + A²)/4
= (B − A)²/12.
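A quick sanity check (an assumed illustration, not from the slides): the long-run spread of simulated uniform samples should approach the theoretical value (B − A)²/12, in line with the interpretation of variance as a measure of spread.

# Sketch (assumed example): the sample variance of many uniform draws on [A, B]
# should be close to the theoretical value (B - A)^2 / 12.
import numpy as np

rng = np.random.default_rng(0)
A, B = 2.0, 5.0
samples = rng.uniform(A, B, size=1_000_000)

print(samples.var())        # ≈ 0.75
print((B - A)**2 / 12)      # 0.75 exactly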

Theorem. Let W_Θ be a random variable that is exponentially distributed with parameter Θ. Then V(W_Θ) = Θ².

Proof. Good exercise in integration by parts.
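Since the proof is left as an exercise, here is one possible sketch. It assumes the parametrization f_{W_Θ}(x) = (1/Θ) e^{−x/Θ} for x ≥ 0 with E(W_Θ) = Θ, which is the parametrization consistent with the stated result:

E(W_Θ²) = ∫_0^∞ x² · (1/Θ) e^{−x/Θ} dx
        = [−x² e^{−x/Θ}]_0^∞ + ∫_0^∞ 2x e^{−x/Θ} dx        (integration by parts)
        = 0 + 2Θ ∫_0^∞ x · (1/Θ) e^{−x/Θ} dx
        = 2Θ · E(W_Θ) = 2Θ²,

so V(W_Θ) = E(W_Θ²) − (E(W_Θ))² = 2Θ² − Θ² = Θ².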

Theorem. Let N_{µ,σ} be a random variable that is normally distributed with parameters µ and σ. Then V(N_{µ,σ}) = σ².

Proof. The substitution z := (x − µ)/σ leads to dz/dx = 1/σ, or dx = σ dz.

V(N_{µ,σ}) = ∫_{−∞}^{∞} (x − µ)² · 1/(σ√(2π)) · e^{−(x−µ)²/(2σ²)} dx
= ∫_{−∞}^{∞} (σz)² · 1/√(2π) · e^{−z²/2} dz
= σ² ∫_{−∞}^{∞} z · z · 1/√(2π) · e^{−z²/2} dz    (integration by parts)
= σ² [ −z · 1/√(2π) · e^{−z²/2} |_{z→−∞}^{z→∞} − ∫_{−∞}^{∞} −1/√(2π) · e^{−z²/2} dz ]
= σ² [0 + 1] = σ².
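As with the uniform case, the result can be checked numerically (an assumed illustration, not part of the slides) by integrating the defining formula for the variance directly against the normal density:

# Sketch (assumed example): integrate (x - mu)^2 times the normal density
# over the whole real line and compare with sigma^2.
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.0, 2.0
f = lambda x: np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

var = quad(lambda x: (x - mu)**2 * f(x), -np.inf, np.inf)[0]
print(var, sigma**2)        # both ≈ 4.0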
