Chapter 2 Unbiased Estimation
If the average estimate from several random samples equals the population parameter, then the estimator is unbiased. For example, if credit card holders in a city were repeatedly sampled at random and asked what their account balances were as of a specific date, the average of the results across all samples would equal the population parameter. If, however, only credit card holders in one specific business were sampled, the average of the sample estimates would be a biased estimator of all account balances for the city and would not equal the population parameter.
If the mean value of an estimator equals the true value of the population quantity it estimates, it is called an unbiased estimator. If the mean value of an estimator is either less than or greater than the true value of the quantity it estimates, the estimator is called a biased estimator. For example, suppose you decide to choose the smallest or largest observation in a sample as the estimator of the population mean. Such an estimator would be biased because the average of the values of this estimator would always be less than or greater than the true population mean.
2.1 Unbiased Estimates and Mean Square Error
Definition 2.1.1 A statistic $T(X)$ is called an unbiased estimator for a function of the parameter $g(\theta)$, provided that for every choice of $\theta$,

$$E_\theta T(X) = g(\theta). \tag{2.1.1}$$

Any estimator that is not unbiased is called biased. The bias is denoted by $b(\theta)$:

$$b(\theta) = E_\theta T(X) - g(\theta). \tag{2.1.2}$$
© Springer Science+Business Media Singapore 2016. U.J. Dixit, Examples in Parametric Inference with R, DOI 10.1007/978-981-10-0889-4_2
We will now define the mean square error (MSE):

$$\mathrm{MSE}[T(X)] = E[T(X) - g(\theta)]^2 = E[T(X) - ET(X) + b(\theta)]^2$$
$$= E[T(X) - ET(X)]^2 + 2b(\theta)E[T(X) - ET(X)] + b^2(\theta)$$
$$= V[T(X)] + b^2(\theta) = \text{Variance of } T(X) + [\text{bias of } T(X)]^2$$
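The decomposition can be checked numerically. Below is a minimal Python sketch; the estimator used (the sample maximum of Uniform$(0, \theta)$ draws, as an estimator of $\theta$) is a hypothetical choice for illustration, not one from the text:

```python
import random

# Monte Carlo check of MSE[T] = Var[T] + bias^2:
# T = max of n Uniform(0, theta) draws, a deliberately biased estimator of theta.
random.seed(1)
theta = 2.0                    # true parameter; here g(theta) = theta
n, reps = 10, 20000
ts = [max(random.uniform(0, theta) for _ in range(n)) for _ in range(reps)]

mean_t = sum(ts) / reps
bias = mean_t - theta          # negative: the maximum underestimates theta
var = sum((t - mean_t) ** 2 for t in ts) / reps
mse = sum((t - theta) ** 2 for t in ts) / reps
print(abs(mse - (var + bias ** 2)) < 1e-9)   # prints True
```

Note the identity holds exactly for the empirical moments, not just in expectation, because it is an algebraic consequence of expanding the square.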
Example 2.1.1 Let $(X_1, X_2, \ldots, X_n)$ be Bernoulli rvs with parameter $\theta$, where $\theta$ is unknown. $\bar{X}$ is an estimator for $\theta$. Is it unbiased?

$$E\bar{X} = \frac{1}{n}\sum_{i=1}^{n} EX_i = \frac{n\theta}{n} = \theta$$

Thus, $\bar{X}$ is an unbiased estimator for $\theta$. We denote it as $\hat{\theta} = \bar{X}$.

$$\mathrm{Var}(\bar{X}) = \frac{1}{n^2}\sum_{i=1}^{n} V(X_i) = \frac{n\theta(1-\theta)}{n^2} = \frac{\theta(1-\theta)}{n}$$
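Both facts are easy to see in simulation; the Python sketch below uses arbitrary illustrative values $\theta = 0.3$ and $n = 50$ (my choices, not from the text):

```python
import random

# Check E(Xbar) ≈ theta and Var(Xbar) ≈ theta(1 - theta)/n for Bernoulli samples.
random.seed(0)
theta, n, reps = 0.3, 50, 20000
xbars = [sum(random.random() < theta for _ in range(n)) / n for _ in range(reps)]

m = sum(xbars) / reps
v = sum((x - m) ** 2 for x in xbars) / reps
print(round(m, 2))     # ≈ theta = 0.3
print(round(v, 4))     # ≈ theta*(1 - theta)/n = 0.0042
```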
Example 2.1.2 Let $X_i$ $(i = 1, 2, \ldots, n)$ be iid rvs from $N(\mu, \sigma^2)$, where $\mu$ and $\sigma^2$ are unknown.

Define $nS^2 = \sum_{i=1}^{n}(X_i - \bar{X})^2$ and $n\hat{\sigma}^2 = \sum_{i=1}^{n}(X_i - \mu)^2$. Consider

$$\sum_{i=1}^{n}(X_i - \mu)^2 = \sum_{i=1}^{n}(X_i - \bar{X} + \bar{X} - \mu)^2$$
$$= \sum_{i=1}^{n}(X_i - \bar{X})^2 + 2(\bar{X} - \mu)\sum_{i=1}^{n}(X_i - \bar{X}) + n(\bar{X} - \mu)^2$$
$$= \sum_{i=1}^{n}(X_i - \bar{X})^2 + n(\bar{X} - \mu)^2,$$

since $\sum_{i=1}^{n}(X_i - \bar{X}) = 0$. Therefore,

$$\sum_{i=1}^{n}(X_i - \bar{X})^2 = \sum_{i=1}^{n}(X_i - \mu)^2 - n(\bar{X} - \mu)^2$$
$$E\left[\sum_{i=1}^{n}(X_i - \bar{X})^2\right] = E\left[\sum_{i=1}^{n}(X_i - \mu)^2\right] - nE[(\bar{X} - \mu)^2] = n\sigma^2 - n\,\frac{\sigma^2}{n} = n\sigma^2 - \sigma^2$$

Hence,

$$E(S^2) = \sigma^2 - \frac{\sigma^2}{n} = \sigma^2\left(\frac{n-1}{n}\right)$$

Thus, $S^2$ is a biased estimator of $\sigma^2$. Hence

$$b(\sigma^2) = \sigma^2 - \frac{\sigma^2}{n} - \sigma^2 = -\frac{\sigma^2}{n}$$

Further, $\frac{nS^2}{n-1}$ is an unbiased estimator of $\sigma^2$.
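The bias of $S^2$ and the effect of the $\frac{n}{n-1}$ correction can be seen in a short simulation; this Python sketch uses arbitrary values $\mu = 1$, $\sigma = 2$, $n = 5$:

```python
import random

# nS^2 = sum (X_i - Xbar)^2, so E(S^2) = sigma^2 (n-1)/n < sigma^2,
# while n S^2 / (n-1) is unbiased.
random.seed(0)
mu, sigma, n, reps = 1.0, 2.0, 5, 40000
s2s = []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2s.append(sum((x - xbar) ** 2 for x in xs) / n)   # S^2, dividing by n

mean_s2 = sum(s2s) / reps
print(round(mean_s2, 1))                  # ≈ sigma^2 (n-1)/n = 3.2
print(round(mean_s2 * n / (n - 1), 1))    # ≈ sigma^2 = 4.0
```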
Example 2.1.3 Further, if $(n-1)S^2 = \sum_{i=1}^{n}(X_i - \bar{X})^2$, then $\frac{(n-1)S^2}{\sigma^2}$ has a $\chi^2$ distribution with $(n-1)$ df. Here, we examine whether $S$ is an unbiased estimator of $\sigma$. Let

$$\frac{(n-1)S^2}{\sigma^2} = w$$

Then

$$E(w^{\frac{1}{2}}) = \int_0^{\infty} w^{\frac{1}{2}}\, \frac{e^{-\frac{w}{2}}\, w^{\frac{n-1}{2}-1}}{\Gamma\left(\frac{n-1}{2}\right) 2^{\frac{n-1}{2}}}\, dw = \frac{\Gamma\left(\frac{n}{2}\right) 2^{\frac{n}{2}}}{\Gamma\left(\frac{n-1}{2}\right) 2^{\frac{n-1}{2}}} = \frac{\Gamma\left(\frac{n}{2}\right) 2^{\frac{1}{2}}}{\Gamma\left(\frac{n-1}{2}\right)}$$

$$E\left[\frac{(n-1)^{\frac{1}{2}}\, S}{\sigma}\right] = \frac{2^{\frac{1}{2}}\, \Gamma\left(\frac{n}{2}\right)}{\Gamma\left(\frac{n-1}{2}\right)}$$

Hence

$$E(S) = \frac{2^{\frac{1}{2}}\, \Gamma\left(\frac{n}{2}\right)}{\Gamma\left(\frac{n-1}{2}\right)}\, \frac{\sigma}{(n-1)^{\frac{1}{2}}} = \sigma\left(\frac{2}{n-1}\right)^{\frac{1}{2}} \frac{\Gamma\left(\frac{n}{2}\right)}{\Gamma\left(\frac{n-1}{2}\right)}$$

Therefore,

$$E\left(\frac{S}{\sigma}\right) = \left(\frac{2}{n-1}\right)^{\frac{1}{2}} \frac{\Gamma\left(\frac{n}{2}\right)}{\Gamma\left(\frac{n-1}{2}\right)}$$
Therefore,

$$\mathrm{Bias}(S) = \sigma\left[\left(\frac{2}{n-1}\right)^{\frac{1}{2}} \frac{\Gamma\left(\frac{n}{2}\right)}{\Gamma\left(\frac{n-1}{2}\right)} - 1\right]$$
Example 2.1.4 For the family (1.5.4), $p$ is U-estimable and $\lambda$ is not U-estimable. For $(p, \lambda)$, it can easily be seen that $\hat{p} = \frac{r}{n}$ and $E\hat{p} = p$. Next, we will show that $\lambda$ is not U-estimable.

Suppose there exists a function $h(r, z)$ such that

$$Eh(r, z) = \lambda \quad \forall\, (p, \lambda).$$

Since $E\,E[h(r, z)\,|\,r] = \lambda$, we get

$$\sum_{r=1}^{n}\binom{n}{r} p^r q^{n-r} \int_0^{\infty} \frac{h(r, z)\, e^{-\frac{z}{\lambda}}\, z^{r-1}}{\lambda^r\, \Gamma(r)}\, dz + q^n h(0, 0) = \lambda$$

Substituting $\frac{p}{q} = \theta$ and dividing by $q^n$ on both sides,

$$\sum_{r=1}^{n} \theta^r \binom{n}{r} \int_0^{\infty} \frac{h(r, z)\, e^{-\frac{z}{\lambda}}\, z^{r-1}}{\lambda^r\, \Gamma(r)}\, dz + h(0, 0) = \lambda(1+\theta)^n, \quad \text{since } q = (1+\theta)^{-1}.$$

Comparing the coefficients of $\theta^0$ on both sides, we get $h(0, 0) = \lambda$, which is a contradiction, since $h(0,0)$ is a constant that cannot depend on $\lambda$. Hence, there does not exist any unbiased estimator of $\lambda$. Thus $\lambda$ is not U-estimable.
Example 2.1.5 Let $X$ be $N(0, \sigma^2)$ and assume that we have one observation. What is an unbiased estimator of $\sigma^2$?

$$E(X) = 0, \qquad V(X) = EX^2 - (EX)^2 = \sigma^2$$
Therefore,

$$E(X^2) = \sigma^2.$$

Hence $X^2$ is an unbiased estimator of $\sigma^2$.
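A one-line simulation confirms this; the Python sketch below uses an arbitrary $\sigma = 1.5$:

```python
import random

# With a single observation X ~ N(0, sigma^2), X^2 is unbiased for sigma^2:
# the average of X^2 over many independent draws approaches sigma^2.
random.seed(0)
sigma, reps = 1.5, 50000
m = sum(random.gauss(0.0, sigma) ** 2 for _ in range(reps)) / reps
print(round(m, 2))   # ≈ sigma^2 = 2.25
```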
Example 2.1.6 Sometimes an unbiased estimator may be absurd.

Let the rv $X$ be $P(\lambda)$ and suppose we want to estimate $\psi(\lambda)$, where

$$\psi(\lambda) = \exp[-k\lambda]; \quad k > 0.$$

Let $T(X) = [-(k-1)]^X$, $k > 1$. Then

$$E[T(X)] = \sum_{x=0}^{\infty} [-(k-1)]^x\, \frac{e^{-\lambda} \lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{[-(k-1)\lambda]^x}{x!} = e^{-\lambda}\, e^{-(k-1)\lambda} = e^{-k\lambda},$$

so $T(X)$ is unbiased for $\psi(\lambda)$. But

$$T(x) = \begin{cases} [-(k-1)]^x > 0; & x \text{ is even and } k > 1 \\ [-(k-1)]^x < 0; & x \text{ is odd and } k > 1 \end{cases}$$

which is absurd since $\psi(\lambda)$ is always positive.
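Both the unbiasedness and the absurdity show up in simulation; this Python sketch uses arbitrary $k = 1.5$ and $\lambda = 1$ (a small inversion sampler stands in for a Poisson generator, since the Python stdlib has none):

```python
import math
import random

# T(X) = (-(k-1))**X is unbiased for exp(-k*lam) yet takes negative values.
random.seed(0)
k, lam, reps = 1.5, 1.0, 200000

def rpois(l):
    # Poisson sampler by cdf inversion (adequate for small l)
    x, p, u = 0, math.exp(-l), random.random()
    c = p
    while u > c:
        x += 1
        p *= l / x
        c += p
    return x

ts = [(-(k - 1)) ** rpois(lam) for _ in range(reps)]
print(round(sum(ts) / reps, 2))   # ≈ exp(-k*lam) = exp(-1.5) ≈ 0.22
print(min(ts) < 0)                # True: the "estimator" goes negative
```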
Example 2.1.7 An unbiased estimator need not be unique.

Let the rvs $X_1$ and $X_2$ be $N(\mu, 1)$. Then $X_1$, $X_2$, and $\alpha X_1 + (1-\alpha)X_2$, $0 \le \alpha \le 1$, are all unbiased estimators of $\mu$.

Example 2.1.8 Let $X_1, X_2, \ldots, X_n$ be iid rvs from a Cauchy distribution with parameter $\mu$. Find an unbiased estimator of $\mu$.
Let

$$f(x|\mu) = \frac{1}{\pi[1 + (x - \mu)^2]}; \quad -\infty < x < \infty,\ -\infty < \mu < \infty$$

$$F(x|\mu) = \int_{-\infty}^{x} \frac{dy}{\pi[1 + (y - \mu)^2]} = \frac{1}{2} + \frac{1}{\pi}\tan^{-1}(x - \mu)$$

Let $g(x_{(r)})$ be the pdf of $X_{(r)}$, where $X_{(r)}$ is the $r$th order statistic.
$$g(x_{(r)}) = \frac{n!}{(n-r)!(r-1)!}\, f(x_{(r)})\, [F(x_{(r)})]^{r-1}\, [1 - F(x_{(r)})]^{n-r}$$

$$= \frac{n!}{(n-r)!(r-1)!} \left[\frac{1}{\pi[1 + (x_{(r)} - \mu)^2]}\right] \left[\frac{1}{2} + \frac{1}{\pi}\tan^{-1}(x_{(r)} - \mu)\right]^{r-1} \left[\frac{1}{2} - \frac{1}{\pi}\tan^{-1}(x_{(r)} - \mu)\right]^{n-r}$$

$$E(X_{(r)} - \mu) = \frac{n!}{(n-r)!(r-1)!} \int_{-\infty}^{\infty} \frac{x_{(r)} - \mu}{\pi[1 + (x_{(r)} - \mu)^2]} \left[\frac{1}{2} + \frac{1}{\pi}\tan^{-1}(x_{(r)} - \mu)\right]^{r-1} \left[\frac{1}{2} - \frac{1}{\pi}\tan^{-1}(x_{(r)} - \mu)\right]^{n-r} dx_{(r)}$$

Let $(x_{(r)} - \mu) = y$:

$$E(X_{(r)} - \mu) = C_{r,n} \int_{-\infty}^{\infty} \frac{y}{\pi(1 + y^2)} \left[\frac{1}{2} + \frac{1}{\pi}\tan^{-1} y\right]^{r-1} \left[\frac{1}{2} - \frac{1}{\pi}\tan^{-1} y\right]^{n-r} dy,$$

where $C_{r,n} = \frac{n!}{(n-r)!(r-1)!}$. Let

$$u = \frac{1}{2} + \frac{1}{\pi}\tan^{-1} y \;\Rightarrow\; u - \frac{1}{2} = \frac{1}{\pi}\tan^{-1} y \;\Rightarrow\; \pi\left(u - \frac{1}{2}\right) = \tan^{-1} y$$

$$\Rightarrow\; y = \tan\left[\pi\left(u - \frac{1}{2}\right)\right] \;\Rightarrow\; y = -\cot \pi u$$

$$dy = \pi\left[\frac{(\cos \pi u)(\cos \pi u)}{\sin^2 \pi u} + \frac{\sin \pi u}{\sin \pi u}\right] du = \pi[\cot^2 \pi u + 1]\, du = \pi[y^2 + 1]\, du$$

Then

$$E(X_{(r)} - \mu) = -\frac{n!}{(n-r)!(r-1)!} \int_0^1 u^{r-1}(1-u)^{n-r} \cot \pi u\, du = -C_{r,n} \int_0^1 u^{r-1}(1-u)^{n-r} \cot \pi u\, du$$
Replacing $r$ by $n - r + 1$ (note that $C_{n-r+1,n} = C_{r,n}$),

$$E(X_{(n-r+1)} - \mu) = -\frac{n!}{(n-r)!(r-1)!} \int_0^1 u^{n-r}(1-u)^{r-1} \cot \pi u\, du$$

Let $1 - u = w$:

$$= -\frac{n!}{(n-r)!(r-1)!} \int_0^1 \cot[\pi(1-w)]\, (1-w)^{n-r} w^{r-1}\, dw$$

$$= \frac{n!}{(n-r)!(r-1)!} \int_0^1 \cot(\pi w)\, (1-w)^{n-r} w^{r-1}\, dw,$$

since $\cot[\pi(1-w)] = -\cot(\pi w)$. Now

$$\int_0^1 u^{r-1}(1-u)^{n-r} \cot \pi u\, du = \int_0^1 \cot(\pi w)\, (1-w)^{n-r} w^{r-1}\, dw,$$

so the two expectations cancel:

$$E[(X_{(r)} - \mu) + (X_{(n-r+1)} - \mu)] = 0$$

$$E[X_{(r)} + X_{(n-r+1)}] = 2\mu$$

$$\hat{\mu} = \frac{X_{(r)} + X_{(n-r+1)}}{2}$$

Therefore, $\frac{X_{(r)} + X_{(n-r+1)}}{2}$ is an unbiased estimator of $\mu$.

Note: The moments of the Cauchy distribution do not exist, but we still obtain an unbiased estimator of $\mu$ (the expectations of the individual order statistics exist for $2 \le r \le n-1$).
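This can be verified by simulation; the Python sketch below uses arbitrary $\mu = 3$, $n = 7$, $r = 3$, generating Cauchy variates by inverting $F$, i.e. $X = \mu + \tan(\pi(U - \frac{1}{2}))$ for $U \sim$ Uniform$(0,1)$:

```python
import math
import random

# (X_(r) + X_(n-r+1))/2 is unbiased for mu even though the Cauchy mean
# does not exist; with r = 3, n = 7 the estimator is (X_(3) + X_(5))/2.
random.seed(0)
mu, n, r, reps = 3.0, 7, 3, 40000
ests = []
for _ in range(reps):
    xs = sorted(mu + math.tan(math.pi * (random.random() - 0.5)) for _ in range(n))
    ests.append((xs[r - 1] + xs[n - r]) / 2)   # X_(r) and X_(n-r+1), 0-indexed
print(round(sum(ests) / reps, 1))   # ≈ mu = 3.0
```

(The sample mean, by contrast, is itself Cauchy-distributed and never settles down.)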
Example 2.1.9 Let $X$ be a rv with $B(1, p)$. We examine whether $p^2$ is U-estimable.

Let $T(x)$ be an unbiased estimator of $p^2$:

$$\sum_{x=0}^{1} T(x)\, p^x (1-p)^{1-x} = p^2$$

$$T(0)(1-p) + T(1)p = p^2$$
$$p[T(1) - T(0)] + T(0) = p^2$$

The left-hand side is a polynomial of degree at most one in $p$, so no choice of $T(0)$ and $T(1)$ can match the coefficient of $p^2$. Hence, an unbiased estimator of $p^2$ does not exist.
Empirical Distribution Function

Let $X_1, X_2, \ldots, X_n$ be a random sample from a continuous population with df $F$ and pdf $f$. Then the order statistics $X_{(1)} \le X_{(2)} \le \cdots \le X_{(n)}$ are sufficient. Define $\hat{F}(x) = \frac{\text{number of } X_i\text{'s} \le x}{n}$; the same thing can be written in terms of order statistics as

$$\hat{F}(x) = \begin{cases} 0; & x < X_{(1)} \\ \frac{k}{n}; & X_{(k)} \le x < X_{(k+1)} \\ 1; & x \ge X_{(n)} \end{cases} \;=\; \frac{1}{n}\sum_{j=1}^{n} I(x - X_{(j)})$$

where

$$I(y) = \begin{cases} 1; & y \ge 0 \\ 0; & \text{otherwise} \end{cases}$$
Example 2.1.10 Show that the empirical distribution function is an unbiased estimator of $F(x)$.

$$\hat{F}(x) = \frac{1}{n}\sum_{j=1}^{n} I(x - X_{(j)})$$

$$E\hat{F}(x) = \frac{1}{n}\sum_{j=1}^{n} P[X_{(j)} \le x] = \frac{1}{n}\sum_{j=1}^{n} \sum_{k=j}^{n} \binom{n}{k} [F(x)]^k [1 - F(x)]^{n-k} \quad \text{(see Eq. 20 in Prerequisite)}$$

Interchanging the order of summation,

$$= \frac{1}{n}\sum_{k=1}^{n} \sum_{j=1}^{k} \binom{n}{k} [F(x)]^k [1 - F(x)]^{n-k}$$
$$= \frac{1}{n}\sum_{k=1}^{n} \binom{n}{k} [F(x)]^k [1 - F(x)]^{n-k} \sum_{j=1}^{k} (1)$$

$$= \frac{1}{n}\sum_{k=1}^{n} k \binom{n}{k} [F(x)]^k [1 - F(x)]^{n-k}$$

$$= \frac{1}{n}[nF(x)] = F(x)$$
Note: One can see that $I(x - X_{(j)})$ is a Bernoulli random variable with $EI(x - X_{(j)}) = F(x)$, so that $E\hat{F}(x) = F(x)$ directly. Moreover, $n\hat{F}(x)$ has a binomial distribution, so $\hat{F}(x)$ has mean $F(x)$ and variance $\frac{F(x)[1 - F(x)]}{n}$. Using the central limit theorem for iid rvs, we can show that as $n \to \infty$,

$$\sqrt{n}\left[\frac{\hat{F}(x) - F(x)}{\sqrt{F(x)[1 - F(x)]}}\right] \to N(0, 1).$$
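The mean and variance of $\hat{F}(x)$ are easy to check by simulation; the Python sketch below samples from Uniform$(0, 1)$ (so that $F(x) = x$) and evaluates $\hat{F}$ at the arbitrary point $x = 0.3$:

```python
import random

# n*Fhat(x) ~ Binomial(n, F(x)): Fhat(x) has mean F(x) and
# variance F(x)(1 - F(x))/n. Uniform(0,1) sample, evaluated at x = 0.3.
random.seed(0)
n, reps, x = 20, 40000, 0.3
fhat = [sum(random.random() <= x for _ in range(n)) / n for _ in range(reps)]

m = sum(fhat) / reps
v = sum((f - m) ** 2 for f in fhat) / reps
print(round(m, 2))    # ≈ F(x) = 0.3
print(round(v, 4))    # ≈ 0.3 * 0.7 / 20 = 0.0105
```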
2.2 Unbiasedness and Sufficiency