Student's Manual to Accompany
Introduction to Probability Models, Tenth Edition

Sheldon M. Ross
University of Southern California
Los Angeles, CA

AMSTERDAM • BOSTON • HEIDELBERG • LONDON • NEW YORK • OXFORD • PARIS • SAN DIEGO • SAN FRANCISCO • SINGAPORE • SYDNEY • TOKYO

Academic Press is an imprint of Elsevier
30 Corporate Drive, Suite 400, Burlington, MA 01803, USA
525 B Street, Suite 1900, San Diego, California 92101-4495, USA
Elsevier, The Boulevard, Langford Lane, Kidlington, Oxford, OX5 1GB, UK

Copyright © 2010 Elsevier Inc. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher's permissions policies and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.

This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).

Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.

Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.

To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

ISBN: 978-0-12-381446-3

For information on all Academic Press publications visit our Web site at www.elsevierdirect.com

Typeset by: diacriTech, India

Contents
Chapter 1 ......... 4
Chapter 2 ......... 7
Chapter 3 ......... 12
Chapter 4 ......... 20
Chapter 5 ......... 26
Chapter 6 ......... 34
Chapter 7 ......... 39
Chapter 8 ......... 44
Chapter 9 ......... 51
Chapter 10 ......... 54
Chapter 11 ......... 57

Answers and Solutions

Chapter 1

1. S = {(R, R), (R, G), (R, B), (G, R), (G, G), (G, B), (B, R), (B, G), (B, B)}. The probability of each point in S is 1/9.
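The nine-point sample space in Problem 1 can be enumerated mechanically; a minimal sketch, assuming only that the two draws are independent and uniform over the three colors:

```python
from itertools import product

colors = ["R", "G", "B"]
# Two ordered draws, each equally likely to be R, G, or B.
S = list(product(colors, repeat=2))
# Under independence, every ordered pair has probability (1/3)*(1/3) = 1/9.
probs = {outcome: (1.0 / 3) * (1.0 / 3) for outcome in S}
```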
3. S = {(e_1, ..., e_n), n ≥ 2}, where e_i ∈ {heads, tails}. In addition, e_n = e_{n−1} = heads and, for i = 1, ..., n − 2, if e_i = heads, then e_{i+1} = tails.

P{4 tosses} = P{(t, t, h, h)} + P{(h, t, h, h)} = 2(1/2)^4 = 1/8

5. 3/4. If he wins, he only wins $1, while if he loses, he loses $3.

7. If (E ∪ F)^c occurs, then E ∪ F does not occur, and so E does not occur (and so E^c does); F does not occur (and so F^c does); thus E^c and F^c both occur. Hence,

(E ∪ F)^c ⊂ E^c F^c

If E^c F^c occurs, then E^c occurs (and so E does not), and F^c occurs (and so F does not). Hence, neither E nor F occurs and thus (E ∪ F)^c does. Thus,

E^c F^c ⊂ (E ∪ F)^c

and the result follows.

9. F = E ∪ FE^c, implying, since E and FE^c are disjoint, that P(F) = P(E) + P(FE^c).

11. P{sum is i} = (i − 1)/36 for i = 2, ..., 7, and (13 − i)/36 for i = 8, ..., 12.

13. Condition on the initial toss:

P{win} = Σ_{i=2}^{12} P{win | throw i} P{throw i}

Now,

P{win | throw i} = P{i before 7}
= 0 for i = 2, 3, 12
= (i − 1)/(i + 5) for i = 4, 5, 6
= 1 for i = 7, 11
= (13 − i)/(19 − i) for i = 8, 9, 10

where the above is obtained by using Problems 11 and 12.

P{win} ≈ .49.
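Combining the conditional probabilities in Problem 13 with the dice distribution from Problem 11 gives the unconditional win probability exactly; a quick sketch with exact rational arithmetic (the value 244/495 is computed here, not quoted from the manual):

```python
from fractions import Fraction

# P{initial throw = i} for two fair dice: (6 - |i - 7|)/36.
p_throw = {i: Fraction(6 - abs(i - 7), 36) for i in range(2, 13)}

def p_win_given(i):
    # Conditional win probabilities as in Problem 13.
    if i in (2, 3, 12):
        return Fraction(0)               # immediate loss
    if i in (7, 11):
        return Fraction(1)               # immediate win
    if i in (4, 5, 6):
        return Fraction(i - 1, i + 5)    # P{i before 7}
    return Fraction(13 - i, 19 - i)      # i = 8, 9, 10

p_win = sum(p_win_given(i) * p_throw[i] for i in range(2, 13))
```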
17. P{end} = 1 − P{continue}
= 1 − P({H, H, H} ∪ {T, T, T})
= 1 − [P(H, H, H) + P(T, T, T)]

Fair coin: P{end} = 1 − [(1/2)(1/2)(1/2) + (1/2)(1/2)(1/2)] = 3/4

Biased coin: P{end} = 1 − [(1/4)(1/4)(1/4) + (3/4)(3/4)(3/4)] = 9/16

19. E = event at least 1 six.

P(E) = number of ways to get E / number of sample points = 11/36

D = event two faces are different.

P(D) = 1 − P(two faces the same) = 1 − 6/36 = 5/6

P(E|D) = P(ED)/P(D) = (10/36)/(5/6) = 1/3

21. Let C = event person is color blind.

P(Male|C) = P(C|Male)P(Male) / [P(C|Male)P(Male) + P(C|Female)P(Female)]
= (.05 × .5) / (.05 × .5 + .0025 × .5)
= 2500/2625 = 20/21

23. P(E_1)P(E_2|E_1)P(E_3|E_1E_2) ⋯ P(E_n|E_1 ⋯ E_{n−1})
= P(E_1) · [P(E_1E_2)/P(E_1)] · [P(E_1E_2E_3)/P(E_1E_2)] ⋯ [P(E_1 ⋯ E_n)/P(E_1 ⋯ E_{n−1})]
= P(E_1 ⋯ E_n)

25. (a) P{pair} = P{second card is same denomination as first} = 3/51

(b) P{pair | different suits} = P{pair, different suits}/P{different suits}
= P{pair}/P{different suits}
= (3/51)/(39/51) = 1/13

27. P(E_1) = 1.

P(E_2|E_1) = 39/51, since 12 cards are in the ace of spades pile and 39 are not.

P(E_3|E_1E_2) = 26/50, since 24 cards are in the piles of the two aces and 26 are in the other two piles.

P(E_4|E_1E_2E_3) = 13/49.

So

P{each pile has an ace} = (39/51)(26/50)(13/49)

29. (a) P(E|F) = 0

(b) P(E|F) = P(EF)/P(F) = P(E)/P(F) ≥ P(E) = .6

(c) P(E|F) = P(EF)/P(F) = P(F)/P(F) = 1
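Problem 21's posterior is a one-line Bayes computation; a sketch with exact rationals (the .05 and .0025 color-blindness rates are the ones used in the solution above):

```python
from fractions import Fraction

# Prior: half the population male, half female (as in Problem 21).
p_male, p_female = Fraction(1, 2), Fraction(1, 2)
# Color-blindness rates by sex.
p_cb_male, p_cb_female = Fraction(5, 100), Fraction(25, 10000)

# Bayes' rule: P(Male | color blind).
posterior_male = (p_cb_male * p_male) / (
    p_cb_male * p_male + p_cb_female * p_female
)
```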
31. Let S = event the sum of the dice is 7; F = event the first die is 6.

P(S) = 1/6, P(FS) = 1/36

P(F|S) = P(FS)/P(S) = (1/36)/(1/6) = 1/6

33. Let S = event student is a sophomore; F = event student is a freshman; B = event student is a boy; G = event student is a girl. Let x = number of sophomore girls; the total number of students is 16 + x.

P(F) = 10/(16 + x), P(B) = 10/(16 + x), P(FB) = 4/(16 + x)

4/(16 + x) = P(FB) = P(F)P(B) = [10/(16 + x)][10/(16 + x)] ⟹ x = 9

35. (a) 1/16

(b) 1/16

(c) 15/16, since the only way in which the pattern H, H, H, H can appear before the pattern T, H, H, H is if the first four flips all land heads.

37. Let W = event the marble is white.

P(B_1|W) = P(W|B_1)P(B_1) / [P(W|B_1)P(B_1) + P(W|B_2)P(B_2)]
= (1/2)(1/2) / [(1/2)(1/2) + (1/3)(1/2)]
= (1/4)/(5/12) = 3/5

39. Let W = event the woman resigns; A, B, C are the events that the person resigning works in store A, B, C, respectively.

P(C|W) = P(W|C)P(C) / [P(W|C)P(C) + P(W|B)P(B) + P(W|A)P(A)]
= (.70)(100/225) / [(.70)(100/225) + (.60)(75/225) + (.50)(50/225)]
= (70/225)/(140/225) = 1/2

41. Note first that since the rat has black parents and a brown sibling, we know that both its parents are hybrids with one black and one brown gene (for if either were a pure black then all their offspring would be black). Hence, both of their offspring's genes are equally likely to be either black or brown.

(a) P(2 black genes | at least one black gene)
= P(2 black genes)/P(at least one black gene)
= (1/4)/(3/4) = 1/3

(b) Using the result from part (a) yields the following:

P(2 black genes | 5 black offspring)
= (1/3)(1) / [(1/3)(1) + (1/2)^5 (2/3)]
= 16/17

where P(5 black offspring) was computed by conditioning on whether the rat had 2 black genes.

43. Let i = event that coin i was selected; P(H|i) = i/10.

P(5|H) = P(H|5)P(5) / Σ_{i=1}^{10} P(H|i)P(i)
= (5/10)(1/10) / Σ_{i=1}^{10} (i/10)(1/10)
= 5 / Σ_{i=1}^{10} i
= 5/55 = 1/11

45. Let B_i = event the ith ball is black; R_i = event the ith ball is red.

P(B_1|R_2) = P(R_2|B_1)P(B_1) / [P(R_2|B_1)P(B_1) + P(R_2|R_1)P(R_1)]
= [r/(b + r + c)][b/(b + r)] / {[r/(b + r + c)][b/(b + r)] + [(r + c)/(b + r + c)][r/(b + r)]}
= rb / [rb + (r + c)r]
= b/(b + r + c)

47. 1. 0 ≤ P(A|B) ≤ 1

2. P(S|B) = P(SB)/P(B) = P(B)/P(B) = 1

3. For disjoint events A and D,

P(A ∪ D|B) = P((A ∪ D)B)/P(B)
= P(AB ∪ DB)/P(B)
= [P(AB) + P(DB)]/P(B)
= P(A|B) + P(D|B)

Direct verification is as follows:

P(A|BC)P(C|B) + P(A|BC^c)P(C^c|B)
= [P(ABC)/P(BC)][P(BC)/P(B)] + [P(ABC^c)/P(BC^c)][P(BC^c)/P(B)]
= P(ABC)/P(B) + P(ABC^c)/P(B)
= P(AB)/P(B)
= P(A|B)
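The 16/17 posterior in Problem 41(b) can be double-checked with exact arithmetic; a sketch (prior 1/3 from part (a); as in the solution, a hybrid's offspring is black with probability 1/2 while a two-black-gene rat's offspring is always black):

```python
from fractions import Fraction

prior_bb = Fraction(1, 3)           # P(2 black genes), from part (a)
prior_hybrid = Fraction(2, 3)

like_bb = Fraction(1)               # two black genes: all 5 offspring black
like_hybrid = Fraction(1, 2) ** 5   # hybrid: each of 5 offspring black w.p. 1/2

posterior = (prior_bb * like_bb) / (
    prior_bb * like_bb + prior_hybrid * like_hybrid
)
```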
Chapter 2

1. P{X = 0} = C(7,2)/C(10,2) = 14/30

3. P{X = −2} = 1/4 = P{X = 2}, P{X = 0} = 1/2

5. P{max = 6} = 11/36 = P{min = 1}
P{max = 5} = 1/4 = P{min = 2}
P{max = 4} = 7/36 = P{min = 3}
P{max = 3} = 5/36 = P{min = 4}
P{max = 2} = 1/12 = P{min = 5}
P{max = 1} = 1/36 = P{min = 6}

7. p(0) = (.3)^3 = .027
p(1) = 3(.3)^2(.7) = .189
p(2) = 3(.3)(.7)^2 = .441
p(3) = (.7)^3 = .343

9. p(0) = 1/2, p(1) = 1/10, p(2) = 1/5, p(3) = 1/10, p(3.5) = 1/10

11. 3/8

13. Σ_{i=7}^{10} C(10,i)(1/2)^{10}

15. P{X = k}/P{X = k − 1}
= {n!/[(n − k)! k!]} p^k (1 − p)^{n−k} / {n!/[(n − k + 1)!(k − 1)!]} p^{k−1}(1 − p)^{n−k+1}
= [(n − k + 1)/k] · p/(1 − p)

Hence,

P{X = k}/P{X = k − 1} ≥ 1 ⟺ (n − k + 1)p ≥ k(1 − p) ⟺ k ≤ (n + 1)p

The result follows.

17. Follows since there are n!/(x_1! ⋯ x_r!) permutations of n objects of which x_1 are alike, x_2 are alike, ..., x_r are alike.

19. P{X_1 + ⋯ + X_k = m} = C(n,m)(p_1 + ⋯ + p_k)^m (p_{k+1} + ⋯ + p_r)^{n−m}

21. 1 − (3/10)^5 − 5(3/10)^4(7/10) − C(5,2)(3/10)^3(7/10)^2

23. In order for X to equal n, the first n − 1 flips must have r − 1 heads, and then the nth flip must land heads. By independence the desired probability is thus

C(n − 1, r − 1) p^{r−1}(1 − p)^{n−r} × p

25. A total of 7 games will be played if the first 6 result in 3 wins and 3 losses. Thus,

P{7 games} = C(6,3) p^3 (1 − p)^3

Differentiation yields

d/dp P{7} = 20[3p^2(1 − p)^3 − p^3 · 3(1 − p)^2] = 60p^2(1 − p)^2(1 − 2p)

Thus, the derivative is zero when p = 1/2. Taking the second derivative shows that the maximum is attained at this value.

27. P{same number of heads} = Σ_i P{A = i, B = i}
= Σ_i C(k,i)(1/2)^k C(n − k, i)(1/2)^{n−k}
= Σ_i C(k,i) C(n − k, i)(1/2)^n
= Σ_i C(k, k − i) C(n − k, i)(1/2)^n
= C(n,k)(1/2)^n

Another argument is as follows:

P{# heads of A = # heads of B}
= P{# tails of A = # heads of B} (since the coin is fair)
= P{k − # heads of A = # heads of B}
= P{k = total # heads}

29. Each flip after the first will, independently, result in a changeover with probability 1/2. Therefore,

P{k changeovers} = C(n − 1, k)(1/2)^{n−1}

33. c ∫_{−1}^{1} (1 − x^2) dx = 1 ⟹ c[x − x^3/3]_{−1}^{1} = 1 ⟹ c = 3/4

F(y) = (3/4) ∫_{−1}^{y} (1 − x^2) dx = (3/4)(y − y^3/3 + 2/3), −1 < y < 1

35. P{X > 20} = ∫_{20}^{∞} (10/x^2) dx = 1/2

37. P{M ≤ x} = P{max(X_1, ..., X_n) ≤ x} = P{X_1 ≤ x, ..., X_n ≤ x} = Π_{i=1}^{n} P{X_i ≤ x} = x^n

f_M(x) = (d/dx) P{M ≤ x} = n x^{n−1}

39. E[X] = 3/16
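The combinatorial identity behind Problem 27 (a Vandermonde-type convolution, after replacing C(k,i) by C(k, k−i)) is easy to spot-check; a sketch using only the standard library:

```python
from math import comb

# Sum_i C(k,i) * C(n-k,i) should equal C(n,k), as derived in Problem 27.
def same_heads_identity(n, k):
    return sum(comb(k, i) * comb(n - k, i) for i in range(k + 1)) == comb(n, k)

checks = [same_heads_identity(n, k) for n in range(1, 12) for k in range(n + 1)]
```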
equal 1 if a changeover results from the ithip and let it be 0
otherwise. Thennumber of changeovers =ni=2XiAs,E[Xi] =P{Xi = 1} =
P{ip i 1 = ip i}=2p(1 p)we see thatE[number of changeovers]
=ni=2E[Xi]=2(n 1)p(1 p)43. (a) X =ni=1Xi(b) E[Xi] = P{Xi=1}= P{red
ball i is chosen before all nblack balls}= 1/(n + 1) since each of
these n + 1balls is equally likely to be theone chosen
earliestTherefore,E[X] =ni=1E[Xi] = n/(n + 1)45. Let Ni denote the
number of keys in box i,i = 1, , k. Then, with X equal to the
numberof collisions we have that X =ki=1(Ni 1)+ =ki=1(Ni1 + I{Ni =
0}) where I{Ni = 0} is equalto 1 if Ni = 0 and is equal to 0
otherwise. Hence,Answers and Solutions 9E[X] =ki=1(rpi1 + (1 pi)r)
= r k+ki=1(1 pi)rAnother waytosolve this problemis tolet Ydenotethe
number of boxes having at least one key, andthen use the identity X
= r Y, which is true sinceonly the rst key put in each box does not
result ina collision. Writing Y =ki=1I{Ni > 0} and
takingexpectations yieldsE[X] = r E[Y] = r ki=1[1 (1 pi)r]= r k
+ki=1(1 pi)r47. Let Xi be 1 if trial i is a success and 0
otherwise.(a) The largest value is .6. If X1 = X2 = X3, then1.8 =
E[X] = 3E[X1] = 3P{X1 = 1}and soP{X = 3} = P{X1 = 1} = .6That this
is thelargest valueis seenbyMarkovsinequality, which yieldsP{X 3}
E[X]/3 = .6(b) The smallest value is 0. To construct a probabil-ity
scenario for which P{X = 3} = 0 let U be auniform random variable
on (0, 1), and deneX1 = 1 if U .60 otherwiseX2 = 1 if U .40
otherwiseX3 = 1 if either U .3 or U .70 otherwiseIt is easy to see
thatP{X1 = X2 = X3 = 1} = 049. E[X2] (E[X])2=Var(X) = E(X
E[X])20.Equality when Var(X) = 0, that is, when X isconstant.51. N
=ri=1Xj where Xi is the number of ips betweenthe (i 1)stand
ithhead. Hence, Xi is geometricwith mean 1/p. Thus,E[N] =ri=1E[Xi]
= rp53. 1n + 1, 12n + 1 _ 1n + 1_2.55. (a) P(Y = j)
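Problem 45's closed form E[X] = r − k + Σ_i (1 − p_i)^r can be verified by brute-force enumeration of a small case; a sketch (the particular r, k, and p_i values are illustrative, not from the manual):

```python
from itertools import product

# r keys dropped independently into k boxes with probabilities p.
p = [0.5, 0.3, 0.2]
r, k = 4, len(p)

# Exact expected number of collisions by enumerating all k^r assignments.
expected = 0.0
for assignment in product(range(k), repeat=r):
    weight = 1.0
    for box in assignment:
        weight *= p[box]
    collisions = r - len(set(assignment))  # X = r - (# occupied boxes)
    expected += weight * collisions

closed_form = r - k + sum((1 - pi) ** r for pi in p)
```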
55. (a) P(Y = j) = Σ_{i=0}^{j} C(j,i) e^{−2λ} λ^j / j!
= e^{−2λ} (λ^j/j!) Σ_{i=0}^{j} C(j,i) 1^i 1^{j−i}
= e^{−2λ}(2λ)^j / j!

(b) P(X = i) = Σ_{j=i}^{∞} C(j,i) e^{−2λ} λ^j / j!
= (1/i!) e^{−2λ} Σ_{j=i}^{∞} λ^j/(j − i)!
= (λ^i/i!) e^{−2λ} Σ_{k=0}^{∞} λ^k/k!
= e^{−λ} λ^i / i!

(c) P(X = i, Y − X = k) = P(X = i, Y = k + i)
= C(k + i, i) e^{−2λ} λ^{k+i}/(k + i)!
= (e^{−λ} λ^i/i!)(e^{−λ} λ^k/k!)

showing that X and Y − X are independent Poisson random variables with mean λ. Hence,

P(Y − X = k) = e^{−λ} λ^k / k!

57. It is the number of successes in n + m independent p-trials.

59. (a) Use the fact that F(X_i) is a uniform (0, 1) random variable to obtain

p = P{F(X_1) < F(X_2) > F(X_3) < F(X_4)} = P{U_1 < U_2 > U_3 < U_4}

where the U_i, i = 1, 2, 3, 4, are independent uniform (0, 1) random variables.

(b) p = ∫_0^1 ∫_{x_1}^1 ∫_0^{x_2} ∫_{x_3}^1 dx_4 dx_3 dx_2 dx_1
= ∫_0^1 ∫_{x_1}^1 ∫_0^{x_2} (1 − x_3) dx_3 dx_2 dx_1
= ∫_0^1 ∫_{x_1}^1 (x_2 − x_2^2/2) dx_2 dx_1
= ∫_0^1 (1/3 − x_1^2/2 + x_1^3/6) dx_1
= 1/3 − 1/6 + 1/24 = 5/24

(c) There are 5 (of the 24 possible) orderings such that X_1 < X_2 > X_3 < X_4. They are as follows:

X_2 > X_4 > X_3 > X_1
X_2 > X_4 > X_1 > X_3
X_2 > X_1 > X_4 > X_3
X_4 > X_2 > X_3 > X_1
X_4 > X_2 > X_1 > X_3

61. (a) f_X(x) = ∫_x^∞ λ^2 e^{−λy} dy = λ e^{−λx}

(b) f_Y(y) = ∫_0^y λ^2 e^{−λy} dx = λ^2 y e^{−λy}

(c) Because the Jacobian of the transformation x = x, w = y − x is 1, we have

f_{X,W}(x, w) = f_{X,Y}(x, x + w) = λ^2 e^{−λ(x+w)} = λe^{−λx} · λe^{−λw}

(d) It follows from the preceding that X and W are independent exponential random variables with rate λ.

63. φ(t) = Σ_{n=1}^{∞} e^{tn}(1 − p)^{n−1} p = p e^t Σ_{n=1}^{∞} [(1 − p)e^t]^{n−1} = p e^t / [1 − (1 − p)e^t]

65. Cov(X_i, X_j) = Cov(μ_i + Σ_{k=1}^{n} a_{ik}Z_k, μ_j + Σ_{t=1}^{n} a_{jt}Z_t)
= Σ_{t=1}^{n} Σ_{k=1}^{n} Cov(a_{ik}Z_k, a_{jt}Z_t)
= Σ_{t=1}^{n} Σ_{k=1}^{n} a_{ik} a_{jt} Cov(Z_k, Z_t)
= Σ_{k=1}^{n} a_{ik} a_{jk}

where the last equality follows since Cov(Z_k, Z_t) = 1 if k = t and 0 if k ≠ t.

67. P{5 < X < 15} ≥ 2/5

69. Φ(1) − Φ(1/2) = .1498

71. (a) P{X = i} = C(n,i) C(m, k − i) / C(n + m, k), i = 0, 1, ..., min(k, n)

(b) X = Σ_{i=1}^{k} X_i

E[X] = Σ_{i=1}^{k} E[X_i] = kn/(n + m)

since the ith ball is equally likely to be either of the n + m balls, and so

E[X_i] = P{X_i = 1} = n/(n + m)

Alternatively, X = Σ_{i=1}^{n} Y_i, and

E[X] = Σ_{i=1}^{n} E[Y_i] = Σ_{i=1}^{n} P{ith white ball is selected} = Σ_{i=1}^{n} k/(n + m) = nk/(n + m)

73. As N_i is a binomial random variable with parameters (n, P_i), we have (a) E[N_i] = nP_i; (b) Var(N_i) = nP_i(1 − P_i); (c) for i ≠ j, the covariance of N_i and N_j can be computed as

Cov(N_i, N_j) = Cov(Σ_k X_k, Σ_k Y_k)

where X_k (Y_k) is 1 or 0, depending upon whether or not outcome k is type i (j). Now, for k ≠ l, Cov(X_k, Y_l) = 0 by independence of trials, and so

Cov(N_i, N_j) = Σ_k Cov(X_k, Y_k)
= Σ_k (E[X_kY_k] − E[X_k]E[Y_k])
= −Σ_k E[X_k]E[Y_k] (since X_kY_k = 0)
= −Σ_k P_iP_j
= −nP_iP_j

(d) Letting

Y_i = 1 if no type i's occur, 0 otherwise

we have that the number of outcomes that never occur is equal to Σ_{i=1}^{r} Y_i, and thus

E[Σ_{i=1}^{r} Y_i] = Σ_{i=1}^{r} E[Y_i] = Σ_{i=1}^{r} P{outcome i does not occur} = Σ_{i=1}^{r} (1 − P_i)^n

75. (a) Knowing the values of N_1, ..., N_j is equivalent to knowing the relative ordering of the elements a_1, ..., a_j. For instance, if N_1 = 0, N_2 = 1, N_3 = 1 then in the random permutation a_2 is before a_3, which is before a_1. The independence result follows for clearly the number of a_1, ..., a_i that follow a_{i+1} does not probabilistically depend on the relative ordering of a_1, ..., a_i.

(b) P{N_i = k} = 1/i, k = 0, 1, ..., i − 1

which follows since of the elements a_1, ..., a_{i+1} the element a_{i+1} is equally likely to be first or second or ... or (i + 1)st.

(c) E[N_i] = (1/i) Σ_{k=0}^{i−1} k = (i − 1)/2

E[N_i^2] = (1/i) Σ_{k=0}^{i−1} k^2 = (i − 1)(2i − 1)/6

and so

Var(N_i) = (i − 1)(2i − 1)/6 − (i − 1)^2/4 = (i^2 − 1)/12

77. If g_1(x, y) = x + y, g_2(x, y) = x − y, then

J = det [∂g_1/∂x, ∂g_1/∂y; ∂g_2/∂x, ∂g_2/∂y] = −2

Hence, |J| = 2, and if U = X + Y, V = X − Y, then

f_{U,V}(u, v) = (1/2) f_{X,Y}((u + v)/2, (u − v)/2)
= [1/(4πσ^2)] exp{−[((u + v)/2 − μ)^2 + ((u − v)/2 − μ)^2]/(2σ^2)}
= [1/(4πσ^2)] exp{−(u − 2μ)^2/(4σ^2)} exp{−v^2/(4σ^2)}

79. K(t) = log E[e^{tX}].

K′(t) = E[Xe^{tX}] / E[e^{tX}]

K″(t) = {E[e^{tX}] E[X^2 e^{tX}] − E^2[Xe^{tX}]} / E^2[e^{tX}]

Hence,

K′(0) = E[X], K″(0) = E[X^2] − E^2[X] = Var(X)
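The factorization in Problem 55(c) can be checked numerically on a grid: the joint pmf C(i+k, i) e^{−2λ} λ^{i+k}/(i+k)! should equal the product of two Poisson(λ) masses. A sketch (λ = 1.5 is an arbitrary test value):

```python
from math import comb, exp, factorial

lam = 1.5

def poisson(mu, n):
    return exp(-mu) * mu ** n / factorial(n)

# Compare the joint pmf of (X, Y - X) with the product of Poisson marginals.
max_err = 0.0
for i in range(15):
    for k in range(15):
        joint = comb(i + k, i) * exp(-2 * lam) * lam ** (i + k) / factorial(i + k)
        product = poisson(lam, i) * poisson(lam, k)
        max_err = max(max_err, abs(joint - product))
```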
Chapter 3

1. Σ_x p_{X|Y}(x|y) = Σ_x p(x, y)/p_Y(y) = p_Y(y)/p_Y(y) = 1

3. E[X|Y = 1] = 2, E[X|Y = 2] = 5/3, E[X|Y = 3] = 12/5

5. (a) P{X = i|Y = 3} = P{i white balls selected when choosing 3 balls from 3 white and 6 red}
= C(3,i) C(6, 3 − i) / C(9,3), i = 0, 1, 2, 3

(b) By the same reasoning as in (a), if Y = 1, then X has the same distribution as the number of white balls chosen when 5 balls are chosen from 3 white and 6 red. Hence,

E[X|Y = 1] = 5 · 3/9 = 5/3

7. Given Y = 2, the conditional distribution of X and Z is

P{(X, Z) = (1, 1)|Y = 2} = 1/5
P{(1, 2)|Y = 2} = 0
P{(2, 1)|Y = 2} = 0
P{(2, 2)|Y = 2} = 4/5

So,

E[X|Y = 2] = 1/5 + 8/5 = 9/5
E[X|Y = 2, Z = 1] = 1

9. E[X|Y = y] = Σ_x x P{X = x|Y = y} = Σ_x x P{X = x} (by independence) = E[X]

11. E[X|Y = y] = C ∫_{−y}^{y} x(y^2 − x^2) dx = 0

13. The conditional density of X given that X > 1 is

f_{X|X>1}(x) = f(x)/P{X > 1} = λe^{−λx}/e^{−λ} when x > 1

E[X|X > 1] = e^{λ} ∫_1^∞ x λ e^{−λx} dx = 1 + 1/λ

by integration by parts.

15. f_{X|Y=y}(x|y) = (1/y)e^{−y} / f_Y(y) = (1/y)e^{−y} / ∫_0^y (1/y)e^{−y} dx = 1/y, 0 < x < y

E[X^2|Y = y] = (1/y) ∫_0^y x^2 dx = y^2/3

17. With K = 1/P{X = i}, we have that

f_{Y|X}(y|i) = K P{X = i|Y = y} f_Y(y)
= K_1 e^{−y} y^i λ e^{−λy} (λy)^{s−1}
= K_1 e^{−(1+λ)y} y^{s+i−1}

where K_1 does not depend on y. But as the preceding is the density function of a gamma random variable with parameters (s + i, 1 + λ) the result follows.

19. ∫ E[X|Y = y] f_Y(y) dy = ∫ [∫ x f_{X|Y}(x|y) dx] f_Y(y) dy
= ∫ ∫ x [f(x, y)/f_Y(y)] dx f_Y(y) dy
= ∫ x ∫ f(x, y) dy dx
= ∫ x f_X(x) dx
= E[X]

21. (a) X = Σ_{i=1}^{N} T_i

(b) Clearly N is geometric with parameter 1/3; thus, E[N] = 3.

(c) Since T_N is the travel time corresponding to the choice leading to freedom, it follows that T_N = 2, and so E[T_N] = 2.

(d) Given that N = n, the travel times T_i, i = 1, ..., n − 1, are each equally likely to be either 3 or 5 (since we know that a door leading back to the mine is selected), whereas T_n is equal to 2 (since that choice led to safety). Hence,

E[Σ_{i=1}^{N} T_i | N = n] = E[Σ_{i=1}^{n−1} T_i | N = n] + E[T_n|N = n] = 4(n − 1) + 2

(e) Since part (d) is equivalent to the equation

E[Σ_{i=1}^{N} T_i | N] = 4N − 2

we see from parts (a) and (b) that

E[X] = 4E[N] − 2 = 10

23. Let X denote the first time a head appears. Let us obtain an equation for E[N|X] by conditioning on the next two flips after X. This gives

E[N|X] = E[N|X, h, h]p^2 + E[N|X, h, t]pq + E[N|X, t, h]pq + E[N|X, t, t]q^2

where q = 1 − p. Now

E[N|X, h, h] = X + 1, E[N|X, h, t] = X + 1
E[N|X, t, h] = X + 2, E[N|X, t, t] = X + 2 + E[N]

Substituting back gives

E[N|X] = (X + 1)(p^2 + pq) + (X + 2)pq + (X + 2 + E[N])q^2

Taking expectations, and using the fact that X is geometric with mean 1/p, we obtain

E[N] = 1 + p + q + 2pq + q^2/p + 2q^2 + q^2 E[N]

Solving for E[N] yields

E[N] = (2 + 2q + q^2/p)/(1 − q^2)

25. (a) Let F be the initial outcome.

E[N] = Σ_{i=1}^{3} E[N|F = i]p_i = Σ_{i=1}^{3} (1 + 2/p_i)p_i = 1 + 6 = 7

(b) Let N_{1,2} be the number of trials until both outcome 1 and outcome 2 have occurred. Then

E[N_{1,2}] = E[N_{1,2}|F = 1]p_1 + E[N_{1,2}|F = 2]p_2 + E[N_{1,2}|F = 3]p_3
= (1 + 1/p_2)p_1 + (1 + 1/p_1)p_2 + (1 + E[N_{1,2}])p_3
= 1 + p_1/p_2 + p_2/p_1 + p_3 E[N_{1,2}]

Hence,

E[N_{1,2}] = (1 + p_1/p_2 + p_2/p_1)/(p_1 + p_2)

27. Condition on the outcome of the first flip to obtain

E[X] = E[X|H]p + E[X|T](1 − p) = (1 + E[X])p + E[X|T](1 − p)

Conditioning on the next flip gives

E[X|T] = E[X|TH]p + E[X|TT](1 − p) = (2 + E[X])p + (2 + 1/p)(1 − p)

where the final equality follows since, given that the first two flips are tails, the number of additional flips is just the number of flips needed to obtain a head. Putting the preceding together yields

E[X] = (1 + E[X])p + (2 + E[X])p(1 − p) + (2 + 1/p)(1 − p)^2

or

E[X] = 1/[p(1 − p)^2]
29. Let q_i = 1 − p_i, i = 1, 2. Also, let h stand for hit and m for miss.

(a) μ_1 = E[N|h]p_1 + E[N|m]q_1
= p_1(E[N|h, h]p_2 + E[N|h, m]q_2) + (1 + μ_2)q_1
= 2p_1p_2 + (2 + μ_1)p_1q_2 + (1 + μ_2)q_1

The preceding equation simplifies to

μ_1(1 − p_1q_2) = 1 + p_1 + μ_2 q_1

Similarly, we have that

μ_2(1 − p_2q_1) = 1 + p_2 + μ_1 q_2

Solving these equations gives the solution.

(b) h_1 = E[H|h]p_1 + E[H|m]q_1
= p_1(E[H|h, h]p_2 + E[H|h, m]q_2) + h_2 q_1
= 2p_1p_2 + (1 + h_1)p_1q_2 + h_2 q_1

Similarly, we have that

h_2 = 2p_1p_2 + (1 + h_2)p_2q_1 + h_1 q_2

and we solve these equations to find h_1 and h_2.

31. Let L_i denote the length of run i. Conditioning on X, the initial value, gives

E[L_1] = E[L_1|X = 1]p + E[L_1|X = 0](1 − p)
= [1/(1 − p)]p + (1/p)(1 − p)
= p/(1 − p) + (1 − p)/p

and

E[L_2] = E[L_2|X = 1]p + E[L_2|X = 0](1 − p) = (1/p)p + [1/(1 − p)](1 − p) = 2

33. Let I(A) equal 1 if the event A occurs and let it equal 0 otherwise.

E[Σ_{i=1}^{T} R_i] = E[Σ_{i=1}^{∞} I(T ≥ i) R_i]
= Σ_{i=1}^{∞} E[I(T ≥ i) R_i]
= Σ_{i=1}^{∞} E[I(T ≥ i)] E[R_i]
= Σ_{i=1}^{∞} P{T ≥ i} E[R_i]
= Σ_{i=1}^{∞} β^{i−1} E[R_i]
= E[Σ_{i=1}^{∞} β^{i−1} R_i]

35. np_1 = E[X_1]
= E[X_1|X_2 = 0](1 − p_2)^n + E[X_1|X_2 > 0][1 − (1 − p_2)^n]
= [np_1/(1 − p_2)](1 − p_2)^n + E[X_1|X_2 > 0][1 − (1 − p_2)^n]

yielding the result

E[X_1|X_2 > 0] = np_1[1 − (1 − p_2)^{n−1}] / [1 − (1 − p_2)^n]

37. (a) E[X] = (2.6 + 3 + 3.4)/3 = 3

(b) E[X^2] = [2.6 + 2.6^2 + 3 + 9 + 3.4 + 3.4^2]/3 = 12.1067, and Var(X) = 3.1067

39. Let N denote the number of cycles, and let X be the position of card 1.

(a) m_n = (1/n) Σ_{i=1}^{n} E[N|X = i] = (1/n) Σ_{i=1}^{n} (1 + m_{n−i}) = 1 + (1/n) Σ_{j=1}^{n−1} m_j

(b) m_1 = 1
m_2 = 1 + 1/2 = 3/2
m_3 = 1 + (1/3)(1 + 3/2) = 1 + 1/2 + 1/3 = 11/6
m_4 = 1 + (1/4)(1 + 3/2 + 11/6) = 25/12

(c) m_n = 1 + 1/2 + 1/3 + ⋯ + 1/n

(d) Using recursion and the induction hypothesis gives

m_n = 1 + (1/n) Σ_{j=1}^{n−1} (1 + ⋯ + 1/j)
= 1 + (1/n)[n − 1 + (n − 2)/2 + (n − 3)/3 + ⋯ + 1/(n − 1)]
= 1 + (1/n)[n + n/2 + ⋯ + n/(n − 1) − (n − 1)]
= 1 + 1/2 + ⋯ + 1/n

(e) N = Σ_{i=1}^{n} X_i

(f) m_n = Σ_{i=1}^{n} E[X_i] = Σ_{i=1}^{n} P{i is last of 1, ..., i} = Σ_{i=1}^{n} 1/i

(g) Yes; knowing, for instance, that i + 1 is the last of all the cards 1, ..., i + 1 to be seen tells us nothing about whether i is the last of 1, ..., i.

(h) Var(N) = Σ_{i=1}^{n} Var(X_i) = Σ_{i=1}^{n} (1/i)(1 − 1/i)

41. Let N denote the number of minutes in the maze. If L is the event the rat chooses its left, and R the event it chooses its right, we have, by conditioning on the first direction chosen:

E(N) = (1/2)E(N|L) + (1/2)E(N|R)
= (1/2)[(1/3)(2) + (2/3)(5 + E(N))] + (1/2)[3 + E(N)]
= (5/6)E(N) + 21/6

so E(N) = 21.

43. E[T|χ^2_n] = (n/χ^2_n)^{1/2} E[Z|χ^2_n] = (n/χ^2_n)^{1/2} E[Z] = 0

E[T^2|χ^2_n] = (n/χ^2_n) E[Z^2|χ^2_n] = n/χ^2_n

Hence, E[T] = 0, and

Var(T) = E[T^2] = E[n/χ^2_n]
= n ∫_0^∞ (1/x)(1/2)e^{−x/2}(x/2)^{n/2−1}/Γ(n/2) dx
= [n/(2Γ(n/2))] ∫_0^∞ (1/2)e^{−x/2}(x/2)^{n/2−2} dx
= nΓ(n/2 − 1)/[2Γ(n/2)]
= n/[2(n/2 − 1)]
= n/(n − 2)

45. Now

E[X_n|X_{n−1}] = 0, Var(X_n|X_{n−1}) = βX^2_{n−1}

(a) From the above we see that E[X_n] = 0.

(b) From (a) we have that Var(X_n) = E[X^2_n]. Now

E[X^2_n] = E{E[X^2_n|X_{n−1}]} = E[βX^2_{n−1}] = βE[X^2_{n−1}] = β^2 E[X^2_{n−2}] = ⋯ = β^n X^2_0

47. E[X^2Y^2|X] = X^2 E[Y^2|X] ≥ X^2 (E[Y|X])^2 = X^2

the inequality following since for any random variable U, E[U^2] ≥ (E[U])^2, and this remains true when conditioning on some other random variable X. Taking expectations of the above shows that

E[(XY)^2] ≥ E[X^2]

As

E[XY] = E[E[XY|X]] = E[X E[Y|X]] = E[X]

the result follows.

49. Let A be the event that A is the overall winner, and let X be the number of games played. Let Y equal the number of wins for A in the first two games.

P(A) = P(A|Y = 0)P(Y = 0) + P(A|Y = 1)P(Y = 1) + P(A|Y = 2)P(Y = 2)
= 0 + P(A)2p(1 − p) + p^2

Thus,

P(A) = p^2/[1 − 2p(1 − p)]

E[X] = E[X|Y = 0]P(Y = 0) + E[X|Y = 1]P(Y = 1) + E[X|Y = 2]P(Y = 2)
= 2(1 − p)^2 + (2 + E[X])2p(1 − p) + 2p^2
= 2 + E[X]2p(1 − p)

Thus,

E[X] = 2/[1 − 2p(1 − p)]
51. Let α be the probability that X is even. Conditioning on the first trial gives

α = P(even|X = 1)p + P(even|X > 1)(1 − p) = (1 − α)(1 − p)

Thus,

α = (1 − p)/(2 − p)

More computationally,

α = Σ_{n=1}^{∞} P(X = 2n) = [p/(1 − p)] Σ_{n=1}^{∞} (1 − p)^{2n} = [p/(1 − p)] · (1 − p)^2/[1 − (1 − p)^2] = (1 − p)/(2 − p)

53. P{X = n} = ∫_0^∞ P{X = n|λ} e^{−λ} dλ
= ∫_0^∞ (e^{−λ}λ^n/n!) e^{−λ} dλ
= ∫_0^∞ e^{−2λ}λ^n dλ / n!
= (1/2)^{n+1} ∫_0^∞ e^{−t}t^n dt / n!

The result follows since ∫_0^∞ e^{−t}t^n dt = Γ(n + 1) = n!

57. Let X be the number of storms.

P{X ≥ 3} = 1 − P{X ≤ 2}
= 1 − ∫_0^5 P{X ≤ 2|Λ = x}(1/5) dx
= 1 − ∫_0^5 [e^{−x} + xe^{−x} + e^{−x}x^2/2](1/5) dx

59. (a) P(A_iA_j) = Σ_{k=0}^{n} P(A_iA_j|N_i = k) C(n,k) p_i^k (1 − p_i)^{n−k}
= Σ_{k=1}^{n} P(A_j|N_i = k) C(n,k) p_i^k (1 − p_i)^{n−k}
= Σ_{k=1}^{n−1} {1 − [1 − p_j/(1 − p_i)]^{n−k}} C(n,k) p_i^k (1 − p_i)^{n−k}
= Σ_{k=1}^{n−1} C(n,k) p_i^k (1 − p_i)^{n−k} − Σ_{k=1}^{n−1} C(n,k) p_i^k (1 − p_i − p_j)^{n−k}
= 1 − (1 − p_i)^n − p_i^n − [(1 − p_j)^n − (1 − p_i − p_j)^n − p_i^n]
= 1 + (1 − p_i − p_j)^n − (1 − p_i)^n − (1 − p_j)^n

where the preceding used that, conditional on N_i = k, each of the other n − k trials independently results in outcome j with probability p_j/(1 − p_i).

(b) P(A_iA_j) = Σ_{k=1}^{n} P(A_iA_j|F_i = k) p_i(1 − p_i)^{k−1} + P(A_iA_j|F_i > n)(1 − p_i)^n
= Σ_{k=1}^{n} P(A_j|F_i = k) p_i(1 − p_i)^{k−1}
= Σ_{k=1}^{n} {1 − [1 − p_j/(1 − p_i)]^{k−1}(1 − p_j)^{n−k}} p_i(1 − p_i)^{k−1}

(c) P(A_iA_j) = P(A_i) + P(A_j) − P(A_i ∪ A_j)
= 1 − (1 − p_i)^n + 1 − (1 − p_j)^n − [1 − (1 − p_i − p_j)^n]
= 1 + (1 − p_i − p_j)^n − (1 − p_i)^n − (1 − p_j)^n

61. (a) m_1 = E[X|h]p_1 + E[X|m]q_1 = p_1 + (1 + m_2)q_1 = 1 + m_2q_1

Similarly, m_2 = 1 + m_1q_2. Solving these equations gives

m_1 = (1 + q_1)/(1 − q_1q_2), m_2 = (1 + q_2)/(1 − q_1q_2)

(b) P_1 = p_1 + q_1P_2, P_2 = q_2P_1

implying that

P_1 = p_1/(1 − q_1q_2), P_2 = p_1q_2/(1 − q_1q_2)

(c) Let f_i denote the probability that the final hit was by 1 when i shoots first. Conditioning on the outcome of the first shot gives

f_1 = p_1P_2 + q_1f_2 and f_2 = p_2P_1 + q_2f_1

Solving these equations gives

f_1 = (p_1P_2 + q_1p_2P_1)/(1 − q_1q_2)

(d) and (e) Let B_i denote the event that both hits were by i. Condition on the outcome of the first two shots to obtain

P(B_1) = p_1q_2P_1 + q_1q_2P(B_1) ⟹ P(B_1) = p_1q_2P_1/(1 − q_1q_2)

Also,

P(B_2) = q_1p_2(1 − P_1) + q_1q_2P(B_2) ⟹ P(B_2) = q_1p_2(1 − P_1)/(1 − q_1q_2)

(f) E[N] = 2p_1p_2 + p_1q_2(2 + m_1) + q_1p_2(2 + m_1) + q_1q_2(2 + E[N])

implying that

E[N] = [2 + m_1p_1q_2 + m_1q_1p_2]/(1 − q_1q_2)

63. Let S_i be the indicator of the event that there is only one type i in the final set.

P{S_i = 1} = Σ_{j=0}^{n−1} P{S_i = 1|T = j}P{T = j}
= (1/n) Σ_{j=0}^{n−1} P{S_i = 1|T = j}
= (1/n) Σ_{j=0}^{n−1} 1/(n − j)

The final equality follows because, given that there are still n − j − 1 uncollected types when the first type i is obtained, the probability starting at that point that it will be the last of the set of n − j types consisting of type i along with the n − j − 1 yet-uncollected types to be obtained is, by symmetry, 1/(n − j). Hence,

E[Σ_{i=1}^{n} S_i] = nE[S_i] = Σ_{k=1}^{n} 1/k

65. (a) P{Y_n = j} = 1/(n + 1), j = 0, ..., n

(b) For j = 0, ..., n − 1,

P{Y_{n−1} = j} = Σ_{i=0}^{n} [1/(n + 1)] P{Y_{n−1} = j|Y_n = i}
= [1/(n + 1)](P{Y_{n−1} = j|Y_n = j} + P{Y_{n−1} = j|Y_n = j + 1})
= [1/(n + 1)](P(last is nonred | j red) + P(last is red | j + 1 red))
= [1/(n + 1)][(n − j)/n + (j + 1)/n]
= 1/n

(c) P{Y_k = j} = 1/(k + 1), j = 0, ..., k

(d) For j = 0, ..., k − 1,

P{Y_{k−1} = j} = Σ_{i=0}^{k} P{Y_{k−1} = j|Y_k = i}P{Y_k = i}
= [1/(k + 1)](P{Y_{k−1} = j|Y_k = j} + P{Y_{k−1} = j|Y_k = j + 1})
= [1/(k + 1)][(k − j)/k + (j + 1)/k]
= 1/k

where the second equality follows from the induction hypothesis.

67. A run of j successive heads can occur in the following mutually exclusive ways: (i) either there is a run of j in the first n − 1 flips, or (ii) there is no j-run in the first n − j − 1 flips, flip n − j is a tail, and the next j flips are all heads. Consequently, (a) follows. Condition on the time of the first tail:

P_j(n) = Σ_{k=1}^{j} P_j(n − k)p^{k−1}(1 − p) + p^j, j ≤ n
69. (a) Let I(i, j) equal 1 if i and j are a pair and 0 otherwise. Then E[…]

(a) …it follows that if A is not always leading, then they will be tied at some point.

(b) Consider any outcome in which A receives the first vote and they are eventually tied, say a, a, b, a, b, a, b, b. We can correspond this sequence to one that takes the part of the sequence until they are tied in the reverse order. That is, we correspond the above to the sequence b, b, a, b, a, b, a, a, where the remainder of the sequence is exactly as in the original. Note that this latter sequence is one in which B is initially ahead and then they are tied. As it is easy to see that this correspondence is one to one, part (b) follows.

(c) Now,

P{B receives first vote and they are eventually tied} = P{B receives first vote} = m/(n + m)

Therefore, by part (b) we see that

P{eventually tied} = 2m/(n + m)

and the result follows from part (a).

77. We will prove it when X and Y are discrete.

(a) This part follows from (b) by taking g(x, y) = xy.

(b) E[g(X, Y)|Y = y] = Σ_{y′} Σ_x g(x, y′) P{X = x, Y = y′|Y = y}

Now,

P{X = x, Y = y′|Y = y} = 0 if y′ ≠ y; P{X = x|Y = y} if y′ = y

So,

E[g(X, Y)|Y = y] = Σ_x g(x, y)P{X = x|Y = y} = E[g(X, y)|Y = y]

(c) E[XY] = E[E[XY|Y]] = E[Y E[X|Y]] by (a)

79. Let us suppose we take a picture of the urn before each removal of a ball. If at the end of the experiment we look at these pictures in reverse order (i.e., look at the last taken picture first), we will see a set of balls increasing at each picture. The set of balls seen in this fashion always will have more white balls than black balls if and only if in the original experiment there were always more white than black balls left in the urn. Therefore, these two events must have the same probability, i.e., (n − m)/(n + m) by the ballot problem.

81. (a) f(x) = E[N] = ∫_0^1 E[N|X_1 = y] dy

E[N|X_1 = y] = 1 if y < x; 1 + f(y) if y > x

Hence,

f(x) = 1 + ∫_x^1 f(y) dy

(b) f′(x) = −f(x)

(c) f(x) = ce^{−x}. Since f(1) = 1, we obtain that c = e, and so f(x) = e^{1−x}.

(d) P{N > n} = P{x < X_1 < X_2 < ⋯ < X_n} = (1 − x)^n/n!, since in order for the above event to occur all of the n random variables must exceed x (and the probability of this is (1 − x)^n), and then among all of the n! equally likely orderings of these variables the one in which they are increasing must occur.

(e) E[N] = Σ_{n=0}^{∞} P{N > n} = Σ_n (1 − x)^n/n! = e^{1−x}
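Problem 81's fixed-point property f(x) = 1 + ∫_x^1 f(y) dy for f(x) = e^{1−x}, and the series in part (e), can both be checked numerically; a sketch (plain midpoint quadrature; the step count is arbitrary):

```python
from math import exp, factorial

def f(x):
    return exp(1 - x)

def rhs(x, steps=50000):
    # 1 + integral_x^1 f(y) dy via the composite midpoint rule.
    h = (1 - x) / steps
    integral = sum(f(x + (i + 0.5) * h) for i in range(steps)) * h
    return 1 + integral

err_integral = max(abs(f(x) - rhs(x)) for x in (0.0, 0.25, 0.5, 0.75))

# Part (e): sum_n (1-x)^n / n! should equal e^{1-x}.
x = 0.3
series = sum((1 - x) ** n / factorial(n) for n in range(60))
err_series = abs(series - f(x))
```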
83. Let I_j equal 1 if ball j is drawn before ball i and let it equal 0 otherwise. Then the random variable of interest is Σ_{j≠i} I_j. Now, by considering the first time that either i or j is withdrawn, we see that P{j before i} = w_j/(w_i + w_j). Hence,

E[Σ_{j≠i} I_j] = Σ_{j≠i} w_j/(w_i + w_j)

85. Consider the following ordering:

e_1, e_2, ..., e_{l−1}, i, j, e_{l+2}, ..., e_n where P_i < P_j

We will show that we can do better by interchanging the order of i and j, i.e., by taking

e_1, e_2, ..., e_{l−1}, j, i, e_{l+2}, ..., e_n

For the first ordering, the expected position of the element requested is

E_{i,j} = P_{e_1} + 2P_{e_2} + ⋯ + (l − 1)P_{e_{l−1}} + lP_i + (l + 1)P_j + (l + 2)P_{e_{l+2}} + ⋯

Therefore,

E_{i,j} − E_{j,i} = l(P_i − P_j) + (l + 1)(P_j − P_i) = P_j − P_i > 0

and so the second ordering is better. This shows that every ordering for which the probabilities are not in decreasing order is not optimal in the sense that we can do better. Since there are only a finite number of possible orderings, the ordering for which P_1 ≥ P_2 ≥ P_3 ≥ ⋯ ≥ P_n is optimum.

87. (a) This can be proved by induction on m. It is obvious when m = 1, and then by fixing the value of x_1 and using the induction hypothesis, we see that there are

Σ_{i=0}^{n} C(n − i + m − 2, m − 2)

such solutions. As C(n − i + m − 2, m − 2) equals the number of ways of choosing m − 1 items from a set of size n + m − 1 under the constraint that the lowest numbered item selected is number i + 1 (that is, none of 1, ..., i are selected where i + 1 is), we see that

Σ_{i=0}^{n} C(n − i + m − 2, m − 2) = C(n + m − 1, m − 1)

It also can be proven by noting that each solution corresponds in a one-to-one fashion with a permutation of n ones and (m − 1) zeros, the correspondence being that x_1 equals the number of ones to the left of the first zero, x_2 the number of ones between the first and second zeros, and so on. As there are (n + m − 1)!/[n!(m − 1)!] such permutations, the result follows.

(b) The number of positive solutions of x_1 + ⋯ + x_m = n is equal to the number of nonnegative solutions of y_1 + ⋯ + y_m = n − m, and thus there are C(n − 1, m − 1) such solutions.

(c) If we fix a set of k of the x_i and require them to be the only zeros, then there are by (b) (with m replaced by m − k) C(n − 1, m − k − 1) such solutions. Hence, there are C(m,k) C(n − 1, m − k − 1) outcomes such that exactly k of the X_i are equal to zero, and so the desired probability is

C(m,k) C(n − 1, m − k − 1) / C(n + m − 1, m − 1)

89. Condition on the value of I_n. This gives

P_n(K) = P{Σ_{j=1}^{n} jI_j ≥ K | I_n = 1}(1/2) + P{Σ_{j=1}^{n} jI_j ≥ K | I_n = 0}(1/2)
= P{Σ_{j=1}^{n−1} jI_j ≥ K − n}(1/2) + P{Σ_{j=1}^{n−1} jI_j ≥ K}(1/2)
= [P_{n−1}(K − n) + P_{n−1}(K)]/2

91. 1/[p^5(1 − p)^3] + 1/[p^2(1 − p)] + 1/p
95. With β = P{S_n < 0 for all n > 0}, we have E[X] = …

Chapter 4

1. P_{01} = 1, P_{10} = 1/9, P_{21} = 4/9, P_{32} = 1
P_{11} = 4/9, P_{22} = 4/9
P_{12} = 4/9, P_{23} = 1/9

3. With D = dry and R = rain, the states record the weather on three successive days; for instance, (D, D, R) means that it is raining today, was dry yesterday, and was dry the day before yesterday. With states (and columns) ordered (RRR), (RRD), (RDR), (RDD), (DRR), (DRD), (DDR), (DDD), the transition probability matrix is

P =
(RRR): .8 .2 0 0 0 0 0 0
(RRD): 0 0 .4 .6 0 0 0 0
(RDR): 0 0 0 0 .6 .4 0 0
(RDD): 0 0 0 0 0 0 .4 .6
(DRR): .6 .4 0 0 0 0 0 0
(DRD): 0 0 .4 .6 0 0 0 0
(DDR): 0 0 0 0 .6 .4 0 0
(DDD): 0 0 0 0 0 0 .2 .8

5. Cubing the transition probability matrix, we obtain P^3:

13/36 11/54 47/108
4/9   4/27  11/27
5/12  2/9   13/36

Thus,

E[X_3] = P(X_3 = 1) + 2P(X_3 = 2)
= (1/4)P^3_{01} + (1/4)P^3_{11} + (1/2)P^3_{21} + 2[(1/4)P^3_{02} + (1/4)P^3_{12} + (1/2)P^3_{22}]

7. P^2_{30} + P^2_{31} = P_{31}P_{10} + P_{33}P_{30} + P_{31}P_{11} + P_{33}P_{31}
= (.2)(.5) + (.8)(0) + (.2)(0) + (.8)(.2) = .26

9. It is not a Markov chain because information about previous color selections would affect probabilities about the current makeup of the urn, which would affect the probability that the next selection is red.

11. The answer is P^4_{2,2}/(1 − P^4_{2,0}) for the Markov chain with transition probability matrix

1  0  0
.3 .4 .3
.2 .3 .5

13. P^{n+r}_{ij} = Σ_k P^n_{ik} P^r_{kj} > 0

15. Consider any path of states i_0 = i, i_1, i_2, ..., i_n = j such that P_{i_k i_{k+1}} > 0. Call this a path from i to j. If j can be reached from i, then there must be a path from i to j. Let i_0, ..., i_n be such a path. If all of the values i_0, ..., i_n are not distinct, then there is a subpath from i to j having fewer elements (for instance, if i, 1, 2, 4, 1, 3, j is a path, then so is i, 1, 3, j). Hence, if a path exists, there must be one with all distinct states.
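Problem 5's cube can be reproduced with exact fractions. The one-step matrix P is not shown in this excerpt; the matrix below is an assumption (taken to be the one from the corresponding textbook exercise) that happens to reproduce every entry of the P^3 quoted above:

```python
from fractions import Fraction as F

# Assumed one-step transition matrix (not shown in this excerpt).
P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(0), F(1, 3), F(2, 3)],
     [F(1, 2), F(0), F(1, 2)]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

P3 = matmul(matmul(P, P), P)
```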
must be one with alldistinct states.17.ni=1Yj/n E[Y] by the strong
law of large num-bers. Now E[Y] = 2p 1. Hence, if p > 1/2,
thenE[Y] > 0, and so the average of the Yis convergesin this
case to a positive number, which impliesthatn1Yi as n . Hence,
state 0 can bevisited only a nite number of times and so mustbe
transient. Similarly, if p < 1/2, then E[Y] < 0,and so
limn1Yi = , and the argument issimilar.19. The limiting
probabilities are obtained from
π_0 = .7π_0 + .5π_1
π_1 = .4π_2 + .2π_3
π_2 = .3π_0 + .5π_1
π_0 + π_1 + π_2 + π_3 = 1
and the solution is
π_0 = 1/4, π_1 = 3/20, π_2 = 3/20, π_3 = 9/20
The desired result is thus
π_0 + π_1 = 2/5

21. The transition probabilities are
P_{i,j} = 1 − 3α, if j = i; α, if j ≠ i
By symmetry,
P^n_{ij} = (1/3)(1 − P^n_{ii}), j ≠ i
So, let us prove by induction that
P^n_{i,j} = 1/4 + (3/4)(1 − 4α)^n, if j = i; 1/4 − (1/4)(1 − 4α)^n, if j ≠ i
As the preceding is true for n = 1, assume it for n. To complete the induction proof, we need to show that
P^{n+1}_{i,j} = 1/4 + (3/4)(1 − 4α)^{n+1}, if j = i; 1/4 − (1/4)(1 − 4α)^{n+1}, if j ≠ i
Now,
P^{n+1}_{i,i} = P^n_{i,i} P_{i,i} + Σ_{j≠i} P^n_{i,j} P_{j,i}
= [1/4 + (3/4)(1 − 4α)^n](1 − 3α) + 3α[1/4 − (1/4)(1 − 4α)^n]
= 1/4 + (3/4)(1 − 4α)^n(1 − 3α − α)
= 1/4 + (3/4)(1 − 4α)^{n+1}
By symmetry, for j ≠ i,
P^{n+1}_{ij} = (1/3)[1 − P^{n+1}_{ii}] = 1/4 − (1/4)(1 − 4α)^{n+1}
and the induction is complete.
By letting n → ∞ in the preceding, or by using that the transition probability matrix is doubly stochastic, or by just using a symmetry argument, we obtain that π_i = 1/4.

23. (a) Letting 0 stand for a good year and 1 for a bad year,
the successive states follow a Markov chain with transition probability matrix

P = 1/2  1/2
    1/3  2/3

Squaring this matrix gives

P^2 = 5/12  7/12
      7/18  11/18

Hence, if S_i is the number of storms in year i, then
E[S_1] = E[S_1|X_1 = 0]P_{00} + E[S_1|X_1 = 1]P_{01} = 1/2 + 3/2 = 2
E[S_2] = E[S_2|X_2 = 0]P^2_{00} + E[S_2|X_2 = 1]P^2_{01} = 5/12 + 21/12 = 26/12
Hence, E[S_1 + S_2] = 25/6.
(b) Multiplying the first row of P by the first column of P^2 gives
P^3_{00} = 5/24 + 7/36 = 29/72
Hence, conditioning on the state at time 3 yields
P(S_3 = 0) = P(S_3 = 0|X_3 = 0)(29/72) + P(S_3 = 0|X_3 = 1)(43/72) = (29/72)e^{−1} + (43/72)e^{−3}
(c) The stationary probabilities are the solution of
π_0 = π_0(1/2) + π_1(1/3)
π_0 + π_1 = 1
giving π_0 = 2/5, π_1 = 3/5. Hence, the long-run average number of storms is 2/5 + 3(3/5) = 11/5.

25. Letting X_n denote the number of pairs of shoes at
the door the runner departs from at the beginning of day n, then {X_n} is a Markov chain with transition probabilities
P_{i,i} = 1/4, 0 < i < k
P_{i,i−1} = 1/4, 0 < i < k
P_{i,k−i} = 1/4, 0 < i < k
P_{i,k−i+1} = 1/4, 0 < i < k
The first equation refers to the situation where the runner returns to the same door she left from and then chooses that door the next day; the second to the situation where the runner returns to the opposite door from the one she left from and then chooses the original door the next day; and so on. (When some of the four cases above refer to the same transition probability, they should be added together. For instance, if i = 4, k = 8, then the preceding states that P_{i,i} = 1/4 = P_{i,k−i}. Thus, in this case, P_{4,4} = 1/2.) Also,
P_{0,0} = 1/2, P_{0,k} = 1/2
P_{k,k} = 1/4, P_{k,0} = 1/4, P_{k,1} = 1/4, P_{k,k−1} = 1/4
It is now easy to check that this Markov chain is doubly stochastic (that is, the column sums of the transition probability matrix are all 1) and so the long-run proportions are equal. Hence, the proportion of time the runner runs barefooted is 1/(k + 1).

27. The limiting probabilities are obtained from
π_0 = (1/9)π_1
π_1 = π_0 + (4/9)π_1 + (4/9)π_2
π_2 = (4/9)π_1 + (4/9)π_2 + π_3
π_0 + π_1 + π_2 + π_3 = 1
and the solution is π_0 = π_3 = 1/20, π_1 = π_2 = 9/20.

29. Each employee moves according to a Markov chain
Markovchainwhose limiting probabilities are the solution of
1 = .7
1 + .2
2 + .1
3
2 = .2
1 + .6
2 + .4
3
1 +
2 +
3 = 1Solving yields
1 = 6/17,
2 = 7/17,
3 =4/17. Hence, if N is large, it follows from the lawof large
numbers that approximately 6, 7, and 4 ofeach 17 employees are in
categories 1, 2, and 3.31. Let the state onday n be 0 if sunny, 1
if cloudy, and 2 if rainy. This gives a three-state Markov chain with transition probability matrix

        0    1    2
    0   0   1/2  1/2
P = 1  1/4  1/2  1/4
    2  1/4  1/4  1/2

The equations for the long-run proportions are
π_0 = (1/4)π_1 + (1/4)π_2
π_1 = (1/2)π_0 + (1/2)π_1 + (1/4)π_2
π_2 = (1/2)π_0 + (1/4)π_1 + (1/2)π_2
π_0 + π_1 + π_2 = 1
By symmetry it is easy to see that π_1 = π_2. This makes it easy to solve and we obtain the result
π_0 = 1/5, π_1 = 2/5, π_2 = 2/5

33.
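As a numerical sanity check (a sketch, not part of the manual's solution), the long-run proportions of an irreducible, aperiodic finite chain can be found by repeatedly multiplying an initial distribution by the transition matrix:

```python
# Numerical check of the long-run proportions for the sunny/cloudy/rainy
# chain above: iterating pi <- pi P converges to the stationary distribution.
P = [[0.0, 0.5, 0.5],
     [0.25, 0.5, 0.25],
     [0.25, 0.25, 0.5]]

def stationary(P, iters=200):
    pi = [1.0 / len(P)] * len(P)      # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(len(P)))
              for j in range(len(P))]
    return pi

pi = stationary(P)
print([round(x, 4) for x in pi])   # -> [0.2, 0.4, 0.4]
```

The result agrees with π_0 = 1/5, π_1 = π_2 = 2/5 computed above.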
Consider the Markov chain whose state at time n is the type of exam number n. The transition probabilities of this Markov chain are obtained by conditioning on the performance of the class. This gives the following:
P_{11} = .3(1/3) + .7(1) = .8
P_{12} = P_{13} = .3(1/3) = .1
P_{21} = .6(1/3) + .4(1) = .6
P_{22} = P_{23} = .6(1/3) = .2
P_{31} = .9(1/3) + .1(1) = .4
P_{32} = P_{33} = .9(1/3) = .3
Let π_i denote the proportion of exams that are type i, i = 1, 2, 3. The π_i are the solutions of the following set of linear equations:
π_1 = .8π_1 + .6π_2 + .4π_3
π_2 = .1π_1 + .2π_2 + .3π_3
π_1 + π_2 + π_3 = 1
Since P_{i2} = P_{i3} for all states i, it follows that π_2 = π_3. Solving the equations gives the solution
π_1 = 5/7, π_2 = π_3 = 1/7

35. The equations are
π_0 = π_1 + (1/2)π_2 + (1/3)π_3 + (1/4)π_4
π_1 = (1/2)π_2 + (1/3)π_3 + (1/4)π_4
π_2 = (1/3)π_3 + (1/4)π_4
π_3 = (1/4)π_4
π_4 = π_0
π_0 + π_1 + π_2 + π_3 + π_4 = 1
The solution is
π_0 = π_4 = 12/37, π_1 = 6/37, π_2 = 4/37, π_3 = 3/37

37. Must show that
π_j = Σ_i π_i P^k_{i,j}
The preceding follows because the right-hand side is equal to the probability that the Markov chain with transition probabilities P_{i,j} will be in state j at time k when its initial state is chosen according to its stationary probabilities, which is equal to its stationary probability of being in state j.

39. Because
recurrence is a class property it follows that state j, which communicates with the recurrent state i, is recurrent. But if j were positive recurrent, then by the previous exercise i would be as well. Because i is not, we can conclude that j is null recurrent.

41. (a) The number of transitions into state i by time n, the number of transitions originating from state i by time n, and the number of time periods the chain is in state i by time n all differ by at most 1. Thus, their long-run proportions must be equal.
(b) π_i P_{ij} is the long-run proportion of transitions that go from state i to state j.
(c) Σ_i π_i P_{ij} is the long-run proportion of transitions that are into state j.
(d) Since π_j is also the long-run proportion of transitions that are into state j, it follows that
π_j = Σ_i π_i P_{ij}

43. Consider a typical state, say, 1 2 3. We
must show
π_{123} = π_{123}P_{123,123} + π_{213}P_{213,123} + π_{231}P_{231,123}
Now P_{123,123} = P_{213,123} = P_{231,123} = P_1 and thus,
π_{123} = P_1[π_{123} + π_{213} + π_{231}]
We must show that
π_{123} = P_1P_2/(1 − P_1), π_{213} = P_2P_1/(1 − P_2), π_{231} = P_2P_3/(1 − P_2)
satisfies the above, which is equivalent to
P_1P_2 = P_1[P_2P_1/(1 − P_2) + P_2P_3/(1 − P_2)]
= P_1P_2(P_1 + P_3)/(1 − P_2)
= P_1P_2 since P_1 + P_3 = 1 − P_2
By symmetry all of the other stationary equations also follow.

45. (a)
1, since all states communicate and thus all are recurrent since the state space is finite.
(b) Condition on the first state visited from i.
x_i = Σ_{j=1}^{N−1} P_{ij} x_j + P_{iN}, i = 1, …, N − 1
x_0 = 0, x_N = 1
(c) Must show
i/N = Σ_{j=1}^{N−1} (j/N)P_{ij} + P_{iN} = Σ_{j=0}^{N} (j/N)P_{ij}
and this follows by hypothesis.

47. {Y_n, n ≥ 1} is a Markov chain with states (i, j).
P_{(i,j),(k,ℓ)} = 0, if j ≠ k; P_{j,ℓ}, if j = k
where P_{j,ℓ} is the transition probability for {X_n}.
lim_n P{Y_n = (i, j)} = lim_n P{X_n = i, X_{n+1} = j} = lim_n [P{X_n = i}P_{ij}] = π_i P_{ij}

49.
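The limit π_i P_{ij} for the pair process can be checked by simulation; a sketch, using an arbitrary two-state chain chosen only for the test:

```python
import random

# Check lim P{Y_n = (i, j)} = pi_i * P_ij for Y_n = (X_n, X_{n+1}),
# using an arbitrary two-state chain.
P = [[0.3, 0.7],
     [0.6, 0.4]]
# Stationary distribution of this chain, solved by hand from pi = pi P:
pi = [6 / 13, 7 / 13]

random.seed(1)
n = 200_000
counts = {(i, j): 0 for i in range(2) for j in range(2)}
prev = 0
for _ in range(n):
    nxt = 0 if random.random() < P[prev][0] else 1
    counts[(prev, nxt)] += 1
    prev = nxt

for (i, j), c in sorted(counts.items()):
    print((i, j), round(c / n, 3), "vs", round(pi[i] * P[i][j], 3))
```

The observed pair frequencies should be within sampling error of π_i P_{ij}.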
(a) No.
lim P{X_n = i} = p·π_1(i) + (1 − p)π_2(i)
(b) Yes.
P_{ij} = pP^{(1)}_{ij} + (1 − p)P^{(2)}_{ij}

53. With π_i(1/4) equal to the proportion of time a policyholder whose yearly number of accidents is Poisson distributed with mean 1/4 is in Bonus-Malus state i, we have that the average premium is
(2/3)(326.375) + (1/3)[200π_1(1/4) + 250π_2(1/4) + 400π_3(1/4) + 600π_4(1/4)]

55. S_{11} = P{offspring
is aa | both parents dominant}
= P{aa, both dominant}/P{both dominant}
= [r²(1/4)]/(1 − q)² = r²/[4(1 − q)²]
S_{10} = P{aa, 1 dominant and 1 recessive parent}/P{1 dominant and 1 recessive parent}
= P{aa, 1 parent aA and 1 parent aa}/[2q(1 − q)]
= 2qr(1/2)/[2q(1 − q)] = r/[2(1 − q)]

57. Let A be the event that all states have been visited by time T. Then, conditioning on the direction of the first step gives
P(A) = P(A|clockwise)p + P(A|counterclockwise)q
= p(1 − q/p)/(1 − (q/p)^n) + q(1 − p/q)/(1 − (p/q)^n)
The conditional probabilities in the preceding follow by noting that they are equal to the probability, in the gambler's ruin problem, that a gambler who starts with 1 will reach n before going broke when the gambler's win probabilities are p and q.

59. Condition
on the outcome of the initial play.

61. With P_0 = 0, P_N = 1,
P_i = α_i P_{i+1} + (1 − α_i)P_{i−1}, i = 1, …, N − 1
These latter equations can be rewritten as
P_{i+1} − P_i = β_i(P_i − P_{i−1})
where β_i = (1 − α_i)/α_i. These equations can now be solved exactly as in the original gambler's ruin problem. They give the solution
P_i = (1 + Σ_{j=1}^{i−1} C_j)/(1 + Σ_{j=1}^{N−1} C_j), i = 1, …, N − 1
where
C_j = Π_{i=1}^{j} β_i
(c) P_{N−i}, where α_i = (N − i)/N

65. π ≥ 0 = P{X_0 = 0}. Assume that π ≥ P{X_{n−1} = 0}. Then
P{X_n = 0} = Σ_j P{X_n = 0|X_1 = j}P_j = Σ_j (P{X_{n−1} = 0})^j P_j ≤ Σ_j π^j P_j = π

67.
(a) Yes, the next state depends only on the present and not on the past.
(b) One class, period is 1, recurrent.
(c) P_{i,i+1} = p(N − i)/N, i = 0, 1, …, N − 1
P_{i,i−1} = (1 − p)i/N, i = 1, 2, …, N
P_{i,i} = pi/N + (1 − p)(N − i)/N, i = 0, 1, …, N
(d) See (e).
(e) π_i = C(N, i) p^i (1 − p)^{N−i}, i = 0, 1, …, N
(f) Direct substitution or use Example 7a.
(g) Time = Σ_{j=i}^{N−1} T_j, where T_j is the number of flips to go from j to j + 1 heads. T_j is geometric with E[T_j] = N/j. Thus, E[time] = Σ_{j=i}^{N−1} N/j.

69. π(n_1, …, n_m) = [M!/(n_1! ⋯ n_m!)](1/m)^M
We must now show that
π(n_1, …, n_i − 1, …, n_j + 1, …)·[(n_j + 1)/M]·[1/(m − 1)] = π(n_1, …, n_i, …, n_j, …)·[n_i/M]·[1/(m − 1)]
or
(n_j + 1)/[(n_i − 1)!(n_j + 1)!] = n_i/[n_i! n_j!]
which follows.

71. If π_j = cP_{ij}/P_{ji}, then
π_j P_{jk} = cP_{ij}P_{jk}/P_{ji}
π_k P_{kj} = cP_{ik}P_{kj}/P_{ki}
and these are thus equal by hypothesis.
73. It is straightforward to check that π_i P_{ij} = π_j P_{ji}. For instance, consider states 0 and 1. Then
π_0 P_{01} = (1/5)(1/2) = 1/10
whereas
π_1 P_{10} = (2/5)(1/4) = 1/10

75. The number of transitions
from i to j in any interval must equal (to within 1) the number from j to i, since each time the process goes from i to j, in order to get back to i it must enter from j.

77. (a) Σ_a y_{ja} = Σ_a E[Σ_n α^n I{X_n = j, a_n = a}]
= E[Σ_n α^n Σ_a I{X_n = j, a_n = a}]
= E[Σ_n α^n I{X_n = j}]
(b) Σ_j Σ_a y_{ja} = E[Σ_n α^n Σ_j I{X_n = j}] = E[Σ_n α^n] = 1/(1 − α)
Σ_a y_{ja} = b_j + E[Σ_{n=1}^∞ α^n I{X_n = j}]
= b_j + E[Σ_{n=0}^∞ α^{n+1} I{X_{n+1} = j}]
= b_j + E[Σ_{n=0}^∞ α^{n+1} Σ_{i,a} I{X_n = i, a_n = a}I{X_{n+1} = j}]
= b_j + Σ_{n=0}^∞ α^{n+1} Σ_{i,a} E[I{X_n = i, a_n = a}]P_{ij}(a)
= b_j + α Σ_{i,a} Σ_n α^n E[I{X_n = i, a_n = a}]P_{ij}(a)
= b_j + α Σ_{i,a} y_{ia} P_{ij}(a)
(c) Let d_{j,a} denote the expected discounted time the process is in j and a is chosen when the policy is employed. Then by the same argument as in (b),
Σ_a d_{ja} = b_j + α Σ_{i,a} Σ_n α^n E[I{X_n = i, a_n = a}]P_{ij}(a)
= b_j + α Σ_{i,a} Σ_n α^n E[I{X_n = i}]·[y_{ia}/Σ_a y_{ia}]·P_{ij}(a)
= b_j + α Σ_{i,a} (Σ_a d_{ia})·[y_{ia}/Σ_a y_{ia}]·P_{ij}(a)
and we see from Equation (9.1) that the above is satisfied upon substitution of d_{ia} = y_{ia}. As it is easy to see that Σ_{i,a} d_{ia} = 1/(1 − α), the result follows since it can be shown that these linear equations have a unique solution.
(d) Follows immediately from previous parts.

It is a well-known result in analysis (and easily proven) that if lim_n a_n/n = a then lim_n Σ_{i=1}^{n} a_i/n also equals a. The result follows from this since
E[R(X_n)] = Σ_j R(j)P{X_n = j} → Σ_j R(j)π_j

Chapter 5

1. (a) e^{−1}
(b) e^{−1}

3. The conditional
distribution of X, given that X > 1, is the same as the unconditional distribution of 1 + X. Hence, (a) is correct.

5. e^{−1} by lack of memory.

7. P{X_1 < X_2 | min(X_1, X_2) = t}
= P{X_1 < X_2, min(X_1, X_2) = t}/P{min(X_1, X_2) = t}
= P{X_1 = t, X_2 > t}/[P{X_1 = t, X_2 > t} + P{X_2 = t, X_1 > t}]
= f_1(t)F̄_2(t)/[f_1(t)F̄_2(t) + f_2(t)F̄_1(t)]
Dividing through by F̄_1(t)F̄_2(t) yields the result. (For a more rigorous argument, replace "= t" by "∈ (t, t + ε)" throughout, and then let ε → 0.)

9. Condition on whether machine 1 is still working at time t, to obtain the answer
1 − e^{−λ_1 t} + e^{−λ_1 t}·λ_1/(λ_1 + λ_2)

11. (a)
Using Equation (5.5), the lack of memory property of the exponential, as well as the fact that the minimum of independent exponentials is exponential with a rate equal to the sum of their individual rates, it follows that
P(A_1) = λ/(nμ + λ)
and, for j > 1,
P(A_j | A_1 ⋯ A_{j−1}) = λ/((n − j + 1)μ + λ)
Hence,
p = Π_{j=1}^{n} λ/((n − j + 1)μ + λ)
(b) When n = 2,
P{max Y_i < X} = ∫_0^∞ P{max Y_i < X | X = x}λe^{−λx} dx
= ∫_0^∞ P{max Y_i < x}λe^{−λx} dx
= ∫_0^∞ (1 − e^{−μx})² λe^{−λx} dx
= ∫_0^∞ (1 − 2e^{−μx} + e^{−2μx})λe^{−λx} dx
= 1 − 2λ/(λ + μ) + λ/(λ + 2μ)
= 2μ²/[(λ + μ)(λ + 2μ)]

13. Let T_n denote the
time until the nth person in line departs the line. Also, let D be the time until the first departure from the line, and let X be the additional time after D until T_n. Then,
E[T_n] = E[D] + E[X] = 1/(nθ + μ) + [((n − 1)θ + μ)/(nθ + μ)]E[T_{n−1}]
where E[X] was computed by conditioning on whether the first departure was the person in line. Hence,
E[T_n] = A_n + B_n E[T_{n−1}]
where
A_n = 1/(nθ + μ), B_n = ((n − 1)θ + μ)/(nθ + μ)
Solving gives the solution
E[T_n] = A_n + Σ_{i=1}^{n−1} A_{n−i} Π_{j=n−i+1}^{n} B_j
= A_n + Σ_{i=1}^{n−1} 1/(nθ + μ)
= n/(nθ + μ)

Another way to solve the preceding is to let I_j equal 1 if customer n is still in line at the time of the (j − 1)st departure from the line, and let X_j denote the time between the (j − 1)st and jth departures from the line. (Of course, these departures only refer to the first n people in line.) Then
T_n = Σ_{j=1}^{n} I_j X_j
The independence of I_j and X_j gives
E[T_n] = Σ_{j=1}^{n} E[I_j]E[X_j]
But
E[I_j] = [((n − 1)θ + μ)/(nθ + μ)] ⋯ [((n − j + 1)θ + μ)/((n − j + 2)θ + μ)] = ((n − j + 1)θ + μ)/(nθ + μ)
and
E[X_j] = 1/((n − j + 1)θ + μ)
which gives
the result.

15. Let T_i denote the time between the (i − 1)th and the ith failure. Then the T_i are independent with T_i being exponential with rate (101 − i)/200. Thus,
E[T] = Σ_{i=1}^{5} E[T_i] = Σ_{i=1}^{5} 200/(101 − i)
Var(T) = Σ_{i=1}^{5} Var(T_i) = Σ_{i=1}^{5} (200)²/(101 − i)²

17. Let C_i denote the cost of the
ith link to be constructed, i = 1, …, n − 1. Note that the first link can be any of the C(n, 2) possible links. Given the first one, the second link must connect one of the 2 cities joined by the first link with one of the n − 2 cities without any links. Thus, given the first constructed link, the next link constructed will be one of 2(n − 2) possible links. Similarly, given the first two links that are constructed, the next one to be constructed will be one of 3(n − 3) possible links, and so on. Since the cost of the first link to be built is the minimum of C(n, 2) exponentials with rate 1, it follows that
E[C_1] = 1/C(n, 2)
By the lack of memory property of the exponential it follows that the amounts by which the costs of the other links exceed C_1 are independent exponentials with rate 1. Therefore, C_2 is equal to C_1 plus the minimum of 2(n − 2) independent exponentials with rate 1, and so
E[C_2] = E[C_1] + 1/[2(n − 2)]
Similar reasoning then gives
E[C_3] = E[C_2] + 1/[3(n − 3)]
and so on.

19. (c) Letting A = X_{(2)} − X_{(1)} we have
E[X_{(2)}] = E[X_{(1)}] + E[A]
= 1/(λ_1 + λ_2) + [λ_1/(λ_1 + λ_2)](1/λ_2) + [λ_2/(λ_1 + λ_2)](1/λ_1)
The formula for E[A] is obtained by conditioning on which X_i is largest.
(d) Let I equal 1 if X_1
1.
(d) T is the sum of n − 1 independent exponentials with rate 2λ (since each time a failure occurs the time until the next failure is exponential with rate 2λ).
(e) Gamma with parameters n − 1 and 2λ.

25. Parts (a) and (b) follow upon integration. For part (c), condition
Parts (a) and (b) follow upon integration. For part(c), condition
on which of X or Y is larger and usethe lack of memory property to
conclude that theamount by which it is larger is exponential rate
.For instance, for x < 0,fx y(x)dx= P{X < Y}P{x < Y X <
x + dx|Y > X}= 12exdxFor (d) and (e), condition on I.27. (a) 11
+ 3(b) 11 + 322 + 3(c) i1i+ 11 + 322 + 313(d) i1i+ 11 + 2_ 12+ 22 +
313_+ 21 + 211 + 322 + 31329. (a) fX|X +Y(x|c)=CfX. X+Y(x,
c)=C1fXY(x, cx)=fX(x) fY(c x)=C2exe(cx), 0 < x < c=C3e()x, 0
< x < cwhere none of the Ci depend on x. Hence, wecan
conclude that the conditional distributionis that of an exponential
random variable con-ditioned to be less than c.(b) E[X|X + Y = c] =
1 e()c(1 + ( )c)(1 e()c)(c) c = E[X + Y|X + Y = c] = E[X|X + Y =
c]+ E[Y|X + Y = c]implying thatE[Y|X + Y = c]= c 1 e()c(1 + ( )c)(1
e()c)31. Conditiononwhether the 1 PMappointment is stillwiththe
doctor at 1:30, anduse the fact that if she orhe is then the
remaining time spent is exponentialwith mean 30. This givesE[time
spent in ofce]= 30(1 e30/30) + (30 + 30)e30/30= 30 + 30e133. (a) By
the lack of memory property, no matterwhen Y fails the remaining
life of X is expo-nential with rate .(b) E[min (X, Y) |X > Y +
c]= E[min (X, Y) |X > Y, X Y > c]= E[min (X, Y) |X >
Y]where the nal equality follows from (a).37. 1 + 139. (a) 196/2.5
= 78.4
(b) 196/(2.5)² = 31.36
We use the central limit theorem to justify approximating the life distribution by a normal distribution with mean 78.4 and standard deviation √31.36 = 5.6. In the following, Z is a standard normal random variable.
(c) P{L < 67.2} ≈ P{Z < (67.2 − 78.4)/5.6} = P{Z < −2} = .0227
(d) P{L > 90} ≈ P{Z > (90 − 78.4)/5.6} = P{Z > 2.07} = .0192
(e) P{L > 100} ≈ P{Z > (100 − 78.4)/5.6} = P{Z > 3.857} = .00006

41. 1/(λ_1 + λ_2)

43. Let S_i denote the service time at server i, i =
1, 2, and let X denote the time until the next arrival. Then, with p denoting the proportion of customers that are served by both servers, we have
p = P{X > S_1 + S_2} = P{X > S_1}P{X > S_1 + S_2 | X > S_1} = [μ_1/(λ + μ_1)][μ_2/(λ + μ_2)]

45. E[N(T)] = E[E[N(T)|T]] = E[λT] = λE[T]
E[TN(T)] = E[E[TN(T)|T]] = E[TλT] = λE[T²]
E[N²(T)] = E[E[N²(T)|T]] = E[λT + (λT)²] = λE[T] + λ²E[T²]
Hence,
Cov(T, N(T)) = λE[T²] − E[T]λE[T] = λσ²
and
Var(N(T)) = λE[T] + λ²E[T²] − (λE[T])² = λμ + λ²σ²

47. (a) 1_(2) + 1/
(b) Let
T_i denote the time until both servers are busy when you start with i busy servers, i = 0, 1. Then,
E[T_0] = 1/λ + E[T_1]
Now, starting with 1 server busy, let T be the time until the first event (arrival or departure); let X = 1 if the first event is an arrival and let it be 0 if it is a departure; and let Y be the additional time after the first event until both servers are busy.
E[T_1] = E[T] + E[Y]
= 1/(λ + μ) + E[Y|X = 1]·λ/(λ + μ) + E[Y|X = 0]·μ/(λ + μ)
= 1/(λ + μ) + E[T_0]·μ/(λ + μ)
Thus,
E[T_0] − 1/λ = 1/(λ + μ) + E[T_0]·μ/(λ + μ)
or
E[T_0] = (2λ + μ)/λ²
Also,
E[T_1] = (λ + μ)/λ²
(c) Let L_i denote the time until a customer is lost when you start with i busy servers. Then, reasoning as in part (b) gives that
E[L_2] = 1/(λ + 2μ) + E[L_1]·2μ/(λ + 2μ)
= 1/(λ + 2μ) + (E[T_1] + E[L_2])·2μ/(λ + 2μ)
= 1/(λ + 2μ) + [(λ + μ)/λ² + E[L_2]]·2μ/(λ + 2μ)
Thus,
E[L_2] = [λ² + 2μ(λ + μ)]/λ³

49. (a) P{N(T)
− N(s) = 1} = λ(T − s)e^{−λ(T−s)}
(b) Differentiating the expression in part (a) and then setting it equal to 0 gives
e^{−λ(T−s)} = λ(T − s)e^{−λ(T−s)}
implying that the maximizing value is
s = T − 1/λ
(c) For s = T − 1/λ, we have that λ(T − s) = 1 and thus,
P{N(T) − N(s) = 1} = e^{−1}

51. Condition on X, the time of the first accident, to obtain
E[N(t)] = ∫_0^∞ E[N(t)|X = s]λe^{−λs} ds = ∫_0^t (1 + λ(t − s))λe^{−λs} ds

53. (a) e^{−1}
(b) e^{−1} + e^{−1}(.8)e^{−1}

55. As long as customers are present to be served, every event (arrival or departure) will, independently of other events, be a departure with probability p = μ/(λ + μ). Thus P{X = m} is the probability that there have been a total of m tails at the moment that the nth head occurs, when independent flips of a coin having probability p of coming up heads are made: that is, it is the probability that the nth head occurs on trial number n + m. Hence,
P{X = m} = C(n + m − 1, n − 1) p^n (1 − p)^m

57. (a)
e^{−2}
(b) 2 P.M.

59. The unconditional probability that the claim is type 1 is 10/11. Therefore,
P(1|4000) = P(4000|1)P(1)/[P(4000|1)P(1) + P(4000|2)P(2)]
= (e^{−4}·10/11)/(e^{−4}·10/11 + .2e^{−.8}·1/11)

61. (a) Poisson with mean cG(t).
(b) Poisson with mean c[1 − G(t)].
(c) Independent.

63. Let X and Y be respectively the number of customers in the system at time t + s that were present at time s, and the number in the system at time t + s that were not in the system at time s. Since there are an infinite number of servers, it follows that X and Y are independent (even given the number in the system at time s). Since the service distribution is exponential with rate μ, it follows that given X(s) = n, X will be binomial with parameters n and p = e^{−μt}. Also Y, which is independent of X(s), will have the same distribution as X(t). Therefore, Y is Poisson with mean
λ∫_0^t e^{−μy} dy = λ(1 − e^{−μt})/μ
(a) E[X(t + s)|X(s) = n] = E[X|X(s) = n] + E[Y|X(s) = n] = ne^{−μt} + λ(1 − e^{−μt})/μ
(b)
Var(X(t + s)|X(s) = n) = Var(X + Y|X(s) = n) = Var(X|X(s) = n) + Var(Y) = ne^{−μt}(1 − e^{−μt}) + λ(1 − e^{−μt})/μ
The above equation uses the formulas for the variances of a binomial and a Poisson random variable.
(c) Consider an infinite server queuing system in which customers arrive according to a Poisson process with rate λ, and where the service times are all exponential random variables with rate μ. If there is currently a single customer in the system, find the probability that the system becomes empty when that customer departs.
Condition on R, the remaining service time:
P{empty} = ∫_0^∞ P{empty|R = t}μe^{−μt} dt
= ∫_0^∞ exp{−λ∫_0^t e^{−μy} dy}μe^{−μt} dt
= ∫_0^∞ exp{−(λ/μ)(1 − e^{−μt})}μe^{−μt} dt
= ∫_0^1 e^{−λx/μ} dx
= (μ/λ)(1 − e^{−λ/μ})
where the preceding used that P{empty|R = t} is equal to the probability that an M/M/∞ queue is empty at time t.

65.
This is an application of the infinite server Poisson queue model. An arrival corresponds to a new lawyer passing the bar exam, and the service time is the time the lawyer practices law. The number in the system at time t is, for large t, approximately a Poisson random variable with mean λμ, where λ is the arrival rate and μ the mean service time. This latter statement follows from
∫_0^∞ [1 − G(y)] dy = μ
where μ is the mean of the distribution G. Thus, we would expect 500 × 30 = 15,000 lawyers.

67. If we count a satellite if it is launched before time s but remains in operation at time t, then the number of items counted is Poisson with mean m(t) = λ∫_0^s [1 − G(t − y)] dy. The answer is e^{−m(t)}.

69. (a) 1 − e^{−λ(t−s)}
(b) λe^{−λs}·e^{−λ(t−s)}[λ(t − s)]³/3!
(c) 4 + λ(t − s)
(d) 4s/t

71.
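The exponential no-event probability underlying (a) can be checked by simulating a Poisson process directly; a sketch (λ, s, t are arbitrary test values):

```python
import math
import random

# Check P{at least one event in (s, t]} = 1 - exp(-lam*(t-s)) for a
# Poisson process, by generating exponential interarrival times.
random.seed(3)
lam, s, t, trials = 2.0, 1.0, 1.75, 100_000

hits = 0
for _ in range(trials):
    clock = 0.0
    while clock <= t:
        clock += random.expovariate(lam)
        if s < clock <= t:
            hits += 1
            break

exact = 1 - math.exp(-lam * (t - s))
print(round(hits / trials, 3), "vs", round(exact, 3))
```

The stationary-increments property means only the length t − s matters, which the simulation confirms.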
Let U_1, … be independent uniform (0, t) random variables that are independent of N(t), and let U_{(i,n)} be the ith smallest of the first n of them.
P{Σ_{i=1}^{N(t)} g(S_i) < x} = Σ_n P{Σ_{i=1}^{N(t)} g(S_i) 1
(d) Conditioning on N yields the solution; namely,
Σ_{j=1}^{∞} (1/j)P(N = j)
(e) Σ_{j=1}^{∞} P(N = j) Π_{i=0}^{j} 1/(1 + i)

79. Consider a Poisson process with rate λ in which an
process with rate in which anevent at time t is counted with
probability (t)/independently of the past. Clearly such a
processwill have independent increments. In addition,P{2 or more
counted events in(t, t + h)} P{2 or more events in(t, t + h)}=
o(h)andP{1 counted event in (t, t + h)}=P{1 counted | 1 event}P(1
event)32 Answers and Solutions+ P{1 counted | 2 events}P{ 2}=_
t+ht(s)dsh (h + o(h)) + o(h)= (t) h + o(h)=(t)h + o(h)81. (a) Let
S_i denote the time of the ith event, i ≥ 1. Let t_i + h_i < t_{i+1}, t_n + h_n ≤ t.
P{t_i < S_i < t_i + h_i, i = 1, …, n | N(t) = n}
= P{1 event in (t_i, t_i + h_i), i = 1, …, n, no events elsewhere in (0, t)}/P{N(t) = n}
= [Π_{i=1}^{n} e^{−[m(t_i+h_i)−m(t_i)]}(m(t_i + h_i) − m(t_i))]·e^{−[m(t)−Σ_i(m(t_i+h_i)−m(t_i))]} / [e^{−m(t)}[m(t)]^n/n!]
= n! Π_i [m(t_i + h_i) − m(t_i)]/[m(t)]^n
Dividing both sides by h_1 ⋯ h_n and using the fact that m(t_i + h_i) − m(t_i) = ∫_{t_i}^{t_i+h_i} λ(s) ds = λ(t_i)h_i + o(h_i) yields, upon letting the h_i → 0,
f_{S_1 ⋯ S_n}(t_1, …, t_n | N(t) = n) = n! Π_{i=1}^{n} [λ(t_i)/m(t)]
and the right-hand side is seen to be the joint density function of the order statistics from a set of n independent random variables from the distribution with density function f(x) = λ(x)/m(t), x ≤ t.
(b) Let N(t) denote the number of
injuries by time t. Now given N(t) = n, it follows from part (a) that the n injury instances are independent and identically distributed. The probability (density) that an arbitrary one of those injuries was at s is λ(s)/m(t), and so the probability that the injured party will still be out of work at time t is
p = ∫_0^t P{out of work at t | injured at s}(λ(s)/m(t)) ds = ∫_0^t [1 − F(t − s)](λ(s)/m(t)) ds
Hence, as each of the N(t) injured parties has the same probability p of being out of work at t, we see that
E[X(t)|N(t)] = N(t)p
and thus,
E[X(t)] = pE[N(t)] = pm(t) = ∫_0^t [1 − F(t − s)]λ(s) ds

83. Since m(t) is increasing it
follows that nonoverlapping time intervals of the {N(t)} process will correspond to nonoverlapping intervals of the {N_o(t)} process. As a result, the independent increment property will also hold for the {N(t)} process. For the remainder we will use the identity
m(t + h) = m(t) + λ(t)h + o(h)
P{N(t + h) − N(t) ≥ 2} = P{N_o[m(t + h)] − N_o[m(t)] ≥ 2}
= P{N_o[m(t) + λ(t)h + o(h)] − N_o[m(t)] ≥ 2}
= o[λ(t)h + o(h)] = o(h)
P{N(t + h) − N(t) = 1} = P{N_o[m(t) + λ(t)h + o(h)] − N_o[m(t)] = 1}
= P{1 event of a Poisson process in an interval of length λ(t)h + o(h)}
= λ(t)h + o(h)

85. $40,000 and $1.6 × 10^8.

87. Cov[X(t), X(t + s)]
= Cov[X(t), X(t) + X(t + s) − X(t)]
= Cov[X(t), X(t)] + Cov[X(t), X(t + s) − X(t)]
= Cov[X(t), X(t)] by independent increments
= Var[X(t)] = λtE[Y²]

89. Let T_i denote the arrival time of the first type i shock, i = 1, 2, 3.
P{X_1 > s, X_2 > t} = P{T_1 > s, T_3 > s, T_2 > t, T_3 > t}
= P{T_1 > s, T_2 > t, T_3 > max(s, t)}
= e^{−λ_1 s}e^{−λ_2 t}e^{−λ_3 max(s,t)}

91. To begin, note that
P{X_1 > Σ_{i=2}^{n} X_i}
= P{X_1 > X_2}P{X_1 − X_2 > X_3 | X_1 > X_2} ⋯ P{X_1 − X_2 − ⋯ − X_{n−1} > X_n | X_1 > X_2 + ⋯ + X_{n−1}}
= (1/2)^{n−1}
Hence,
P{M > Σ_{i=1}^{n} X_i − M} = Σ_{i=1}^{n} P{X_i > Σ_{j≠i} X_j} = n/2^{n−1}

93. (a)
max(X_1, X_2) + min(X_1, X_2) = X_1 + X_2.
(b) This can be done by induction:
max(X_1, …, X_n) = max(X_1, max(X_2, …, X_n))
= X_1 + max(X_2, …, X_n) − min(X_1, max(X_2, …, X_n))
= X_1 + max(X_2, …, X_n) − max(min(X_1, X_2), …, min(X_1, X_n))
Now use the induction hypothesis.
A second method is as follows. Suppose X_1 ≤ X_2 ≤ ⋯ ≤ X_n. Then the coefficient of X_i on the right side is
1 − C(n − i, 1) + C(n − i, 2) − C(n − i, 3) + ⋯ = (1 − 1)^{n−i} = 0, i ≠ n; 1, i = n
and so both sides equal X_n. By symmetry the result follows for all other possible orderings of the X's.
(c) Taking expectations of (b), where X_i is the time of the first event of the ith process, yields
Σ_i 1/λ_i − Σ_{i<j} 1/(λ_i + λ_j) + ⋯, the departure process will (in the limit) be a Poisson process, since the servers will always be busy and thus the times between departures will be independent exponential random variables with a common rate.

29. (a) Let the state
be S, the set of failed machines.
(b) For i ∈ S, j ∉ S,
q_{S,S−i} = μ_i/|S|, q_{S,S+j} = λ_j
where S − i is the set S with i deleted and S + j is similarly S with j added. In addition, |S| denotes the number of elements in S.
(c) P_S q_{S,S−i} = P_{S−i} q_{S−i,S}
(d) The equation in (c) is equivalent to
P_S μ_i/|S| = P_{S−i} λ_i
or
P_S = P_{S−i}|S|λ_i/μ_i
Iterating this recursion gives
P_S = P_∅ (|S|)! Π_{i∈S} (λ_i/μ_i)
where ∅ is the empty set. Summing over all S gives
1 = P_∅ Σ_S (|S|)! Π_{i∈S} (λ_i/μ_i)
and so
P_S = (|S|)! Π_{i∈S} (λ_i/μ_i) / [Σ_S (|S|)! Π_{i∈S} (λ_i/μ_i)]
As this solution satisfies the time reversibility equations, it follows that, in the steady state, the chain is time reversible with these limiting probabilities.

31. (a) This follows because of
the fact that all of the service times are exponentially distributed and thus memoryless.
(b) Let n = (n_1, …, n_i, …, n_j, …, n_r), where n_i > 0, and let n′ = (n_1, …, n_i − 1, …, n_j + 1, …, n_r). Then
q_{n,n′} = μ_i/(r − 1)
(c) The process is time reversible if we can find probabilities P(n) that satisfy the equations
P(n)μ_i/(r − 1) = P(n′)μ_j/(r − 1)
where n and n′ are as given in part (b). The above equations are equivalent to
μ_i P(n) = μ_j P(n′)
Since n_i = n′_i + 1 and n′_j = n_j + 1 (where n_k refers to the kth component of the vector n), the above equation suggests the solution
P(n) = C Π_{k=1}^{r} (1/μ_k)^{n_k}
where C is chosen to make the probabilities sum to 1. As P(n) satisfies all the time reversibility equations, it follows that the chain is time reversible and the P(n) given above are the limiting probabilities.

33. Suppose first that the waiting room is
of infinite size. Let X_i(t) denote the number of customers at server i, i = 1, 2. Then since each of the M/M/1 processes {X_i(t)} is time-reversible, it follows by Problem 28 that the vector process {(X_1(t), X_2(t)), t ≥ 0} is a time-reversible Markov chain. Now the process of interest is just the truncation of this vector process to the set of states A where
A = {(0, m) : m ≤ 4} ∪ {(n, 0) : n ≤ 4} ∪ {(n, m) : nm > 0, n + m ≤ 5}
Hence, the probability that there are n with server 1 and m with server 2 is
P_{n,m} = k(λ_1/μ_1)^n(1 − λ_1/μ_1)(λ_2/μ_2)^m(1 − λ_2/μ_2)
= C(λ_1/μ_1)^n(λ_2/μ_2)^m, (n, m) ∈ A
The constant C is determined from
Σ P_{n,m} = 1
where the sum is over all (n, m) in A.

35.
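For concreteness, the normalizing constant C can be computed by enumerating the truncated state space A; a sketch, with λ_1/μ_1 = 0.5 and λ_2/μ_2 = 0.8 as arbitrary illustrative ratios:

```python
# Enumerate the truncated state space A and normalize
# P(n, m) proportional to r1**n * r2**m over it.
r1, r2 = 0.5, 0.8          # illustrative values of lambda_i / mu_i

A = {(0, m) for m in range(5)} | {(n, 0) for n in range(5)} \
    | {(n, m) for n in range(1, 6) for m in range(1, 6) if n + m <= 5}

weight = {(n, m): r1**n * r2**m for (n, m) in A}
C = 1.0 / sum(weight.values())
P = {s: C * w for s, w in weight.items()}

print(len(A), "states; total probability:", round(sum(P.values()), 6))
```

With these bounds A contains 19 states, and the normalized probabilities sum to 1 as required.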
We must find probabilities P̃_i such that
P̃_i q̃_{ij} = P̃_j q̃_{ji}
or
cP̃_i q_{ij} = P̃_j q_{ji}, if i ∈ A, j ∉ A
P̃_i q_{ij} = cP̃_j q_{ji}, if i ∉ A, j ∈ A
P̃_i q_{ij} = P̃_j q_{ji}, otherwise
Now, P_i q_{ij} = P_j q_{ji} and so if we let
P̃_i = kP_i/c if i ∈ A; kP_i if i ∉ A
then we have a solution to the above equations. By choosing k to make the sum of the P̃_j equal to 1, we have the desired result. That is,
k = [Σ_{i∈A} P_i/c + Σ_{i∉A} P_i]^{−1}

37. The state at any time is the set of down components at that time. For S ⊂ {1, 2, …, n}, i ∉ S, j ∈ S,
q(S, S + i) = λ_i
q(S, S − j) = μ_j α^{|S|}
where S + i = S ∪ {i}, S − j = S ∩ {j}ᶜ, and |S| = number of elements in S. The time reversible equations are
P(S)μ_i α^{|S|} = P(S − i)λ_i, i ∈ S
The above is satisfied when, for S = {i_1, i_2, …, i_k},
P(S) = [λ_{i_1}λ_{i_2} ⋯ λ_{i_k}/(μ_{i_1}μ_{i_2} ⋯ μ_{i_k})]α^{−k(k+1)/2} P(∅)
where P(∅) is determined so that
Σ P(S) = 1
where the sum is over all the 2^n subsets of {1, 2, …, n}.

39. E[O(t)|X(0) = 1] = t − E[time in 1|X(0) = 1]
= t − λt/(λ + μ) − [μ/(λ + μ)²][1 − e^{−(λ+μ)t}]
The final equality is obtained from Example 7b (or Problem 38) by interchanging λ and μ.

41. (a) Letting T_i denote the
time until a transition out of i occurs, we have
P̄_{ij} = P{X(Y) = j} = P{X(Y) = j | T_i < Y}·v_i/(v_i + λ) + P{X(Y) = j | Y ≤ T_i}·λ/(v_i + λ)
= Σ_k P_{ik} P̄_{kj}·v_i/(v_i + λ) + δ_{ij}·λ/(λ + v_i)
The first term on the right follows upon conditioning on the state visited from i (which is k with probability P_{ik}) and then using the lack of memory property of the exponential to assert that, given a transition into k occurs before time Y, the state at Y is probabilistically the same as if the process had started in state k and we were interested in the state after an exponential time with rate λ. As q_{ik} = v_i P_{ik}, the result follows.
(b) From (a),
(λ + v_i)P̄_{ij} = Σ_k q_{ik} P̄_{kj} + λδ_{ij}
or, in matrix terminology, with R the matrix having elements r_{ij} = q_{ij} for j ≠ i and r_{ii} = −v_i,
λI = (λI − R)P̄
implying that
P̄ = λ(λI − R)^{−1} = (I − R/λ)^{−1}
(c) Consider, for instance,
P{X(Y_1 + Y_2) = j | X(0) = i}
= Σ_k P{X(Y_1 + Y_2) = j | X(Y_1) = k, X(0) = i}P{X(Y_1) = k | X(0) = i}
= Σ_k P{X(Y_1 + Y_2) = j | X(Y_1) = k}P̄_{ik}
= Σ_k P{X(Y_2) = j | X(0) = k}P̄_{ik}
= Σ_k P̄_{kj}P̄_{ik}
and thus the state at time Y_1 + Y_2 is given by the 2-stage transition probabilities of P̄_{ij}. The general case can be established by induction.
(d) The above results in exactly the same approximation as Approximation 2 in Section 6.8.

Chapter 7

1. (a) Yes, (b) no, (c) no.

3. By the one-to-one
correspondence of m(t) and F, it follows that {N(t), t ≥ 0} is a Poisson process with rate 1/2. Hence,
P{N(5) = 0} = e^{−5/2}

5. The random variable N is equal to N(1) + 1, where {N(t)} is the renewal process whose interarrival distribution is uniform on (0, 1). By the results of Example 2c,
E[N] = m(1) + 1 = e

7. Once every five months.

9.
A job completion constitutes a renewal. Let T denote the time between renewals. To compute E[T] start by conditioning on W, the time it takes to finish the next job:
E[T] = E[E[T|W]]
Now, to determine E[T|W = w] condition on S, the time of the next shock. This gives
E[T|W = w] = ∫_0^∞ E[T|W = w, S = x]λe^{−λx} dx
Now, if the time to finish is less than the time of the shock then the job is completed at the finish time; otherwise everything starts over when the shock occurs. This gives
E[T|W = w, S = x] = x + E[T], if x < w; w, if x ≥ w
Hence,
E[T|W = w] = ∫_0^w (x + E[T])λe^{−λx} dx + we^{−λw}
= E[T][1 − e^{−λw}] + (1/λ)(1 − e^{−λw})
Thus,
E[T|W] = (E[T] + 1/λ)(1 − e^{−λW})
Taking expectations gives
E[T] = (E[T] + 1/λ)(1 − E[e^{−λW}])
and so
E[T] = (1 − E[e^{−λW}])/(λE[e^{−λW}])
In the above, W is a random variable having distribution F, and so
E[e^{−λW}] = ∫_0^∞ e^{−λw} f(w) dw

11. N(t)/t = 1/t + [number of renewals in (X_1, t)]/t
Since X_1 < ∞, Proposition 3.1 implies that
[number of renewals in (X_1, t)]/t → 1/μ as t → ∞

13. (a) N_1 and
N_2 are stopping times. N_3 is not.
(b) Follows immediately from the definition of I_i.
(c) The value of I_i is completely determined from X_1, …, X_{i−1} (e.g., I_i = 0 or 1 depending upon whether or not we have stopped after observing X_1, …, X_{i−1}). Hence, I_i is independent of X_i.
(d) Σ_{i=1}^{∞} E[I_i] = Σ_{i=1}^{∞} P{N ≥ i} = E[N]
(e) E[X_1 + ⋯ + X_{N_1}] = E[N_1]E[X]
But X_1 + ⋯ + X_{N_1} = 5, E[X] = p, and so
E[N_1] = 5/p
E[X_1 + ⋯ + X_{N_2}] = E[N_2]E[X]
E[X] = p, E[N_2] = 5p + 3(1 − p) = 3 + 2p
E[X_1 + ⋯ + X_{N_2}] = (3 + 2p)p
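The Wald's equation consequence E[N_1] = 5/p can be checked by simulation; a sketch (p = 0.4 is an arbitrary test value):

```python
import random

# N1 = number of Bernoulli(p) trials until the 5th success. Since the
# stopped sum X1+...+XN1 is always 5, Wald's equation gives E[N1] = 5/p.
random.seed(6)
p, trials = 0.4, 100_000

def trials_until_fifth_success():
    n = successes = 0
    while successes < 5:
        n += 1
        if random.random() < p:
            successes += 1
    return n

avg = sum(trials_until_fifth_success() for _ in range(trials)) / trials
print(round(avg, 2), "vs", 5 / p)  # 5/p = 12.5
```

The sample mean should agree with 5/p to within sampling error.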
15. (a) X_i = amount of time he has to travel after his ith choice (we will assume that he keeps on making choices even after becoming free). N is the number of choices he makes until becoming free.
(b) E[T] = E[Σ_{i=1}^{N} X_i] = E[N]E[X]
N is a geometric random variable with p = 1/3, so
E[N] = 3, E[X] = (1/3)(2 + 4 + 6) = 4
Hence, E[T] = 12.
(c) E[Σ_{i=1}^{N} X_i | N = n] = (n − 1)(1/2)(4 + 6) + 2 = 5n − 3,
since given N = n, X_1, …, X_{n−1} are equally likely to be either 4 or 6, and X_n = 2. (Whereas E[Σ_{i=1}^{n} X_i] = 4n.)
(d) From (c),
E[Σ_{i=1}^{N} X_i] = E[5N − 3] = 15 − 3 = 12

17. (i) Yes. (ii)
No. Yes, if F is exponential.

19. Since, from Example 2c, m(t) = e^t − 1, 0 < t ≤ 1, we obtain upon using the identity t + E[Y(t)] = μ[m(t) + 1] that E[Y(1)] = e/2 − 1.

21. μ_G + 1/λ, where μ_G is the mean of G.

23. Using that E[X] = 2p − 1, we obtain from Wald's equation when p ≠ 1/2 that
E[T](2p − 1) = E[Σ_{j=1}^{T} X_j]
= (N − i)·[1 − (q/p)^i]/[1 − (q/p)^N] − i·{1 − [1 − (q/p)^i]/[1 − (q/p)^N]}
= N·[1 − (q/p)^i]/[1 − (q/p)^N] − i
yielding the result
E[T] = {N·[1 − (q/p)^i]/[1 − (q/p)^N] − i}/(2p − 1), p ≠ 1/2
When p = 1/2, we can easily show by a conditioning argument that E[T] = i(N − i).

25. Say that a new cycle
begins each time a train is dispatched. Then, with C being the cost of a cycle, we obtain, upon conditioning on N(t), the number of arrivals during a cycle, that
E[C] = E[E[C|N(t)]] = E[K + N(t)ct/2] = K + λct²/2
Hence,
average cost per unit time = E[C]/t = K/t + λct/2
Calculus shows that the preceding is minimized when t = √(2K/(λc)), with the average cost equal to √(2Kλc).
On the other hand, the average cost for the N policy of Example 7.12 is c(N − 1)/2 + λK/N. Treating N as a continuous variable yields that its minimum occurs at N = √(2λK/c), with a resulting minimal average cost of √(2λKc) − c/2.

27. Say that a new cycle begins when a machine fails; let C be the cost per cycle; let T be the time of a cycle.
E[C] = K + c_2·λ_1/[(λ_1 + λ_2)μ_2] + c_1·λ_2/[(λ_1 + λ_2)μ_1]
E[T] = 1/(λ_1 + λ_2) + λ_1/[(λ_1 + λ_2)μ_2] + λ_2/[(λ_1 + λ_2)μ_1]
Then the long-run average cost per unit time is E[C]/E[T].

29. (a) Imagine that you are paid a reward equal to W_i
on day i. Since everything starts over when a busy period ends, it follows that the reward process constitutes a renewal reward process with cycle time equal to N and with the reward during a cycle equal to W_1 + ⋯ + W_N. Thus E[W], the average reward per unit time, is E[W_1 + ⋯ + W_N]/E[N].
(b) The sum of the times in the system of all customers and the total amount of work that has been processed both start equal to 0 and both increase at the same rate. Hence, they are always equal.
(c) This follows from (b) by looking at the value of the two totals at the end of the first busy period.
(d) It is easy to see that N is a stopping time for the L_i, i ≥ 1, and so, by Wald's equation, E[Σ_{i=1}^{N} L_i] = E[L]E[N]. Thus, from (a) and (c), we obtain that E[W] = E[L].

31. P{E(t) >
x | A(t) = s} = P{0 renewals in (t, t + x] | A(t) = s}
= P{interarrival > x + s | A(t) = s}
= P{interarrival > x + s | interarrival > s}
= [1 − F(x + s)]/[1 − F(s)]

33. Let B be the amount of time the server is busy in a cycle; let X be the remaining service time of the person in service at the beginning of a cycle. Then

E[B] = E[B | X < t](1 − e^{−μt}) + E[B | X > t]e^{−μt}
= E[X | X < t](1 − e^{−μt}) + (t + 1/(λ + μ))e^{−μt}
= E[X] − E[X | X > t]e^{−μt} + (t + 1/(λ + μ))e^{−μt}
= 1/μ − (t + 1/μ)e^{−μt} + (t + 1/(λ + μ))e^{−μt}
= (1/μ)[1 − (λ/(λ + μ))e^{−μt}]

More intuitively, writing X = B + (X − B), and noting that X − B is the additional amount of service time remaining when the cycle ends, gives

E[B] = E[X] − E[X − B] = 1/μ − (1/μ)P(X > B) = 1/μ − (1/μ)·λe^{−μt}/(λ + μ)

The long-run proportion of time that the server is busy is E[B]/(t + 1/λ).

35. (a) We
can view this as an M/G/∞ system where a satellite launching corresponds to an arrival and F is the service distribution. Hence,

P{X(t) = k} = e^{−α(t)}[α(t)]^k/k!

where α(t) = λ∫₀ᵗ (1 − F(s)) ds.
(b) By viewing the system as an alternating renewal process that is on when there is at least one satellite orbiting, we obtain

lim P{X(t) = 0} = (1/λ)/(1/λ + E[T])

where T, the on time in a cycle, is the quantity of interest. From part (a),

lim P{X(t) = 0} = e^{−λμ}

where μ = ∫₀^∞ (1 − F(s)) ds is the mean time that a satellite orbits. Hence,

e^{−λμ} = (1/λ)/(1/λ + E[T])

and so E[T] = (1 − e^{−λμ})/(λe^{−λμ}).
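Part (a) can be illustrated by simulation. The sketch below assumes, hypothetically, λ = 3 and Uniform(0, 2) orbit times, for which α(t) = λ∫₀ᵗ(1 − F(s))ds = 3 once t ≥ 2, so X(t) should be approximately Poisson(3):

```python
import math
import random

lam, t = 3.0, 5.0          # hypothetical launch rate and observation time
alpha = lam * 1.0          # lam * integral of (1 - F(s)): for Uniform(0,2) the integral is 1
rng = random.Random(7)

def orbiting_at_t():
    # launches form a Poisson process (exponential gaps); lifetimes are Uniform(0, 2)
    count, s = 0, rng.expovariate(lam)
    while s < t:
        if s + rng.uniform(0.0, 2.0) > t:   # launched at s, still in orbit at time t
            count += 1
        s += rng.expovariate(lam)
    return count

samples = [orbiting_at_t() for _ in range(20000)]
mean_x = sum(samples) / len(samples)
p_zero = samples.count(0) / len(samples)
```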
37. (a) This is an alternating renewal process, with the mean off time obtained by conditioning on which machine fails to cause the off period:

E[off] = ∑(i=1 to 3) E[off | i fails]P{i fails}
= (1/5)·λ1/(λ1 + λ2 + λ3) + 2·λ2/(λ1 + λ2 + λ3) + (3/2)·λ3/(λ1 + λ2 + λ3)

As the on time in a cycle is exponential with rate equal to λ1 + λ2 + λ3, we obtain that p, the proportion of time that the system is working, is

p = [1/(λ1 + λ2 + λ3)]/E[C]

where

E[C] = E[cycle time] = 1/(λ1 + λ2 + λ3) + E[off]

(b) Think of the system as a renewal reward process by supposing that we earn 1 per unit time that machine 1 is being repaired. Then r1, the proportion of time that machine 1 is being repaired, is

r1 = [(1/5)·λ1/(λ1 + λ2 + λ3)]/E[C]

(c) By assuming that we earn 1 per unit time when machine 2 is in a state of suspended animation, we see that, with s2 being the proportion of time that 2 is in a state of suspended animation,

s2 = [(1/5)·λ1/(λ1 + λ2 + λ3) + (3/2)·λ3/(λ1 + λ2 + λ3)]/E[C]

39.
Let B be the length of a busy period. With S equal to the service time of the machine whose failure initiated the busy period, and T equal to the remaining life of the other machine at that moment, we obtain

E[B] = ∫ E[B | S = s]g(s) ds

Now,

E[B | S = s] = E[B | S = s, T ≤ s](1 − e^{−λs}) + E[B | S = s, T > s]e^{−λs}
= (s + E[B])(1 − e^{−λs}) + se^{−λs}
= s + E[B](1 − e^{−λs})

Substituting back gives

E[B] = E[S] + E[B]E[1 − e^{−λS}]

or

E[B] = E[S]/E[e^{−λS}]

Hence, with E[idle] = 1/(2λ), the long-run proportion of idle time is

[1/(2λ)]/[1/(2λ) + E[B]]

41. ∫₀¹ (1 −
F(x)) dx = ∫₀¹ (2 − x)/2 dx = 3/4 in part (i); ∫₀¹ e^{−x} dx = 1 − e^{−1} in part (ii).

43. Since half the interarrival times will be exponential with mean 1 and half will be exponential with mean 2, it would seem that, because the exponentials with mean 2 will last, on average, twice as long,

1 − Fe(x) = (2/3)e^{−x/2} + (1/3)e^{−x}

With μ = (1)(1/2) + (2)(1/2) = 3/2 equal to the mean interarrival time,

1 − Fe(x) = (1/μ)∫ₓ^∞ (1 − F(y)) dy

and the earlier formula is seen to be valid.
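The claimed agreement can be checked by straightforward numerical integration (trapezoid rule; no special libraries assumed):

```python
import math

Fbar = lambda y: 0.5 * math.exp(-y) + 0.5 * math.exp(-y / 2)   # 1 - F(y), the mixture tail
mu = 1.5                                                       # mean interarrival time

def Fe_bar(x, steps=20000):
    # 1 - Fe(x) = 1 - (1/mu) * integral_0^x Fbar(y) dy, via the trapezoid rule
    h = x / steps
    integral = h * ((Fbar(0.0) + Fbar(x)) / 2 + sum(Fbar(i * h) for i in range(1, steps)))
    return 1.0 - integral / mu

guessed = lambda x: (2 / 3) * math.exp(-x / 2) + (1 / 3) * math.exp(-x)
```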
45. The limiting probabilities for the Markov chain are given as the solution of

r1 = r2(1/2) + r3
r2 = r1
r1 + r2 + r3 = 1

or r1 = r2 = 2/5, r3 = 1/5.
(a) r1 = 2/5
(b) Pi = riμi/∑j rjμj, and so P1 = 2/9, P2 = 4/9, P3 = 3/9.

47. (a) By conditioning on the next state, we obtain the following:

μi = E[time in i] = ∑j E[time in i | next state is j]Pij = ∑j tijPij

(b) Use the hint. Then,

E[reward per cycle] = E[reward per cycle | next state is j]Pij = tijPij

Also,

E[time of cycle] = E[time between visits to i]

Now, if we had supposed a reward of 1 per unit time whenever the process was in state i and 0 otherwise, then, using the same cycle times as above, we would have

Pi = E[reward in cycle]/E[time of cycle] = μi/E[time of cycle]

Hence,

E[time of cycle] = μi/Pi

and so

average reward per unit time = tijPijPi/μi

The above establishes the result, since the average reward per unit time is equal to the proportion of time the process is in i and will next enter j.

49. Think of each
interarrival time as consisting of n independent phases, each of which is exponentially distributed with rate λ, and consider the semi-Markov process whose state at any time is the phase of the present interarrival time. Hence, this semi-Markov process goes from state 1 to 2 to 3 to ⋯ to n to 1, and so on. Also, the time spent in each state has the same distribution. Thus, clearly, the limiting probabilities of this semi-Markov chain are Pi = 1/n, i = 1, …, n. To compute lim P{Y(t) < x}, we condition on the phase at time t and note that if it is n − i + 1, which will be the case with probability 1/n, then the time until a renewal occurs will be the sum of i exponential phases, which will thus have a gamma distribution with parameters i and λ.
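This mixture-of-gammas limit can be cross-checked against the equilibrium distribution Fe of the Erlang(n, λ) interarrival time, which it must equal; the values n = 4, λ = 2 below are arbitrary:

```python
import math

n, lam = 4, 2.0   # illustrative phase count and phase rate

def gamma_cdf(x, i):
    # P{sum of i independent Exp(lam) phases <= x}
    return 1 - sum(math.exp(-lam * x) * (lam * x) ** k / math.factorial(k) for k in range(i))

def limit_Y(x):
    # phase n-i+1 is seen with probability 1/n, leaving Gamma(i, lam) until renewal
    return sum(gamma_cdf(x, i) for i in range(1, n + 1)) / n

def Fe(x, steps=20000):
    # equilibrium distribution of the Erlang(n, lam) interarrival time, for comparison
    Fbar = lambda y: sum(math.exp(-lam * y) * (lam * y) ** k / math.factorial(k) for k in range(n))
    h = x / steps
    integral = h * ((Fbar(0.0) + Fbar(x)) / 2 + sum(Fbar(j * h) for j in range(1, steps)))
    return integral / (n / lam)
```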
51. It is an example of the inspection paradox. Because every tourist spends the same time in departing the country, those questioned at departure constitute a random sample of all visiting tourists. On the other hand, if the questioning is of randomly chosen hotel guests then, because longer-staying guests are more likely to be selected, it follows that the average time of the ones selected will be larger than the average of all tourists. The data that the average of those selected from hotels was approximately twice as large as from those selected at departure are consistent with the possibility that the time spent in the country by a tourist is exponential with a mean approximately equal to 9.

55. E[T(1)] = (.24)^{−2} + (.4)^{−1} =
19.8611, E[T(2)] = 24.375, E[T(1,2)] = 21.875, E[T(2,1)] = 17.3611. The solution of the equations

19.861 = E[M] + 17.361P(2)
24.375 = E[M] + 21.875P(1)
1 = P(1) + P(2)

gives the results P(2) ≈ .4425, E[M] ≈ 12.18.

57. Writing h(x) = P{∑(i=1 to T) Xi > x},

h(x) = P{∑(i=1 to T) Xi > x | T = 0}(1 − ρ) + P{∑(i=1 to T) Xi > x | T > 0}ρ
= ρP{∑(i=1 to T) Xi > x | T > 0}
= λ∫₀^∞ P{∑(i=1 to T) Xi > x | T > 0, X1 = y}(1 − F(y)) dy
= λ∫₀ˣ P{∑(i=1 to T) Xi > x | T > 0, X1 = y}(1 − F(y)) dy + λ∫ₓ^∞ (1 − F(y)) dy
= λ∫₀ˣ h(x − y)(1 − F(y)) dy + λ∫ₓ^∞ (1 − F(y)) dy
= h(0) + λ∫₀ˣ h(x − y)(1 − F(y)) dy − λ∫₀ˣ (1 − F(y)) dy

where the final equality used that h(0) = ρ = λ∫₀^∞ (1 − F(y)) dy.

Chapter 8

1. (a) E[number of arrivals] = E[E[number of arrivals | service period is S]] = E[λS] = λ/μ
(b) P{0 arrivals} = E[P{0 arrivals | service period is S}] = E[P{N(S) = 0}] = E[e^{−λS}] = ∫₀^∞ e^{−λs}μe^{−μs} ds = μ/(λ + μ)
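Both parts of the preceding solution can be sanity-checked by simulation; λ and μ below are arbitrary illustrative rates:

```python
import random

lam, mu = 2.0, 3.0                 # hypothetical arrival and service rates
rng = random.Random(11)

def arrivals_during_service():
    # draw a service time S ~ Exp(mu), then count Poisson(lam) arrivals in [0, S)
    S = rng.expovariate(mu)
    n, t = 0, rng.expovariate(lam)
    while t < S:
        n += 1
        t += rng.expovariate(lam)
    return n

samples = [arrivals_during_service() for _ in range(40000)]
mean_n = sum(samples) / len(samples)   # should be near lam/mu = 2/3
p_zero = samples.count(0) / len(samples)   # should be near mu/(lam + mu) = 0.6
```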
3. Let CM = Mary's average cost/hour and CA = Alice's average cost/hour. Then CM = $3 + $1 × (average number of customers in queue when Mary works), and CA = $C + $1 × (average number of customers in queue when Alice works).

The arrival stream has parameter λ = 10, and there are two service parameters, one for Mary and one for Alice: μM = 20, μA = 30. Set

LM = average number of customers in queue when Mary works,
LA = average number of customers in queue when Alice works.

Then, using Equation (3.2),

LM = 10/(20 − 10) = 1
LA = 10/(30 − 10) = 1/2

So CM = $3 + $1/customer × LM customers = $3 + $1 = $4/hour. Also, CA = $C + $1/customer × LA customers = $C + $1 × (1/2) = $C + 1/2 per hour.
(b) We can restate the problem this way: if CA = CM, solve for C:

4 = C + 1/2 ⟹ C = $3.50/hour

i.e., $3.50/hour is the most the employer should be willing to pay Alice to work. At a higher wage his average cost is lower with Mary working.
5. Let I equal 0 if WQ = 0 and let it equal 1 otherwise. Then

E[WQ | I = 0] = 0
E[WQ | I = 1] = (μ − λ)^{−1}
Var(WQ | I = 0) = 0
Var(WQ | I = 1) = (μ − λ)^{−2}

Hence,

E[Var(WQ | I)] = (λ/μ)(μ − λ)^{−2}
Var(E[WQ | I]) = (λ/μ)(1 − λ/μ)(μ − λ)^{−2}

Consequently, by the conditional variance formula,

Var(WQ) = (λ/μ)(μ − λ)^{−2} + (λ/μ)(1 − λ/μ)(μ − λ)^{−2}

7. To compute W for the M/M/2, set up balance equations as

λp0 = μp1 (each server has rate μ)
(λ + μ)p1 = λp0 + 2μp2
(λ + 2μ)pn = λp(n−1) + 2μp(n+1), n ≥ 2

These have solutions Pn = (ρ^n/2^{n−1})P0, where ρ = λ/μ. The boundary condition ∑(n=0 to ∞) Pn = 1 implies

P0 = (1 − ρ/2)/(1 + ρ/2) = (2μ − λ)/(2μ + λ)

Now we have Pn, so we can compute L, and hence W from L = λW:

L = ∑(n=0 to ∞) nPn = 2P0 ∑ n(ρ/2)^n = 2·[(2μ − λ)/(2μ + λ)]·(ρ/2)/(1 − ρ/2)² = 4μλ/[(2μ + λ)(2μ − λ)]

From L = λW we have

W = W(M/M/2) = 4μ/[(2μ + λ)(2μ − λ)]

The M/M/1 queue with service rate 2μ has W(M/M/1) = 1/(2μ − λ) from Equation (3.3). We assume that in the M/M/1 queue 2μ > λ, so that the queue is stable. But then 4μ > 2μ + λ, or 4μ/(2μ + λ) > 1, which implies W(M/M/2) > W(M/M/1). The intuitive explanation is that if one finds the queue empty in the M/M/2 case, it would do no good to have two servers.
One would be better off with one faster server. Now let

W1Q = WQ(M/M/1)
W2Q = WQ(M/M/2)

Then

W1Q = W(M/M/1) − 1/(2μ)
W2Q = W(M/M/2) − 1/μ

So, using Equation (3.3),

W1Q = λ/[2μ(2μ − λ)]
W2Q = λ²/[μ(2μ − λ)(2μ + λ)]

Then

W1Q > W2Q ⟺ 1/(2μ) > λ/[μ(2μ + λ)] ⟺ λ < 2μ

Since we assume λ < 2μ for stability in the M/M/1, W2Q < W1Q whenever this comparison is possible, i.e., whenever λ < 2μ.
9. Take the state to be the number of customers at server 1. The balance equations are

P0 = P1
2Pj = P(j+1) + P(j−1), 1 ≤ j < n
Pn = P(n−1)
1 = ∑(j=0 to n) Pj

It is easy to check that the solution to these equations is that all the Pj are equal, so Pj = 1/(n + 1), j = 0, …, n.

11. (a) λP0
= αμP1
(λ + αμ)Pn = λP(n−1) + αμP(n+1), n ≥ 1

These are exactly the same equations as in the M/M/1 with αμ replacing μ. Hence,

Pn = (λ/(αμ))^n(1 − λ/(αμ)), n ≥ 0

and we need the condition λ < αμ.
(b) If T is the waiting time until the customer first enters service, then conditioning on the number present when he arrives yields

E[T] = ∑n E[T | n present]Pn = ∑n [n/(αμ)]Pn = L/(αμ)

Since L = ∑n nPn, and the Pn are the same as in the M/M/1 with λ and αμ, we have that L = λ/(αμ − λ), and so E[T] = λ/[αμ(αμ − λ)].
(c) P{enters service exactly n times} = (1 − α)^{n−1}α
(d) This is (expected number of services) × (mean service time) = 1/(αμ)
(e) The distribution is easily seen to be memoryless. Hence, it is exponential with rate αμ.

13. Let the state be the idle
server. The balance equations are (Rate Leave = Rate Enter):

(μ2 + μ3)P1 = [μ1/(μ1 + μ2)]P3 + [μ1/(μ1 + μ3)]P2
(μ1 + μ3)P2 = [μ2/(μ2 + μ3)]P1 + [μ2/(μ2 + μ1)]P3
P1 + P2 + P3 = 1

These are to be solved, and the quantity Pi represents the proportion of time that server i is idle.

15. There are four states
= 0, 1A, 1B, 2. Balance equations are

2P0 = 2P1B
4P1A = 2P0 + 2P2
4P1B = 4P1A + 4P2
6P2 = 2P1B
P0 + P1A + P1B + P2 = 1

⟹ P0 = 3/9, P1A = 2/9, P1B = 3/9, P2 = 1/9

(a) P0 + P1B = 2/3
(b) By conditioning upon whether the state was 0 or 1B when he entered, we get that the desired probability is given by

1/2 + (1/2)(2/6) = 4/6

(c) P1A + P1B + 2P2 = 7/9
(d) Again, condition on the state when he enters to obtain

(1/2)[1/4 + 1/2] + (1/2)[1/4 + (2/6)(1/2)] = 7/12

This could also have been obtained from (a) and (c) by the formula W = L/λa. That is, W = (7/9)/[2(2/3)] = 7/12.

17. The
state space can be taken to consist of states (0, 0), (0, 1), (1, 0), (1, 1), where the ith component of the state refers to the number of customers at server i, i = 1, 2. The balance equations are

2P(0,0) = 6P(0,1)
8P(0,1) = 4P(1,0) + 4P(1,1)
6P(1,0) = 2P(0,0) + 6P(1,1)
10P(1,1) = 2P(0,1) + 2P(1,0)
1 = P(0,0) + P(0,1) + P(1,0) + P(1,1)

Solving these equations gives P(0,0) = 1/2, P(0,1) = 1/6, P(1,0) = 1/4, P(1,1) = 1/12.
(a) P(1,1) = 1/12
(b) W = L/λa = [P(0,1) + P(1,0) + 2P(1,1)]/[2(1 − P(1,1))] = 7/22
(c) [P(0,0) + P(0,1)]/(1 − P(1,1)) = 8/11
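The stated probabilities can be reproduced by solving the four equations exactly (Gauss–Jordan elimination with rational arithmetic; a generic sketch, not code from the text):

```python
from fractions import Fraction as F

# Three balance equations plus normalization, in the order P00, P01, P10, P11.
A = [[F(2), F(-6), F(0), F(0)],
     [F(0), F(8), F(-4), F(-4)],
     [F(-2), F(0), F(6), F(-6)],
     [F(1), F(1), F(1), F(1)]]
b = [F(0), F(0), F(0), F(1)]

n = 4
for col in range(n):
    piv = next(r for r in range(col, n) if A[r][col] != 0)   # find a usable pivot row
    A[col], A[piv] = A[piv], A[col]
    b[col], b[piv] = b[piv], b[col]
    for r in range(n):                                       # clear the column elsewhere
        if r != col and A[r][col] != 0:
            f = A[r][col] / A[col][col]
            A[r] = [a - f * c for a, c in zip(A[r], A[col])]
            b[r] = b[r] - f * b[col]

P = [b[i] / A[i][i] for i in range(n)]   # [P00, P01, P10, P11]
```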
19. (a) Say that the state is (n, 1) whenever it is a good period and there are n in the system, and say that it is (n, 2) whenever it is a bad period and there are n in the system, n = 0, 1.
(b) (λ1 + α1)P(0,1) = μP(1,1) + α2P(0,2)
(λ2 + α2)P(0,2) = μP(1,2) + α1P(0,1)
(μ + α1)P(1,1) = λ1P(0,1) + α2P(1,2)
(μ + α2)P(1,2) = λ2P(0,2) + α1P(1,1)
P(0,1) + P(0,2) + P(1,1) + P(1,2) = 1
(c) P(0,1) + P(0,2)
(d) λ1P(0,1) + λ2P(0,2)

21. (a)
μ1P10
(b) λ2(P0 + P10)
(c) μ1P10/[μ1P10 + λ2(P0 + P10)]
(d) This is equal to the fraction of server 2's customers that are type 1 multiplied by the proportion of time server 2 is busy. (This is true since the amount of time server 2 spends with a customer does not depend on which type of customer it is.) By (c), the answer is thus

(P01 + P11)μ1P10/[μ1P10 + λ2(P0 + P10)]

23. (a) The states are n, n ≥ 0, and b.
State n means there are n in the system, and state b means that a breakdown is in progress.
(b) βPb = a(1 − P0)
λP0 = μP1 + βPb
(λ + μ + a)Pn = λP(n−1) + μP(n+1), n ≥ 1
(c) W = L/λa = ∑(n=1 to ∞) nPn/[λ(1 − Pb)]
(d) Since the rate at which services are completed is μ(1 − P0 − Pb), it follows that the proportion of customers that complete service is

μ(1 − P0 − Pb)/λa = μ(1 − P0 − Pb)/[λ(1 − Pb)]

An equivalent answer is obtained by conditioning on the state as seen by an arrival. This gives the solution

∑(n=0 to ∞) Pn[μ/(μ + a)]^{n+1}

where the above uses that the probability that n + 1 services of present customers occur before a breakdown is [μ/(μ + a)]^{n+1}.
(e) Pb

25. (a) λP0 = μAPA + μBPB
(λ + μA)PA = aλP0 + μBP2
(λ + μB)PB = (1 − a)λP0 + μAP2
(λ + μA + μB)Pn = λP(n−1) + (μA + μB)P(n+1), n ≥ 2

where P1 = PA + PB.
(b) L = PA + PB + ∑(n=2 to ∞) nPn. Average number of idle servers = 2P0 + PA + PB.
(c) P0 + PB + [μA/(μA + μB)]∑(n=2 to ∞) Pn

27. (a) The
special customer's arrival rate is not simply θ, because we must take into account his service time. In fact, the mean time between his arrivals will be 1/θ + 1/μ1. Hence, the arrival rate is (1/θ + 1/μ1)^{−1}.
(b) Clearly we need to keep track of whether the special customer is in service. For n ≥ 1, set

Pn = P{n customers in system, regular customer in service},
PSn = P{n customers in system, special customer in service}, and
P0 = P{0 customers in system}.

(λ + θ)P0 = μP1 + μ1PS1
(λ + θ + μ)Pn = λP(n−1) + μP(n+1) + μ1PS(n+1), n ≥ 1
(λ + μ1)PSn = θP(n−1) + λPS(n−1), n ≥ 1 (where PS0 ≡ P0)

(c) Since service is memoryless, once a customer resumes service it is as if his service has started anew. Once he begins a particular service, he will complete it if and only if the next arrival of the special customer is after his service. The probability of this is P{Service …

… 2, then serving 1's first minimizes the average wait. But the same argument works if c1μ1 > c2μ2, i.e., E[S1]/c1 < E[S2]/c2.

45. By regarding any breakdowns that occur during a
during aservice as being part of that service, we see thatthis is
an M/G/1 model. We need to calculate therst two moments of a
service time. Now the timeof a service is the time T until
something happens(either a service completion or a breakdown)
plusany additional time A. Thus,E[S] =E[T + A]=E[T] + E[A]To
compute E[A] we condition upon whether thehappening is a service or
a breakdown. This givesE[A] =E[A|service] + + E[A|breakdown] +
=E[A|breakdown] + =(1/ + E[S]) + Since, E[T] = 1/( + ) we
obtainE[S] = 1 + + (1/ + E[S]) + orE[S] = 1/ + /()We also need
E[S2], which is obtained as follows.E[S2] =E[(T + A)2]=E[T2] +
2E[AT] + E[A2]=E[T2] + 2E[A]E[T] + E[A2]The independence of A and T
follows becausethe time of the rst happening is independent
ofwhether the happening was a service or a break-down. Now,E[A2]
=E[A2|breakdown] + = + E[(down time + S)2]= + _E[down2] +
2E[down]E[S] + E[S2]_= + _ 22 + 2_1 + _+ E[S2]_Hence,E[S2] = 2( +
)2 + 2_ ( + )+ + _1 + __+ + _ 22 + 2_1 + _+ E[S2]_Now solve for
E[S2]. The desired answer isWQ = E[S2]2(1 E[S])In the above, Sis
the additional service neededafter the breakdown is over. Shas the
same dis-tribution as S. The above also uses the fact thatthe
expected square of an exponential is twice thesquare of its
mean.Another way of calculating the moments of S is touse the
representationS =Ni=1(Ti + Bi) + TN+1where N is the number of
breakdowns while a cus-tomer is in service, Ti is the time starting
when ser-vice commences for the ithtime until a happeningoccurs,
and Bi is the length of the ithbreakdown.We now use the fact that,
given N, all of the ran-dom variables in the representation are
indepen-dent exponentials with the Ti having rate + and the Bi
having rate . This yieldsE[S|N] =(N + 1)/( + ) + N/Var(S|N) =(N +
1)/( + )2+ N/2Therefore, since 1 + N is geometric with mean( + )/
(and variance ( + )/2) we obtainE[S] = 1/ + /()and, using the
conditional variance formula,Var(S) =[1/( + ) + 1/]2( + )/2+ 1/[( +
)] + /2)47. For k = 1, Equation (8.1) givesP0= 11 + E(S) = ()() +
E[S]]. Similarly,

P1 = λE[S]/(1 + λE[S]) = E[S]/[(1/λ) + E[S]]

One can think of the process as an alternating renewal process. Since arrivals are Poisson, the time until the next arrival is still exponential with parameter λ.

[Diagram: the process alternates between S (in service) and A (awaiting an arrival) periods, with transitions at end-of-service and arrival epochs.]

The basic result of alternating renewal processes is that the limiting probabilities are given by

P{being in state S} = E[S]/(E[A] + E[S])
P{being in state A} = E[A]/(E[A] + E[S])

These are exactly the Erlang probabilities given above, since E[A] = 1/λ. Note that this uses Poisson arrivals in an essential way, viz., to know that the distribution of the time until the next arrival after a service is still exponential with parameter λ.

49. P3 = [(λE[S])³/3!] / ∑(j=0 to 3)(λE[S])^j/j!, with λ = 2, E[S] = 1, giving P3 = 8/38.

51. Note that when all servers are busy, the departures are exponential with rate kμ. Now see Problem 26.

53. 1/μF < k/μG, where μF and μG are the respective means of F and G.

Chapter 9

1. If xi = 0, φ(x)
= φ(0i, x). If xi = 1, φ(x) = φ(1i, x).

3. (a) If φ is series, then φ(x) = mini xi, and so φD(x) = 1 − mini(1 − xi) = maxi xi, and vice versa.
(b) φD,D(x) = 1 − φD(1 − x) = 1 − [1 − φ(1 − (1 − x))] = φ(x)
(c) An n − k + 1 of n.
(d) Say {1, 2, …, r} is a minimal path set. Then, with the first r components equal to 1, φ(1, 1, …, 1, 0, 0, …, 0) = 1, and so

φD(0, 0, …, 0, 1, 1, …, 1) = 1 − φ(1, 1, …, 1, 0, 0, …, 0) = 0

implying that {1, 2, …, r} is a cut set. We can easily show it to be minimal. For instance, with only the first r − 1 components equal to 0,

φD(0, 0, …, 0, 1, 1, …, 1) = 1 − φ(1, 1, …, 1, 0, 0, …, 0) = 1

since φ(1, 1, …, 1, 0, 0, …, 0) = 0, because {1, 2, …, r − 1} is not a path set.

5. (a) Minimal path sets are
{1, 8}, {1, 7, 9}, {1, 3, 4, 7, 8}, {1, 3, 4, 9}, {1, 3, 5, 6, 9}, {1, 3, 5, 6, 7, 8}, {2, 5, 6, 9}, {2, 5, 6, 7, 8}, {2, 4, 9}, {2, 4, 7, 8}, {2, 3, 7, 9}, {2, 3, 8}.
Minimal cut sets are
{1, 2}, {2, 3, 7, 8}, {1, 3, 4, 5}, {1, 3, 4, 6}, {1, 3, 7, 9}, {4, 5, 7, 8}, {4, 6, 7, 8}, {8, 9}.

7.
{1, 4, 5}, {3}, {2, 5}.

9. (a) A component is irrelevant if its functioning or not functioning can never make a difference as to whether or not the system functions.
(b) Use the representation (2.1.1).
(c) Use the representation (2.1.2).

11. r(p) = P{either x1x3 = 1 or x2x4 = 1}·P{either of 5 or 6 works}
= (p1p3 + p2p4 − p1p3p2p4)(p5 + p6 − p5p6)
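The product form just derived can be confirmed by brute force over all 2⁶ component states; the pi values below are arbitrary illustrative reliabilities:

```python
import math
from itertools import product

def phi(x):
    # structure of Problem 11: ((1 and 3) or (2 and 4)) in series with (5 or 6)
    return int(((x[0] and x[2]) or (x[1] and x[3])) and (x[4] or x[5]))

p = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]   # illustrative component reliabilities

brute = sum(
    phi(x) * math.prod(pi if xi else 1 - pi for pi, xi in zip(p, x))
    for x in product((0, 1), repeat=6)
)
formula = (p[0] * p[2] + p[1] * p[3] - p[0] * p[2] * p[1] * p[3]) * (p[4] + p[5] - p[4] * p[5])
```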
Xi)(0i, X)noting the independence of Xi and (1i, X) and of(0i,
X).15. (a) 732 r_12_ 1 _78_3= 169512The exact value is r(1/2) =
7/32, whichagrees withthe minimal cut lower boundsincethe minimal
cut sets {1}, {5}, {2, 3, 4} do notoverlap.17. E[N2] = E[N2|N >
0]P{N > 0} ≥ (E[N | N > 0])²P{N > 0}

since E[X²] ≥ (E[X])². Thus,

E[N²]P{N > 0} ≥ (E[N | N > 0]P{N > 0})² = (E[N])²

Let N denote the number of minimal path sets having all of their components functioning. Then r(p) = P{N > 0}. Similarly, if we define N as the number of minimal cut sets having all of their components failed, then 1 − r(p) = P{N > 0}. In both cases we can compute expressions for E[N] and E[N²] by writing N as the sum of indicator (i.e., Bernoulli) random variables. Then we can use the inequality to derive bounds on r(p).

19. X(i) is the system life of an n − i + 1 of n system, each component having the life distribution F. Hence, the result follows from Example 5e.

21. (a) (i), (ii), (iv) — (iv) because it is two-of-three.
(b) (i) because it is series; (ii) because it can be thought of as being a series arrangement of 1 and the parallel system of 2 and 3, which, as F2 = F3, is IFR.
(c) (i) because it is series.

23. (a) F̄(t) = ∏(i=1 to n) F̄i(t), and so

λF(t) = −(d/dt)F̄(t)/F̄(t) = ∑(j=1 to n) fj(t)∏(i≠j)F̄i(t) / ∏(i=1 to n)F̄i(t) = ∑(j=1 to n) fj(t)/F̄j(t) = ∑(j=1 to n) λj(t)

(b) F̄t(a) = P{additional life of t-year-old > a} = ∏(i=1 to n) F̄i(t + a)/F̄i(t)

where F̄i is the survival function of the life distribution of component i. The point being that, as the system is series, knowing that it is alive at time t is equivalent to knowing that all components are alive at t.

25. For x ≥ ξ,

1 − p = 1 − F(ξ) = 1 − F(x(ξ/x)) ≥ [1 − F(x)]^{ξ/x}

since IFRA. Hence,

1 − F(x) ≤ (1 − p)^{x/ξ} = e^{−θx}

For x ≤ ξ,

1 − F(x) = 1 − F(ξ(x/ξ)) ≥ [1 − F(ξ)]^{x/ξ}

since IFRA. Hence,

1 − F(x) ≥ (1 − p)^{x/ξ} = e^{−θx}

27. If p > p0, then p = p0^α for some α ∈ (0, 1). Hence,

r(p) = r(p0^α) ≥ [r(p0)]^α = p0^α = p

If p < p0, then p0 = p^α for some α ∈ (0, 1). Hence,

p^α = p0 = r(p0) = r(p^α) ≥ [r(p)]^α

29. Let X denote the time until the first failure and let Y denote the time between the first and second failures. Hence, the desired result is

E[X] + E[Y] = 1/(λ1 + λ2) + E[Y]

Now,

E[Y] = E[Y | component 1 fails first]·λ1/(λ1 + λ2) + E[Y | component 2 fails first]·λ2/(λ1 + λ2)
= (1/λ2)·λ1/(λ1 + λ2) + (1/λ1)·λ2/(λ1 + λ2)

31. Use the remark following Equation (6.3).

33. The exact
value can be obtained by conditioning on the ordering of the random variables. Let M denote the maximum; then, with Ai,j,k being the event that Xi < Xj < Xk, we have

E[M] = ∑ E[M | Ai,j,k]P(Ai,j,k)

where the preceding sum is over all 6 possible permutations of 1, 2, 3. This can now be evaluated by using

P(Ai,j,k) = [λi/(λi + λj + λk)]·[λj/(λj + λk)]
E[M | Ai,j,k] = 1/(λi + λj + λk) + 1/(λj + λk) + 1/λk

35. (a) It follows when i = 1 since

0 = (1 − 1)^n = 1 − (n choose 1) + (n choose 2) − ⋯ ± (n choose n)

So assume it true for i and consider i + 1. We must show that

(n − 1 choose i) = (n choose i + 1) − (n choose i + 2) + ⋯ ± (n choose n)

which, using the induction hypothesis, is equivalent to

(n − 1 choose i) = (n choose i) − (n − 1 choose i − 1)

which is easily seen to be true.
(b) It is clearly true when i = n, so assume it for i. We must show that

(n − 1 choose i − 2) = (n choose i − 1) − (n − 1 choose i − 1) + ⋯ ± (n choose n)

which, using the induction hypothesis, reduces to

(n − 1 choose i − 2) = (n choose i − 1) − (n − 1 choose i − 1)

which is true.

Chapter 10

1. X(s) + X(t) = 2X(s) + X(t) − X(s).
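As a numerical check of this decomposition (standard Brownian motion; the particular s, t, seed, and sample size below are arbitrary):

```python
import random

rng = random.Random(3)
s, t = 1.0, 2.5

vals = []
for _ in range(40000):
    Xs = Xs_sample = rng.gauss(0.0, s ** 0.5)       # X(s) ~ N(0, s)
    Xt = Xs + rng.gauss(0.0, (t - s) ** 0.5)        # add the independent increment X(t) - X(s)
    vals.append(Xs + Xt)

mean = sum(vals) / len(vals)
var = sum(v * v for v in vals) / len(vals) - mean ** 2
# the decomposition gives Var(X(s) + X(t)) = 4s + (t - s) = 3s + t
```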
Now 2X(s) is normal with mean 0 and variance 4s, and X(t) − X(s) is normal with mean 0 and variance t − s. As X(s) and X(t) − X(s) are independent, it follows that X(s) + X(t) is normal with mean 0 and v