7/21/2019 Solutions to Axler Linear Algebra Done Right PDF
Solutions to Axler, Linear Algebra Done Right, 2nd Ed.

Edvard Fagerholm
edvard.fagerholm@{helsinki.|gmail.com}

Beware of errors. I read the book and solved the exercises during spring break (one week), so the problems were solved in a hurry. However, if you do find better or interesting solutions to the problems, I'd still like to hear about them. Also, please don't put this on the Internet to encourage copying homework solutions...

1 Vector Spaces

1. Assuming that C is a field, write z = a + bi. Then we have 1/z = z̄/(zz̄) = z̄/|z|². Plugging in the numbers we get 1/(a + bi) = a/(a² + b²) − bi/(a² + b²) = c + di. A straightforward calculation of (c + di)(a + bi) = 1 shows that this is indeed an inverse.

2. Just calculate ((−1 + √3 i)/2)³.

3. We have v + (−v) = 0, so by the uniqueness of the additive inverse (prop. 1.3), −(−v) = v.

4. Choose a ≠ 0 and v ≠ 0. Then assuming av = 0 we get v = a⁻¹(av) = a⁻¹0 = 0. Contradiction.

5. Denote the set in question by A in each part.

(a) Let v, w ∈ A, v = (x1, x2, x3), w = (y1, y2, y3). Then x1 + 2x2 + 3x3 = 0 = y1 + 2y2 + 3y3, so 0 = (x1 + y1) +
2(x2 + y2) + 3(x3 + y3), so v + w ∈ A. Similarly 0 = a·0 = ax1 + 2ax2 + 3ax3, so av ∈ A. Thus A is a subspace.

(b) This is not a subspace, as 0 ∉ A.

(c) We have that (1, 1, 0) ∈ A and (0, 0, 1) ∈ A, but (1, 1, 0) + (0, 0, 1) = (1, 1, 1) ∉ A, so A is not a subspace.

(d) Let (x1, x2, x3), (y1, y2, y3) ∈ A. If x1 = 5x3
and y1 = 5y3, then ax1 = 5ax3, so a(x1, x2, x3) ∈ A. Similarly x1 + y1 = 5(x3 + y3), so that (x1, x2, x3) + (y1, y2, y3) ∈ A. Thus A is a subspace.
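A quick numeric sanity check of exercises 1 and 2 (mine, not part of the proofs; plain Python, no libraries needed):

```python
# Exercise 1: for z = a + bi != 0 the inverse is c + di with
# c = a/(a^2 + b^2) and d = -b/(a^2 + b^2).
def inverse(a, b):
    d = a * a + b * b
    return (a / d, -b / d)

a, b = 3.0, 4.0
c, d = inverse(a, b)
product = complex(c, d) * complex(a, b)   # should be 1 + 0i

# Exercise 2: (-1 + sqrt(3) i)/2 is a cube root of 1.
w = complex(-1.0, 3.0 ** 0.5) / 2.0
cube = w ** 3                             # should be 1 + 0i
```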
6. Set U = Z².

7. The set {(x, x) ∈ R² | x ∈ R} ∪ {(x, −x) ∈ R² | x ∈ R} is closed under scalar multiplication but is trivially not a subspace ((x, x) + (x, −x) = (2x, 0) doesn't belong to it unless x = 0).

8. Let {Vi} be a collection of subspaces of V. Set U = ∩i Vi. Then if u, v ∈ U, we have u, v ∈ Vi for all i because Vi
is a subspace. Thus au ∈ Vi for all i, so that au ∈ U. Similarly u + v ∈ Vi for all i, so u + v ∈ U.

9. Let U, W ⊂ V be subspaces. If U ⊂ W or W ⊂ U, then U ∪ W is clearly a subspace. Assume then that U ⊄ W and W ⊄ U. Then we can choose u ∈ U \ W and w ∈ W \ U. Assuming that U ∪ W is a subspace, we have u + w ∈ U ∪ W. Assuming that u + w ∈ U, we get w = (u + w) − u ∈ U. Contradiction. Similarly for u + w ∈ W. Thus U ∪ W is not a subspace.

10. Clearly U = U + U, as U is closed under addition.

11. Yes and yes. This follows directly from commutativity and associativity of vector addition.

12. The zero subspace, {0}, is clearly an additive identity. Assuming that we have inverses, the whole space V should have an inverse U such that U + V = {0}. Since V + U = V, this is clearly impossible unless V is the trivial vector space.

13. Let W = R². Then for any two subspaces U1, U2 of W we have U1 + W = U2 + W, so the statement is clearly false in general.

14. Let W = {p ∈ P(F) | p = a0 + a1x + ... + anx^n with a2 = a5 = 0}.

15. Let V = R². Let W = {(x, 0) ∈ R² | x ∈ R}. Set U1 = {(x, x) ∈ R² | x ∈ R}, U2 = {(x, −x) ∈ R² | x ∈ R}. Then it's easy to see that U
1 + W = U2 + W = R², but U1 ≠ U2, so the statement is false.

2 Finite-Dimensional Vector Spaces

1. Let un = vn and ui = vi − vi+1, i = 1, ..., n−1. Now we see that vi = ui + ui+1 + ... + un. Thus vi ∈ span(u1, ..., un), so V = span(v1, ..., vn) ⊂ span(u1, ..., un).

2. From the previous exercise we know that the span is V. As (v1, ..., vn) is a linearly independent spanning list of vectors, we know that dim V = n. The claim now follows from proposition 2.16.

3. If (v1 + w, ..., vn + w) is linearly dependent, then we can write
[...] (u1, ..., un) spans V.

12. Let U = span(p0, ..., pm). As U ≠ Pm(F), we have by the previous exercise that dim U < dim Pm(F) = m + 1. As (p0, ..., pm) is a spanning list of U having length m + 1 > dim U, it is not linearly independent.

13. By theorem 2.18 and U + W = R⁸ we have 8 = dim R⁸ = dim(U + W) = dim U + dim W − dim(U ∩ W) = 4 + 4 − dim(U ∩ W). Thus dim(U ∩ W) = 0 and the claim follows.

14. Assuming that U ∩ W = {0}, we would have by theorem 2.18 that dim(U + W) = dim U + dim W − dim(U ∩ W) = 5 + 5 − 0 = 10 > 9 = dim R⁹. Contradiction.

15. Let U1 = {(x, 0) ∈ R² | x ∈ R}, U2 = {(0, x) ∈ R² | x ∈ R}, U3 = {(x, x) ∈ R² | x ∈ R}. Then U1 ∩ U2 = {0}, U1 ∩ U3 = {0} and U2 ∩ U3 = {0}. Thus the left-hand side of the equation in the problem is 2 while the right-hand side is 3. Thus we have a
counterexample.

16. Choose a basis (ui1, ..., ui,ni) for each Ui. Then the list of vectors (u11, ..., um,nm) has length dim U1 + ... + dim Um and clearly spans U1 + ... + Um, proving the claim.

17. By assumption the list of vectors (u11, ..., um,nm) from the proof of the previous exercise is linearly independent. Since V = U1 + ... + Um = span(u11, ..., um,nm), the claim follows.

3 Linear Maps

1. Any v ≠ 0 spans V. Thus, if w ∈ V, we have w = bv for some b ∈ F. Now T(v) = av for some a ∈ F, so for w = bv ∈ V we have T(w) = T(bv) = bT(v) = bav = aw. Thus T is multiplication by a scalar.

2. Define f by e.g.

f(x, y) = { x, if x = y; 0, if x ≠ y },

then clearly f satisfies the condition, but is non-linear.

3. To define a linear map it's enough to define the images of the elements of a basis. Choose a basis (u1
, ..., um) for U and extend it to a basis (u1, ..., um, v1, ..., vn) of V. Let S ∈ L(U, W) and choose some vectors w1, ..., wn ∈ W. Now define T ∈ L(V, W) by T(ui) = S(ui), i = 1, ..., m and T(vi) = wi, i = 1, ..., n. These relations define the linear map and clearly T|U = S.

4. By theorem 3.4, dim V = dim null T + dim range T. If u ∉ null T, then dim range T > 0, but as dim range T ≤ dim F = 1, we have dim range T = 1 and it follows that dim null T = dim V − 1. Now choose a basis (u1, ..., un) for null T. By our assumption (u1, ..., un, u) is linearly independent and has length dim V. Hence, it's a basis for V. This implies that

V = span(u1, ..., un, u) = span(u1, ..., un) ⊕ span(u) = null T ⊕ {au | a ∈ F}.

5. By linearity 0 = a1T(v1) + ... + anT(vn) = T(a1v1 + ... + anvn), so by injectivity a1v1 + ... + an
vn = 0, so that ai = 0 for all i. Hence (T(v1), ..., T(vn)) is linearly independent.

6. If n = 1 the claim is trivially true. Assume that the claim is true for n = k. If S1, ..., Sk+1 satisfy the assumptions, then S1 ⋯ Sk is injective by the induction hypothesis. Let T = S1 ⋯ Sk. If u ≠ 0, then by injectivity of Sk+1 we have Sk+1u ≠ 0, and by injectivity of T we have TSk+1u ≠ 0. Hence TSk+1 is injective and the claim follows.
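A small numeric illustration of exercise 6 (my addition, assuming numpy is available; for matrices, injectivity means full column rank):

```python
import numpy as np

# Two injective maps: S1 : R^2 -> R^3 and S2 : R^2 -> R^2.
S1 = np.array([[1.0, 2.0],
               [0.0, 1.0],
               [1.0, 1.0]])
S2 = np.array([[2.0, 1.0],
               [1.0, 1.0]])

# Their composition S1 S2 : R^2 -> R^3 should again be injective,
# i.e. have full column rank 2.
rank = np.linalg.matrix_rank(S1 @ S2)
```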
7. Let w ∈ W. By surjectivity of T we can find a vector v ∈ V such that T(v) = w. Writing v = a1v1 + ... + anvn, we get w = T(v) = T(a1v1 + ... + anvn) = a1T(v1) + ... + anT(vn), proving the claim.

8. Let (u1, ..., un) be a basis for null T and extend it to a basis (u1, ..., un, v1, ..., vk) of V. Let U := span(v1, ..., vk). Then by construction null T ∩ U = {0}, and for an arbitrary v ∈ V we have that v = a1u1 + ... + anun + b1v1 + ... + bk
vk, so that T(v) = b1T(v1) + ... + bkT(vk). Hence range T = T(U).

9. It's easy to see that ((5, 1, 0, 0), (0, 0, 7, 1)) is a basis for null T (see exercise 2.8). Hence dim range T = dim F⁴ − dim null T = 4 − 2 = 2. Thus range T = F², so that T is surjective.

10. It's again easy to see that ((3, 1, 0, 0, 0), (0, 0, 1, 1, 1)) is a basis of null T. Hence, we would get dim range T = dim F⁵ − dim null T = 5 − 2 = 3, which is impossible as range T ⊂ F².

11. This follows trivially from dim V = dim null T + dim range T.

12. From dim V = dim null T + dim range T it follows trivially that if we have a surjective linear map T ∈ L(V, W), then dim V ≥ dim W. Assume then that dim V ≥ dim W. Choose a basis (v1, ..., vn) for V and a basis (w1, ..., wm) for W. We can define a linear map T ∈ L(V, W) by letting T(vi) = wi, i = 1, ..., m and T(vi) = 0, i = m + 1, ..., n. Clearly T is surjective.

13. We have that dim range T ≤ dim W. Thus we get dim null T = dim V − dim range T ≥ dim V − dim W. Choose an arbitrary subspace U of V such that dim U ≥ dim V −
dim W. Let (u1, ..., un) be a basis of U and extend it to a basis (u1, ..., un, v1, ..., vk) of V. Now we know that k ≤ dim W, so let (w1, ..., wm) be a basis for W. Define a linear map T ∈ L(V, W) by T(ui) = 0, i = 1, ..., n and T(vi) = wi, i = 1, ..., k. Clearly null T = U.

14. Clearly if we can find such an S, then T is injective. Assume then that T is injective. Let (v1, ..., vn) be a basis of V. By exercise 5, (T(v1), ..., T(vn)) is linearly independent, so we can extend it to a basis (T(v1), ..., T(vn), w1, ..., wm) of W. Define S ∈ L(W, V) by S(T(vi)) = vi, i = 1, ..., n and S(wi) = 0, i = 1, ..., m. Clearly ST = IV.

15. Clearly if we can find such an S, then T is surjective. For the other direction, let (w1, ..., wn) be a basis of V. By assumption (T(w1), ..., T(wn)) spans W. By the linear dependence lemma we can make the list of vectors (T(w1), ..., T(wn
)) a basis by removing some vectors. Without loss of generality we can assume that the first m vectors form the basis (just permute the indices). Thus (T(w1), ..., T(wm)) is a basis of W. Define the map S ∈ L(W, V) by S(T(wi)) = wi, i = 1, ..., m. Now clearly TS = IW.

16. We have that dim U = dim null T + dim range T ≤ dim null T + dim V. Substituting dim V = dim null S + dim range S and dim U = dim null ST + dim range ST, we get dim null ST + dim range ST ≤ dim null T + dim null S + dim range S. Clearly dim range ST ≤ dim range S, so our claim follows.

17. This is nothing but pencil pushing. Just take arbitrary matrices satisfying the required dimensions and calculate each expression; the equalities easily fall out.
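For instance, the distributivity identity used again in exercise 21 can be spot-checked with random matrices (my addition, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((3, 4))

# A(B + C) = AB + AC up to floating-point rounding.
distributes = np.allclose(A @ (B + C), A @ B + A @ C)
```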
18. Ditto.

19. Let v = (x1, ..., xn). From proposition 3.14 we have that

M(Tv) = M(T)M(v) =

  [ a1,1 ... a1,n ] [ x1 ]
  [  ...      ... ] [ .. ]
  [ am,1 ... am,n ] [ xn ]

i.e. Tv = (a1,1x1 + ... + a
1,nxn, ..., am,1x1 + ... + am,nxn).

20. Clearly dim Mat(n, 1, F) = n = dim V. We have that Tv = 0 if and only if v = 0v1 + ... + 0vn = 0, so null T = {0}. Thus T is injective and hence invertible.

21. Let ei denote the n × 1 matrix with a 1 in the ith row and 0 everywhere else. Let T
be the linear map. Define a matrix A := [T(e1), ..., T(en)]. Now it's trivial to verify that Aei = T(ei). By distributivity of matrix multiplication (exercise 17) we get for an arbitrary v = a1e1 + ... + anen that Av = A(a1e1 + ... + anen) = a1Ae1 + ... + anAen = a1T(e1) + ... + anT(en) = T(a1e1 + ... + anen), so the claim follows.

22. From theorem 3.21 we have: ST invertible ⇔ ST bijective ⇔ S and T bijective ⇔ S and T invertible.

23. By symmetry it's sufficient to prove this in one direction only. Thus assume TS =
I. Then ST(Su) = S(TSu) = Su for all u. As S is injective (TS = I) and hence bijective, Su goes through the whole space V as u varies, so ST = I.

24. Clearly TS = ST if T is a scalar multiple of the identity. For the other direction assume that TS = ST for every linear map S ∈ L(V).

25. Let T ∈ L(F²) be the operator T(a, b) = (a, 0) and S ∈ L(F²) the operator S(a, b) = (0, b). Then neither one is injective, hence neither is invertible by 3.21. However, T + S is the identity operator, which is trivially invertible. This generalizes trivially to arbitrary spaces of dimension ≥ 2.

26. Write

A := [ a1,1 ... a1,n ]        [ x1 ]
     [  ...      ... ],  x =  [ .. ]
     [ an,1 ... an,n ]        [ xn ]

Then the system of equations in (a) reduces to Ax = 0. Now A defines a linear map from Mat(n, 1, F) to Mat(n, 1, F). What (a) states is that this map is injective, while (b) states that it is surjective. By theorem 3.21 these are equivalent.
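The equivalence in exercise 26 is easy to see numerically: for a square matrix, "Ax = 0 only for x = 0" and "Ax = b solvable for every b" both say that A has full rank (my addition, assuming numpy):

```python
import numpy as np

def injective(A):
    # Ax = 0 only for x = 0 iff A has full column rank.
    return np.linalg.matrix_rank(A) == A.shape[1]

def surjective(A):
    # Ax = b solvable for every b iff A has full row rank.
    return np.linalg.matrix_rank(A) == A.shape[0]

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # invertible
B = np.array([[1.0, 2.0], [2.0, 4.0]])   # singular

# For square matrices the two properties always agree.
same_A = injective(A) == surjective(A)
same_B = injective(B) == surjective(B)
```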
[...] zero, otherwise k + 1). This is a contradiction, as dim range T = k and span(λ1v1, ..., λk+2vk+2) ⊂ range T.

10. As T = (T⁻¹)⁻¹, we only need to show this in one direction. If T is invertible, then 0 is not an eigenvalue. Now let λ be an eigenvalue of T and v a corresponding eigenvector. From Tv = λv we get v = T⁻¹λv = λT⁻¹v, so that T⁻¹v = λ⁻¹v.

11. Let λ be an eigenvalue of TS and v a corresponding eigenvector. Then we get STSv = Sλv = λSv, so if Sv ≠ 0, then Sv is an eigenvector of ST for the eigenvalue λ. If Sv = 0, then TSv = 0, so λ = 0. As Sv = 0, we know that S is not injective, so ST is not injective and it has eigenvalue 0. Thus if λ is an eigenvalue of TS, then it's an eigenvalue of ST. The other implication follows by symmetry.

12. Let (v1, ..., vn) be a basis of V. By assumption (Tv1, ..., Tvn) = (λ1v1, ..., λnvn). We need to show that λi = λj
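Exercise 11's conclusion — TS and ST have the same eigenvalues — can be checked numerically for random matrices (my addition, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))

# The eigenvalue lists of TS and ST should coincide as multisets.
ev_TS = np.sort_complex(np.linalg.eigvals(T @ S))
ev_ST = np.sort_complex(np.linalg.eigvals(S @ T))
same_spectrum = np.allclose(ev_TS, ev_ST)
```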
[...] + ... + anvn. Let Ui be the subspace generated by the vectors vj, j ≠ i. By our assumption each Ui is an invariant subspace. Let Tv1 = a1v1 + ... + anvn. Now v1 ∈ Uj for j > 1, so let j > 1; then Tv1 ∈ Uj implies aj = 0. Thus Tv1 = a1v1. We see that v1 was an eigenvector. The result now follows from the previous exercise.

14. Clearly (STS⁻¹)ⁿ = STⁿS⁻¹, so

a0 + a1(STS⁻¹) + ... + an(STS⁻¹)ⁿ = S(a0 + a1T + ... + anTⁿ)S⁻¹,

i.e. p(STS⁻¹) = Sp(T)S⁻¹. [...]
[...] eigenvalue of T. Thus we get 0 = q(λi) = p(λi) − a, so that a = p(λi).

16. Let T ∈ L(R²) be the map T(x, y) = (−y, x). On page 78 it was shown that it has no eigenvalue. However, T²(x, y) = T(−y, x) = (−x, −y), so T² has an eigenvalue, namely −1.

17. By theorem 5.13, T has an upper-triangular matrix with respect to some basis (v1, ..., vn). The claim now follows from proposition 5.12.

18. Let T ∈ L(F²) be the operator T(a, b) = (b, a) with respect to the standard basis. Now T² = I, so T is clearly invertible. However, T has the matrix

[ 0 1 ]
[ 1 0 ].

19. Take the operator T in exercise 7. It is clearly not invertible, but has the matrix

[ 1 ... 1 ]
[ ... ... ]
[ 1 ... 1 ]
with every entry 1.

20. By theorem 5.6 we can choose a basis (v1, ..., vn) for V where vi is an eigenvector of T corresponding to the eigenvalue λi. Let λ be the eigenvalue of S corresponding to vi. Then we get STvi = Sλivi = λiSvi = λiλvi = λλivi = λTvi = Tλvi = TSvi, so ST and TS agree on a basis of V. Hence they are equal.

21. Clearly 0 is an eigenvalue and the corresponding eigenvectors are the elements of null PU,W \ {0} = W \ {0}. Now assume λ ≠ 0 is an eigenvalue and v = u + w a corresponding eigenvector. Then PU,W(u + w) = u = λu + λw ⇔ (1 − λ)u = λw. Thus λw ∈ U, so λw = 0, but λ ≠ 0, so we have w = 0, which implies (1 − λ)u = 0. Because v is an eigenvector, we have v = u + w = u ≠ 0, so that 1 − λ = 0. Hence λ = 1. As we can choose u ∈ U freely, so that it's non-zero, we see that the eigenvectors corresponding to 1 are the elements of U \ {0}.

22. As dim V = dim null P + dim range P, it's clearly sufficient to prove that null P ∩ range P = {0}. Let v ∈ null P ∩ range P. As v ∈ range P, we can find a u ∈ V such that Pu = v. Thus we have that v = Pu = P²u = P(Pu) = Pv = 0, since v ∈ null P.
2. If ⟨u, v⟩ = 0, then ⟨u, av⟩ = 0, so by the Pythagorean theorem ‖u‖² ≤ ‖u‖² + ‖av‖² = ‖u + av‖². Then assume that ‖u‖ ≤ ‖u + av‖ for all a ∈ F. We have ‖u‖² ≤ ‖u + av‖² ⇔ ⟨u, u⟩ ≤ ⟨u + av, u + av⟩. Thus we get

⟨u, u⟩ ≤ ⟨u, u⟩ + ⟨u, av⟩ + ⟨av, u⟩ + ⟨av, av⟩,

so that −2 Re ā⟨u, v⟩ ≤ |a|²‖v‖². Choose a = −t⟨u, v⟩ with t > 0, so that

2t⟨u, v⟩² ≤ t²⟨u, v⟩²‖v‖² ⇔ 2⟨u, v⟩² ≤ t⟨u, v⟩²‖v‖².

If v = 0, then clearly ⟨u, v⟩ = 0. If not, choose t = 1/‖v‖², so that we get 2⟨u, v⟩² ≤ ⟨u, v⟩². Thus ⟨u, v⟩ = 0.

3. Let a = (a1, √2 a2, ..., √n an) ∈ Rⁿ and b = (b1, b2/√2, ..., bn/√n) ∈ Rⁿ. The inequality is then simply the Cauchy–Schwarz inequality ⟨a, b⟩² ≤ ‖a‖²‖b‖².
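Concretely (my addition, assuming numpy), exercise 3's inequality can be spot-checked on random data:

```python
import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal(6)
b = rng.standard_normal(6)
j = np.arange(1, 7)

# (sum a_j b_j)^2 <= (sum j a_j^2) * (sum b_j^2 / j)
lhs = np.dot(a, b) ** 2
rhs = np.sum(j * a ** 2) * np.sum(b ** 2 / j)
holds = bool(lhs <= rhs)
```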
[...] By the parallelogram equality we have ‖u + v‖² + ‖u − v‖² = 2(‖u‖² + ‖v‖²). Solving for ‖v‖ we get ‖v‖ = √17.

5. Set e.g. u = (1, 0), v = (0, 1). Then ‖u‖ = 1, ‖v‖ = 1, ‖u + v‖ = 2, ‖u − v‖ = 2. Assuming that the norm is induced by an inner product, we would have by the parallelogram equality 8 = 2² + 2² = 2(1² + 1²) = 4, which is clearly false.

6. Just use ‖u‖² = ⟨u, u⟩ and simplify.

7. See previous exercise.

8. This exercise is a lot trickier than it might seem. I'll prove it for R; the proof for C is almost identical except that the calculations are longer and more tedious. All norms on a finite-dimensional real vector space are equivalent. This gives us

lim n→∞ ‖rnx + y‖ = ‖λx + y‖

when rn → λ. This probably doesn't make any sense, so check a book on topology or functional analysis, or just assume the result. Define ⟨u, v⟩ by

⟨u, v⟩ = (‖u + v‖² − ‖u − v‖²)/4.

Trivially we have ‖u‖² = ⟨u, u⟩, so positive definiteness follows. Now

4(⟨u + v, w⟩ − ⟨u, w⟩ − ⟨v, w⟩) = ‖u + v + w‖² − ‖u + v − w‖²
[...] product. The proof for F = C can be done by defining the inner product from the complex polarizing identity: with

⟨u, v⟩₀ = (1/4)(‖u + v‖² − ‖u − v‖²), set ⟨u, v⟩ = ⟨u, v⟩₀ + ⟨u, iv⟩₀ i,

and use the properties just proved for ⟨·, ·⟩₀.
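For the Euclidean norm on Rⁿ the polarization recipe above does recover the usual dot product; a quick check (my addition, assuming numpy):

```python
import numpy as np

def polar_form(u, v):
    # <u, v> := (||u + v||^2 - ||u - v||^2) / 4
    return (np.sum((u + v) ** 2) - np.sum((u - v) ** 2)) / 4.0

u = np.array([1.0, 2.0, -1.0])
v = np.array([0.5, -1.0, 3.0])
agrees = np.isclose(polar_form(u, v), np.dot(u, v))
```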
9. This is really an exercise in calculus. We have integrals of the types

⟨sin nx, sin mx⟩ = ∫ from −π to π of sin nx sin mx dx
⟨cos nx, cos mx⟩ = ∫ from −π to π of cos nx cos mx dx
⟨sin nx, cos mx⟩ = ∫ from −π to π of sin nx cos mx dx

and they can be evaluated using the trigonometric identities

sin nx sin mx = (cos((n − m)x) − cos((n + m)x))/2
cos nx cos mx = (cos((n − m)x) + cos((n + m)x))/2
sin nx cos mx = (sin((n + m)x) + sin((n − m)x))/2.

10. e1 = 1, then e2 = (x − ⟨x, 1⟩1)/‖x − ⟨x, 1⟩1‖. [...]
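The orthogonality integrals in exercise 9 can be spot-checked by numeric quadrature (my addition, assuming numpy; the trapezoid rule is very accurate for smooth periodic integrands over a full period):

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 4001)
dx = x[1] - x[0]

def integrate(y):
    # composite trapezoid rule on the fixed grid x
    return np.sum((y[:-1] + y[1:]) / 2.0) * dx

mixed = integrate(np.sin(2 * x) * np.sin(3 * x))   # distinct frequencies: 0
diag = integrate(np.sin(2 * x) ** 2)               # same frequency: pi
```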
[...]ly contain zero vectors. It's easy to see that we can extend the algorithm to work for linearly dependent lists by tossing away the resulting zero vectors.

12. Assume that (e1, ..., en) and (e1, ..., en−1, e′n) are orthonormal and span the same subspace. Then we can write e′n = a1e1 + ... + anen. Now we have ⟨e′n, ei⟩ = 0 for all i < n, so that ai = 0 for i < n. Thus we have e′n = anen, and from ‖e′n‖ = 1 we have an = ±1.

13. Let (e1, ..., en
[...] Thus we have ‖v‖² = |⟨v, e1⟩|² + ... + |⟨v, em⟩|² if and only if ⟨v, ei⟩ = 0 for i > m, i.e. v ∈ span(e1, ..., em).

14. It's easy to see that the differentiation operator has an upper-triangular matrix in the orthonormal basis calculated in exercise 10.

15. This follows directly from V = U ⊕ U⊥.

16. This follows directly from the previous exercise.

17. From exercise 5.22 we have that V = range P ⊕ null P. Let U := range P; then by assumption null P ⊂ U⊥. From exercise 15 we have dim U⊥ = dim V − dim U = dim null P, so that null P = U⊥. An arbitrary v ∈ V can be written as v = Pv + (v − Pv). From P² = P we have that P(v − Pv) = 0, so v − Pv ∈ null P = U⊥. Hence the decomposition v = Pv + (v − Pv) is the unique decomposition in U ⊕ U⊥. Now P(Pv + (v − Pv)) = Pv, so that P is the identity on U. By definition P = PU.

18. Let u ∈ range P; then u = Pv for some v ∈ V, hence Pu = P²v = Pv = u, so P is the identity on range P. Let w ∈ null P. Then for a ∈ F,

‖u‖² = ‖P(u + aw)‖² ≤ ‖u + aw‖².

By exercise 2 we have ⟨u, w⟩ = 0. Thus null P ⊂ (range P)⊥ and from the dimension equality null P = (range P)⊥. Hence the claim follows.

19. If TPU
= PUTPU, then clearly U is invariant. Now assume that U is invariant. Then for u ∈ U we have PUTPUu = PUTu = Tu = TPUu.

20. Let u ∈ U; then we have Tu = TPUu = PUTu ∈ U. Thus U is invariant. Then let w ∈ U⊥. Now we can write Tw = u + u′ where u ∈ U and u′ ∈ U⊥. Now PUTw = u, but u = PUTw = TPUw = T0 = 0, so Tw ∈ U⊥. Thus U⊥ is also invariant.

21. First we need to find an orthonormal basis for U. With Gram–Schmidt we get e1 = (1/√2, 1/√2, 0, 0), e2 = (0, 0, 1/√5, 2/√5). Let U = span(e1, e2). Then we have u = PU
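The Gram–Schmidt step in exercise 21 can be reproduced in code (my addition, assuming numpy; I'm also assuming the starting vectors are (1,1,0,0) and (1,1,1,2), which yield exactly the e1, e2 stated above):

```python
import numpy as np

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        # subtract projections onto the already-built orthonormal vectors
        w = v - sum(np.dot(v, e) * e for e in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

e1, e2 = gram_schmidt([np.array([1.0, 1.0, 0.0, 0.0]),
                       np.array([1.0, 1.0, 1.0, 2.0])])
```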
[...]

23. There is nothing special about this exercise, compared to the previous two, except that it takes ages to calculate.

24. We see that the map T : P2(R) → R defined by p ↦ p(1/2) is linear. From exercise 10 we get that (1,
[...] = range T.

8. The vectors u = (1, 2, 3) and v = (2, 5, 7) are both eigenvectors corresponding to different eigenvalues. A self-adjoint operator is normal, so by corollary 7.8, if T were normal that would imply orthogonality of u and v. Clearly ⟨u, v⟩ ≠ 0, so T can't even be normal, much less self-adjoint.

9. If T is normal, we can choose a basis of V consisting of eigenvectors of T. Let A be the matrix of T corresponding to the basis. Now T is self-adjoint if and only if the conjugate transpose of A equals A, that is, if and only if the eigenvalues of T are real.

10. For any v ∈ V we get 0 = T⁹v − T⁸v = T⁸(Tv − v). Thus Tv − v ∈ null T⁸ = null T by the normality of T. Hence T(Tv − v) = T²v − Tv = 0, so T² = T. By the spectral theorem we can choose a basis of eigenvectors for T such that T has a diagonal matrix with λ1, ..., λn on the diagonal. Now T² has the diagonal matrix with λ1², ..., λn² on the diagonal, and from T² = T we must have λi² = λi for all i. Hence λi = 0 or λi = 1, so the matrix of T equals its conjugate transpose. Hence T is self-adjoint.

11. By the spectral theorem we can choose a basis of V such that the matrix of T corresponding to the basis is a diagonal matrix. Let λ
[...] (v1, ..., vn) of eigenvectors of √(T*T), and we can assume that v1 corresponds to the eigenvalue 0. Now let T = S√(T*T). Define U by Uv1 = −Sv1, Uvi = Svi, i > 1. Clearly U ≠ S and T = U√(T*T). Now choose u, v ∈ V. Then it's easy to verify that ⟨Uu, Uv⟩ = ⟨Su, Sv⟩ = ⟨u, v⟩, so U is an isometry.

26. Choose a basis of eigenvectors for T such that T has a diagonal matrix with eigenvalues λ1, ..., λn
[...] By the previous exercise it's of the form (x − cos θ)(x − cos θ) + sin²θ = x² − 2x cos θ + cos²θ + sin²θ, so that b = cos²θ + sin²θ = 1.

10 Trace and Determinant

1. The [...]

[...] (vi, ..., vi+k), so after doing this for all blocks we can assume that A is upper-triangular. The claim follows immediately.

23. This follows trivially from the formulas for the trace and determinant of a matrix, because the formulas only depend on the entries of the matrix.

24. Let dim V = n and let A = M(T), so that B = M(T*) equals the conjugate transpose of A, i.e. b_ij = conj(a_ji). Then we have

det T = det A = Σ_σ a_{σ(1),1} ⋯ a_{σ(n),n}
      = Σ_σ a_{1,σ(1)} ⋯ a_{n,σ(n)} = conj( Σ_σ b_{σ(1),1} ⋯ b_{σ(n),n} ) = conj(det B) = conj(det T*).
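A numeric check of det T* = conj(det T) with a small complex matrix (my addition, assuming numpy):

```python
import numpy as np

A = np.array([[1 + 2j, 3j],
              [2 - 1j, 4 + 0j]])

lhs = np.linalg.det(A.conj().T)    # determinant of the conjugate transpose
rhs = np.conj(np.linalg.det(A))    # conjugate of the determinant
matches = np.isclose(lhs, rhs)
```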
The first equality on the second line follows easily as every term in the upper sum is represented by a term in the lower sum and vice versa. The second claim follows immediately from theorem 10.31.

25. Let W = {(x, y, z) ∈ R³ | x² + y² + z² < 1} and let T be the operator T(x, y, z) = (ax, by, cz). It's easy to see that T(W) is the ellipsoid. Now W is a ball of radius 1. Hence

volume(T(W)) = |det T| · volume(W) = abc · (4/3)π = (4/3)πabc.
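Since T here is diagonal, det T = abc; a tiny check of the volume formula (my addition, assuming numpy):

```python
import numpy as np

a, b, c = 2.0, 3.0, 0.5
T = np.diag([a, b, c])

det_T = np.linalg.det(T)                          # = abc = 3.0
ellipsoid_volume = abs(det_T) * (4.0 / 3.0) * np.pi
```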