
MATH 110: LINEAR ALGEBRA PRACTICE FINAL SOLUTIONS

Question 1. (1) Write f(x) = a_m x^m + a_{m−1} x^{m−1} + ··· + a_0. Then F = f(T) = a_m T^m + a_{m−1} T^{m−1} + ··· + a_0 I. Since T is upper triangular, (T^k)_{ii} = (T_{ii})^k for all positive k and i ∈ {1, . . . , n}. Hence F_{ii} = a_m (T_{ii})^m + a_{m−1} (T_{ii})^{m−1} + ··· + a_0 = f(T_{ii}).

(2) TF = T f(T) = T(a_m T^m + a_{m−1} T^{m−1} + ··· + a_0 I) = a_m T^{m+1} + a_{m−1} T^m + ··· + a_0 T = (a_m T^m + a_{m−1} T^{m−1} + ··· + a_0 I)T = f(T) T = FT.

(3) Since T is upper triangular, so is each power of T, and hence so is F = f(T). Therefore, T_{ij} = 0 and F_{ij} = 0 whenever i > j. Hence

(FT)_{i,i+1} = Σ_{k=1}^{n} F_{ik} T_{k,i+1} = Σ_{k=i}^{i+1} F_{ik} T_{k,i+1} = F_{ii} T_{i,i+1} + F_{i,i+1} T_{i+1,i+1}

and

(TF)_{i,i+1} = Σ_{k=1}^{n} T_{ik} F_{k,i+1} = Σ_{k=i}^{i+1} T_{ik} F_{k,i+1} = T_{ii} F_{i,i+1} + T_{i,i+1} F_{i+1,i+1}.

Then (FT)_{i,i+1} = (TF)_{i,i+1} implies that F_{i,i+1}(T_{ii} − T_{i+1,i+1}) = F_{ii} T_{i,i+1} − T_{i,i+1} F_{i+1,i+1}. But, by hypothesis, T_{ii} − T_{i+1,i+1} ≠ 0, so

F_{i,i+1} = (F_{ii} T_{i,i+1} − T_{i,i+1} F_{i+1,i+1})/(T_{ii} − T_{i+1,i+1}).

(4) As before, T_{ij} = 0 and F_{ij} = 0 whenever i > j. So we have

(FT)_{i,i+k} = Σ_{j=1}^{n} F_{ij} T_{j,i+k} = Σ_{j=i}^{i+k} F_{ij} T_{j,i+k} = F_{i,i+k} T_{i+k,i+k} + Σ_{j=i}^{i+k−1} F_{ij} T_{j,i+k}

and

(TF)_{i,i+k} = Σ_{j=1}^{n} T_{ij} F_{j,i+k} = Σ_{j=i}^{i+k} T_{ij} F_{j,i+k} = T_{ii} F_{i,i+k} + Σ_{j=i+1}^{i+k} T_{ij} F_{j,i+k}.

Equating the two expressions, we obtain

F_{i,i+k}(T_{ii} − T_{i+k,i+k}) = Σ_{j=i}^{i+k−1} F_{ij} T_{j,i+k} − Σ_{j=i+1}^{i+k} T_{ij} F_{j,i+k}.

Again, by hypothesis, T_{ii} − T_{i+k,i+k} ≠ 0, so

F_{i,i+k} = (Σ_{j=i}^{i+k−1} F_{ij} T_{j,i+k} − Σ_{j=i+1}^{i+k} T_{ij} F_{j,i+k})/(T_{ii} − T_{i+k,i+k}).
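The recurrence in (3) and (4) gives an explicit algorithm: compute the diagonal of F from (1), then fill in each superdiagonal from the ones below it. A minimal sketch in NumPy (the matrix T and the polynomial f(x) = 2x² − x + 4 are arbitrary illustrative choices, not from the problem):

```python
import numpy as np

# Sanity check of the recurrence in (3)-(4): fill in the superdiagonals
# of F = f(T) one at a time, then compare against a direct evaluation
# of the polynomial. T is upper triangular with distinct diagonal entries.
T = np.array([[1.0, 2.0, 5.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 6.0]])

n = T.shape[0]
F = np.zeros_like(T)
for i in range(n):                 # part (1): diagonal entries are f(T_ii)
    F[i, i] = 2.0 * T[i, i]**2 - T[i, i] + 4.0
for k in range(1, n):              # part (4): k-th superdiagonal from lower ones
    for i in range(n - k):
        j2 = i + k
        s1 = sum(F[i, j] * T[j, j2] for j in range(i, j2))
        s2 = sum(T[i, j] * F[j, j2] for j in range(i + 1, j2 + 1))
        F[i, j2] = (s1 - s2) / (T[i, i] - T[j2, j2])

F_direct = 2.0 * T @ T - T + 4.0 * np.eye(n)
print(np.allclose(F, F_direct))    # True
```

The same loop computes cos(T) in part (5) below if the diagonal is initialized with cos(T_{ii}) instead of f(T_{ii}).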




(5) The above calculations work equally well for power series that converge at all eigenvalues of T, in place of polynomials. So, noting that cos(x) = Σ_{n=0}^{∞} (−1)^n x^{2n}/(2n)!, we can apply the above considerations. Let us set

T = [ π/4    7   ]
    [  0   −π/4 ]

and F = cos(T). By the above work, F_{11} = cos(π/4) = √2/2, F_{22} = cos(−π/4) = √2/2, F_{21} = 0, and F_{12} = (F_{11} T_{12} − T_{12} F_{22})/(T_{11} − T_{22}) = (7√2/2 − 7√2/2)/(π/2) = 0. So

F = [ √2/2    0   ]
    [  0    √2/2 ].

Question 2. The first two parts of this problem only ask you to show that AB and BA have (almost) the same eigenvalues, but don't actually demand you show their characteristic polynomials (almost) agree. Showing the latter condition is stronger than the former, and the hints given actually lead to this stronger result. However there is a simpler method to show the weaker condition, and that is included after the full solution.

(1) Without loss of generality, we may assume that A is nonsingular and hence invertible. So A^{−1}(AB)A = BA. Since AB and BA are similar, they have the same eigenvalues.

(2) Let us compute the characteristic polynomials of AB and BA. We'll be working in the field F(x), consisting of fractions of polynomials in x with coefficients in F. Consider the 2n × 2n block matrix

M = [ xI_n   B   ]
    [  A    I_n ]

over F(x), and compute det(M) in two ways. First, use a block row operation with the invertible pivot xI_n:

[    I_n      0   ] [ xI_n   B   ]   [ xI_n        B         ]
[ −x^{−1}A   I_n ] [  A    I_n ] = [  0    I_n − x^{−1}AB ].

The left factor has determinant 1, so

det(M) = det(xI_n) det(I_n − x^{−1}AB) = x^n · x^{−n} det(xI_n − AB) = det(xI_n − AB).

Second, use a block column operation:

[ xI_n   B   ] [ I_n   0   ]   [ xI_n − BA   B   ]
[  A    I_n ] [ −A    I_n ] = [     0      I_n ].

The right factor also has determinant 1, so det(M) = det(xI_n − BA). Hence det(xI_n − AB) = det(xI_n − BA), and AB and BA have the same characteristic polynomial.
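The conclusion of (2) is easy to confirm numerically. A minimal sketch in NumPy (the random 4 × 4 matrices are arbitrary examples; np.poly returns the coefficients of a matrix's characteristic polynomial):

```python
import numpy as np

# Numerical sanity check of part (2): for square A and B, the products
# AB and BA have the same characteristic polynomial.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# np.poly(M) gives the characteristic polynomial coefficients of M.
print(np.allclose(np.poly(A @ B), np.poly(B @ A)))  # True
```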


(3) Again, we'll be working over F(x). Here A is m × n and B is n × m, so AB is m × m and BA is n × n. Consider the (n + m) × (n + m) block matrix

M = [ xI_n   B   ]
    [  A    I_m ].

As before,

[    I_n      0   ] [ xI_n   B   ]   [ xI_n        B         ]
[ −x^{−1}A   I_m ] [  A    I_m ] = [  0    I_m − x^{−1}AB ],

so det(M) = det(xI_n) det(I_m − x^{−1}AB) = x^n · x^{−m} det(xI_m − AB) = x^{n−m} det(xI_m − AB). And

[ xI_n   B   ] [ I_n   0   ]   [ xI_n − BA   B   ]
[  A    I_m ] [ −A    I_m ] = [     0      I_m ],

so det(M) = det(xI_n − BA). Hence det(xI_m − AB) = x^{m−n} det(xI_n − BA).

Here is a simpler method which just demonstrates that AB and BA have the same eigenvalues, except possibly for 0 in the m ≠ n case. Suppose (λ, v) is an eigenpair for AB. Then BA(Bv) = B(ABv) = B(λv) = λBv. As long as Bv ≠ 0, we have that (λ, Bv) is an eigenpair for BA. But if λ ≠ 0, then ABv = λv ≠ 0, so Bv ≠ 0. This shows that without any restrictions on m and n, AB and BA have the same non-zero eigenvalues (the eigenvalues go back from BA to AB by symmetry). Now suppose m = n and 0 is an eigenvalue of AB. Then either A or B is singular, so 0 is also an eigenvalue of BA. Now suppose m > n. Then as B is n × m, B has non-trivial nullspace, so 0 must be an eigenvalue of AB. 0 may or may not be an eigenvalue of BA. You may want to think of an example where it is not.

Question 3. (1) Let a_i denote the ith column of A and q_i the ith column of Q. Using the algorithm for computing QR decompositions given in class, we obtain R_{11} = ||a_1|| = 2 and q_1 = a_1/R_{11} = 1/2 (1, 1, 1, 1)^t. Also, R_{12} = ⟨a_2, q_1⟩ = 4, R_{13} = ⟨a_3, q_1⟩ = 6, R_{22} = ||a_2 − q_1 R_{12}|| = ||(3, −3, 3, −3)^t|| = 6, and q_2 = (3, −3, 3, −3)^t/R_{22} = 1/2 (1, −1, 1, −1)^t. Continuing in this fashion, one obtains

        [ 1   1   1 ]           [ 2  4  6 ]
Q = 1/2 [ 1  −1   1 ]   and R = [ 0  6  4 ].
        [ 1   1  −1 ]           [ 0  0  6 ]
        [ 1  −1  −1 ]

(2) By the computation done in class, x = R^{−1}Q^t y = (26, 23, 21)^t.

Question 4. (1) Multiplying X by P on the left interchanges rows i and n+1−i of X (for each i ∈ {1, 2, . . . , n}). So PX = (X_{n+1−i,j}). Multiplying PX by P on the right interchanges columns j and n+1−j of PX (for each j ∈ {1, 2, . . . , n}). So PXP = (X_{n+1−i,n+1−j}).

(2) Let J be a Jordan block, with λ as its eigenvalue. For each i ∈ {1, 2, . . . , n}, J_{ii} = λ, and hence λ = J_{n+1−i,n+1−i} = (PJP)_{ii}, by the above formula. Similarly, each J_{i,i+1} = 1, so 1 = J_{n+1−i,n+2−i} = (PJP)_{i,i−1}. It is also easy to see that since J_{ij} = 0 whenever j ∉ {i, i+1}, we have (PJP)_{ij} = 0 whenever j ∉ {i, i−1}. Hence P^{−1}JP = PJP = J^t.


Now consider a matrix in Jordan canonical form:

J = [ J_1   0  ···   0  ]
    [  0   J_2 ···   0  ]
    [  ⋮    ⋮   ⋱    ⋮  ]
    [  0    0  ···  J_k ],

where the J_i are Jordan blocks. Also, let

P = [ P_1   0  ···   0  ]
    [  0   P_2 ···   0  ]
    [  ⋮    ⋮   ⋱    ⋮  ]
    [  0    0  ···  P_k ],

where for each i, P_i has the same size as J_i, and each P_i is of the form of the P of (1). Then

PJP = diag(P_1 J_1 P_1, . . . , P_k J_k P_k) = diag(J_1^t, . . . , J_k^t) = J^t.

Finally, let A be an n by n complex matrix, and suppose that A = SJS^{−1} is its Jordan decomposition. Then A^t = (S^{−1})^t J^t S^t = (S^t)^{−1} J^t S^t. But, by the above, J^t is similar to J, and hence A^t is similar to J. By the uniqueness of the Jordan canonical form, J is the Jordan canonical form of A^t.

Question 5. Let D be a diagonal matrix such that D_{ii} = R_{ii}. Also, let L = R* D^{−1} (D is invertible, since A, and hence R, is invertible), and let U = DR. Then LU = R* D^{−1} D R = R* R = R* (Q* Q) R = (QR)* (QR) = A* A, since Q is unitary.

Question 6. First note that because {u_1, u_2, v_1, v_2, v_3} is a basis for F^5, these vectors must be distinct and the sets {u_1, u_2} and {v_1, v_2, v_3} are independent. Now clearly u_i ∈ E_c for each i, so span(u_1, u_2) ⊆ E_c, and span(u_1, u_2) is 2-dimensional. So dim(E_c) ≥ 2. Likewise, span(v_1, v_2, v_3) is a 3-dimensional subspace of E_d. Now we have 5 = dim(F^5) ≥ dim(E_c + E_d) = dim(E_c) + dim(E_d) − dim(E_c ∩ E_d). But as c ≠ d, E_c ∩ E_d = {0}, so the last term is 0. Thus dim(E_c) = 2 and dim(E_d) = 3, so span(u_1, u_2) = E_c and likewise for d.

Alternatively, as A's characteristic polynomial has degree 5, and dim(E_λ) ≤ mult(λ), the multiplicity of λ, it is easy to see that the multiplicities of c and d are 2 and 3 respectively. Thus E_c cannot have dimension > 2, so must coincide with span(u_1, u_2), and likewise for d.

Question 7. In the following I'll also define u_j = v_j for j > r, for simplicity of notation. The given basis consists of eigenvectors for A, and is orthonormal. Therefore A is normal, so A*A = AA*. Let A = QDQ* be the decomposition of A given by this basis (so the columns of Q are, in order, {u_1, . . . , u_n} and D = Q*AQ). Then D = diag(λ_1, . . . , λ_n), where λ_i = c for i ≤ r and λ_i = d for i > r. For Q is the change of co-ordinates matrix from the given eigenbasis to the standard basis. Or just consider

D_{ij} = e_i^t D e_j = e_i^* Q^* A Q e_j = (Q e_i)^* A (Q e_j) = u_i^* A u_j = u_i^* λ_j u_j = λ_j δ_{ij}.


Here I have used that the u_i's form an orthonormal set.

(7.1) Using A = QDQ*, we get A* u_i = Q D* Q* u_i = Q D* e_i = Q λ̄_i e_i = λ̄_i Q e_i = λ̄_i u_i. So A* u_i = c̄ u_i for i ≤ r and A* u_i = d̄ u_i for i > r.

(7.2) As β = {u_1, . . . , u_n} forms a basis, CS(A) = span(L_A(β)) = span(Au_1, . . . , Au_n) = span(cu_1, . . . , cu_r, du_{r+1}, . . . , du_n). Likewise, CS(A*) = span(L_{A*}(β)) = span(c̄u_1, . . . , c̄u_r, d̄u_{r+1}, . . . , d̄u_n). If c ≠ 0 and d ≠ 0, it is clear that both spans coincide with span(u_1, . . . , u_n) = C^n. If c ≠ 0 = d, then both spans coincide with span(u_1, . . . , u_r). Generalizing the result of problem 6, this is E_c. If c = 0 ≠ d, we get E_d instead. We can't have c = d, as we assumed they were distinct. So we are done. Note that having only 2 distinct eigenvalues wasn't important here; the same arguments generalize to give similar results (such as CS(A) = CS(A*) for A normal) for A having more eigenvalues.

Question 8. (8.1) True. A is skew-symmetric (A = −A^t) and real, so A*A = A^tA = −A^2 = AA*, so A is normal. This is equivalent to A being diagonalizable by a unitary matrix. (Note we are talking about complex matrices here, as the question was about unitary matrices. There actually isn't a factorization A = QDQ^t with Q real orthogonal and D real diagonal.)

(8.2) False. A being unitarily diagonalizable is also equivalent to the existence of an orthonormal basis of eigenvectors for A. But A is upper triangular with eigenvalues 1, 2, 3. Each of the eigenspaces must have dimension 1, and computing them, one easily sees that if v_1 ∈ E_1 and v_2 ∈ E_2, both non-zero, then v_1 and v_2 are not orthogonal.

(8.3) False. The matrices

[ 1  0 ]       [ 1  1 ]
[ 0  2 ]  and  [ 0  1 ]

do not commute. Just try it.

(8.4) True. Such matrices may be simultaneously diagonalized, so they commute, by a homework problem. To see this, let J = QDQ^{−1}. Q's columns are eigenvectors for J, so must also be eigenvectors for K by assumption. Therefore K = QCQ^{−1} for some diagonal matrix C (with the corresponding eigenvalues on its diagonal). Therefore JK = QDQ^{−1}QCQ^{−1} = QDCQ^{−1} = QCDQ^{−1} = KJ.

(8.5) False. This requires the first column of Q_1 to be an eigenvector for A. For in the Q_1-basis (given by Q_1's columns), [L_A] = T, and the first standard basis vector is an eigenvector for an upper triangular matrix. Alternatively, letting Q_1's first column be q, we have q ≠ 0 and AQ_1 = Q_1 T, so Aq = A(Q_1 e_1) = Q_1 T e_1 = Q_1 T_{11} e_1 = T_{11} Q_1 e_1 = T_{11} q.


So A must have a real eigenvector. But this is false for the 90° rotation matrix

[ 0  −1 ]
[ 1   0 ].

(8.6) False. Let u = [1 1 1 1 1]^t. Note that u^t M = u^t because M is a probability matrix. Now suppose Mx = y. Then 8 = u^t y = u^t Mx = u^t x = 7, a contradiction.

Question 9. Suppose all of f's roots are distinct, and are λ_1, . . . , λ_n. Let Λ = diag(λ_1, . . . , λ_n). I will show that any complex matrix whose characteristic poly is f is similar to Λ. This is sufficient, because if M and N are both similar to Λ, then they are similar to one another. So let M be a complex matrix whose characteristic poly is f. Then M is diagonalizable as it has n distinct eigenvalues. Thus Q^{−1}MQ = Δ, where Δ = diag(µ_1, . . . , µ_n). But the µ_i's are the roots of f, as are the λ_i's. So Λ and Δ have the same diagonal entries, but in possibly different orders. Thus they are similar through a permutation matrix P, and M was similar to Δ, so M is similar to Λ also.

If you're interested, I'll now explicitly construct P. I'll define a permutation matrix P such that PΔP^t = Λ. For each i, there is a unique j such that λ_i = µ_j, as M's eigenvalues are f's roots. Let σ : {1, . . . , n} → {1, . . . , n} be the permutation (bijective function) that sends i to j (so λ_i = µ_{σ(i)}). We want the ith row of P to extract µ_{σ(i)} from Δ, which is in the σ(i)th position in Δ, so set P_{i,j} = δ_{σ(i),j}. Because σ is a permutation, it is easy to check that P is a permutation matrix. Let P_i be the ith row of P. As PP^t = I, P_i P_j^t = δ_{ij}. We have

(PΔP^t)_{ij} = P_i Δ P_j^t = µ_{σ(i)} P_i P_j^t = µ_{σ(i)} δ_{ij} = λ_i δ_{ij} = Λ_{ij}.

Therefore PΔP^t = Λ as required.

Now we do the other direction. Suppose f's roots, counted according to multiplicity, are λ_1, . . . , λ_n. By reordering if needed, suppose that r > 1 and λ_1 = λ_i for i ≤ r (so the multiplicity of λ = λ_1 is r > 1). Let D = diag(λ_1, . . . , λ_n), let J = J(λ_1, r) be the r × r Jordan block with eigenvalue λ_1, and let C = diag(J, λ_{r+1}, . . . , λ_n). Then both D and C have characteristic poly f, but are not similar. This follows from the uniqueness of the Jordan canonical form: two Jordan matrices are similar iff for each i and λ, they have the same number of i × i Jordan blocks with eigenvalue λ (but the blocks may appear in different orders). D and C do not satisfy this, so are not similar.

We may also see it directly as follows. Writing C as a 2 × 2 block matrix with first block r × r,

v ∈ E_λ^C  ⟺  (C − λI)v = 0  ⟺  [ J(0, r)      0       ] [ v_1 ] = 0.
                                 [    0     C_22 − λI ] [ v_2 ]

This is equivalent to requiring J(0, r)v_1 = 0 and (C_22 − λI)v_2 = 0. But C_22 has diagonal entries distinct from λ (by choice of r), so C_22 − λI is upper triangular and invertible. So we must have v_2 = 0. Clearly the rank of J(0, r) is r − 1, and its nullity is 1, which means N(J(0, r)) and E_λ^C have dimension 1. But we assumed 1 < r, so dim(E_λ^C) < mult(λ), so that C is not diagonalizable. As D is in fact diagonal, they cannot be similar.
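A concrete instance of the matrices D and C above can be checked numerically. A sketch in NumPy (the eigenvalues 2 and 5 and the sizes are arbitrary illustrative choices):

```python
import numpy as np

# D = diag(2, 2, 5) and C (a 2x2 Jordan block for 2, plus the entry 5)
# have the same characteristic polynomial but are not similar, because
# the eigenspace of C for the eigenvalue 2 is only 1-dimensional.
D = np.diag([2.0, 2.0, 5.0])
C = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# Same characteristic polynomial coefficients:
print(np.allclose(np.poly(D), np.poly(C)))           # True

# dim E_2 for C is the nullity of (C - 2I) = 3 - rank(C - 2I):
rank = np.linalg.matrix_rank(C - 2.0 * np.eye(3))
print(3 - rank)                                      # 1  (< mult(2) = 2)
```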


Question 10. Note that if C is an n × n matrix with columns c_i, then e_i^t C e_j = e_i^t c_j = C_{ij}, and e_i^* = e_i^t (e_i is the standard basis vector, which is real). So

⟨Be_j, e_i⟩ = e_i^* B e_j = e_i^t B e_j = B_{ij},

and

⟨e_j, Ae_i⟩ = (Ae_i)^* e_j = e_i^* A^* e_j = e_i^t A^* e_j = (A^*)_{ij}.

Applying the assumption, B_{ij} = (A^*)_{ij} for each i, j, so B = A^*.
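The entrywise computation above can be mirrored numerically. A sketch in NumPy (the complex matrix A is an arbitrary example; the inner product ⟨u, v⟩ = v*u is conjugate-linear in the second slot, as in the solution):

```python
import numpy as np

# Building B entrywise from B_ij = <B e_j, e_i> = <e_j, A e_i> recovers
# exactly the conjugate transpose A*.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

def inner(u, v):
    """Inner product <u, v> = v* u; np.vdot conjugates its first argument."""
    return np.vdot(v, u)

n = A.shape[0]
E = np.eye(n)
B = np.array([[inner(E[:, j], A @ E[:, i]) for j in range(n)]
              for i in range(n)])
print(np.allclose(B, A.conj().T))   # True
```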

Question 11. Let W = {v ∈ V | ⟨x, v⟩ = 0}. We show that W satisfies the three conditions required of a subspace. By one of the first theorems on inner products, 0 ∈ W. If u, v ∈ W, then ⟨x, u + v⟩ = ⟨x, u⟩ + ⟨x, v⟩ = 0 + 0 = 0, so u + v ∈ W. If u ∈ W and c ∈ F, then ⟨x, cu⟩ = c̄ ⟨x, u⟩ = c̄ · 0 = 0, so cu ∈ W.
