
SSM: Linear Algebra

Chapter 7

7.1
1. If ~v is an eigenvector of A with eigenvalue λ, then A~v = λ~v.
Hence A³~v = A²(A~v) = A²(λ~v) = λA(A~v) = λA(λ~v) = λ²A~v = λ³~v, so ~v is an
eigenvector of A³ with eigenvalue λ³.
3. We know A~v = λ~v, so (A + 2In)~v = A~v + 2In~v = λ~v + 2~v = (λ + 2)~v; hence ~v is an
eigenvector of A + 2In with eigenvalue λ + 2.
5. Assume A~v = λ~v and B~v = μ~v for some eigenvalues λ, μ. Then (A + B)~v = A~v + B~v =
λ~v + μ~v = (λ + μ)~v, so ~v is an eigenvector of A + B with eigenvalue λ + μ.
7. We know A~v = λ~v, so (A − λIn)~v = A~v − λIn~v = λ~v − λ~v = ~0, so the nonzero vector ~v is in
the kernel of A − λIn; hence ker(A − λIn) ≠ {~0} and A − λIn is not invertible.

 
 
   
9. We want [a b; c d][1; 0] = λ[1; 0] for some λ. Hence [a; c] = [λ; 0], i.e., the desired matrices
must have the form [a b; 0 d]; they must be upper triangular.

  

11. We want [a b; c d][2; 3] = [2; 3]. So 2a + 3b = 2 and 2c + 3d = 3. Thus b = (2 − 2a)/3
and d = (3 − 2c)/3, so all matrices of the form [a, (2 − 2a)/3; c, (3 − 2c)/3] will fit.

 
 
13. Solving [−6 6; −15 13][v1; v2] = 4[v1; v2], we get [v1; v2] = [(3/5)t; t] (with t ≠ 0).
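Solutions like this one are easy to spot-check numerically. A minimal sketch in numpy (assuming the signs of the matrix as reconstructed above):

```python
import numpy as np

# Matrix from Problem 13 and the claimed eigenvector for eigenvalue 4
# (taking t = 5 in the family t*[3/5; 1], i.e. the vector [3; 5]).
A = np.array([[-6.0, 6.0],
              [-15.0, 13.0]])
v = np.array([3.0, 5.0])

# A v should equal 4 v if v really is an eigenvector with eigenvalue 4.
check = np.allclose(A @ v, 4.0 * v)
```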
15. Any vector on L is unaffected by the reflection, so a nonzero vector on L is an
eigenvector with eigenvalue 1. Any vector on L⊥ is flipped about L, so a nonzero
vector on L⊥ is an eigenvector with eigenvalue −1. Picking a nonzero vector from L and
one from L⊥, we obtain a basis consisting of eigenvectors.
17. No (real) eigenvalues
19. Any nonzero vector in L is an eigenvector with eigenvalue 1, and any nonzero vector in
the plane L⊥ is an eigenvector with eigenvalue 0. Form a basis consisting of eigenvectors
by picking any nonzero vector in L and any two nonparallel vectors in L⊥.
21. Any nonzero vector in R3 is an eigenvector with eigenvalue 5. Any basis for R3 consists
of eigenvectors.


23. a. Since S = [~v1 ... ~vn], S⁻¹~vi = S⁻¹(S~ei) = ~ei.

b. (ith column of S⁻¹AS) = S⁻¹AS~ei
= S⁻¹A~vi (by definition of S)
= S⁻¹λi~vi (since ~vi is an eigenvector)
= λiS⁻¹~vi
= λi~ei (by part a),

hence S⁻¹AS = [λ1 0 ... 0; 0 λ2 ... 0; ...; 0 0 ... λn], the diagonal matrix with
diagonal entries λ1, λ2, ..., λn.

25. See Figure 7.1.

Figure 7.1: for Problem 7.1.25.


27. See Figure 7.2.
29. See Figure 7.3.
31. See Figure 7.4.


 
Figure 7.2: for Problem 7.1.27.

Figure 7.3: for Problem 7.1.29.

Figure 7.4: for Problem 7.1.31.

33. We are given that ~x(t) = 2^t[1; −1] + 6^t[1; 1], hence we know that the eigenvalues are 2
and 6, with corresponding eigenvectors [1; −1] and [1; 1] respectively (see Fact 7.1.3), so
we want a matrix A such that A[1 1; −1 1] = [2 6; −2 6]. Multiplying on the right by
[1 1; −1 1]⁻¹, we get A = [4 2; 2 4].
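The computation A = SDS⁻¹ can be verified numerically; this sketch rebuilds A from the eigenvectors and eigenvalues above:

```python
import numpy as np

# Columns of S are the eigenvectors [1; -1] and [1; 1];
# D carries the matching eigenvalues 2 and 6.
S = np.array([[1.0, 1.0],
              [-1.0, 1.0]])
D = np.diag([2.0, 6.0])

A = S @ D @ np.linalg.inv(S)  # should come out to [[4, 2], [2, 4]]
```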


35. Let λ be an eigenvalue of S⁻¹AS. Then for some nonzero vector ~v, S⁻¹AS~v = λ~v, i.e.,
AS~v = λS~v, so λ is an eigenvalue of A with eigenvector S~v.
Conversely, if λ is an eigenvalue of A with eigenvector ~w, then A~w = λ~w for some nonzero
~w. Therefore, S⁻¹AS(S⁻¹~w) = S⁻¹A~w = S⁻¹λ~w = λS⁻¹~w, so S⁻¹~w is an eigenvector of
S⁻¹AS with eigenvalue λ.


37. a. A = 5[0.6 0.8; 0.8 −0.6] is a scalar multiple of an orthogonal matrix. By Fact 7.1.2, the
possible eigenvalues of the orthogonal matrix are ±1, so that the possible eigenvalues
of A are ±5. In part b we see that both are indeed eigenvalues.

b. Solve A~v = ±5~v to get ~v1 = [2; 1], ~v2 = [−1; 2].

   
 
39. We want [a b; c d][0; 1] = λ[0; 1] = [0; λ]. So b = 0, and d = λ (for any λ). Thus, we need
matrices of the form [a 0; c d] = a[1 0; 0 0] + c[0 0; 1 0] + d[0 0; 0 1]. So [1 0; 0 0], [0 0; 1 0], [0 0; 0 1]
is a basis of V, and dim(V) = 3.
 
 


 
 
41. We want [a b; c d][1; 1] = λ1[1; 1] and [a b; c d][1; 2] = λ2[1; 2]. So a + b = λ1 = c + d,
a + 2b = λ2, and c + 2d = 2λ2.
So (a + 2b) − (a + b) = λ2 − λ1 = b, and a = λ1 − b = 2λ1 − λ2. Also, (c + 2d) − (c + d) =
2λ2 − λ1 = d, and c = λ1 − d = 2λ1 − 2λ2. So A must be of the form
[2λ1 − λ2, λ2 − λ1; 2λ1 − 2λ2, 2λ2 − λ1] = λ1[2 −1; 2 −1] + λ2[−1 1; −2 2].
So a basis of V is [2 −1; 2 −1], [−1 1; −2 2], and dim(V) = 2.
43. A = AIn = A[~e1 ... ~en] = [λ1~e1 ... λn~en], where the eigenvalues λ1, ..., λn are
arbitrary. Thus A can be any diagonal matrix, and dim(V) = n.
45. Consider a vector ~w that is not parallel to ~v. We want A[~v ~w] = [λ~v, a~v + b~w], where λ, a,
and b are arbitrary constants. Thus the matrices A in V are of the form A = [λ~v, a~v +
b~w][~v ~w]⁻¹. Using Summary 4.1.6, we see that [~v ~0][~v ~w]⁻¹, [~0 ~v][~v ~w]⁻¹, [~0 ~w][~v ~w]⁻¹ is a
basis of V, so that dim(V) = 3.

47. Suppose V is a one-dimensional A-invariant subspace of Rⁿ, and ~v is a nonzero vector
in V. Then A~v will be in V, so that A~v = λ~v for some λ, and ~v is an eigenvector of
A. Conversely, if ~v is any eigenvector of A, then V = span(~v) will be a one-dimensional
A-invariant subspace. Thus the one-dimensional A-invariant subspaces V are of the form
V = span(~v), where ~v is an eigenvector of A.
49. The eigenvalues of the system are λ1 = 1.1 and λ2 = 0.9, and corresponding eigenvectors
are ~v1 = [100; 300] and ~v2 = [200; 100], respectively. So if ~x0 = [100; 800], we can see that
~x0 = 3~v1 − ~v2. Therefore, by Fact 7.1.3, we have ~x(t) = 3(1.1)^t[100; 300] − (0.9)^t[200; 100], i.e.
c(t) = 300(1.1)^t − 200(0.9)^t and r(t) = 900(1.1)^t − 100(0.9)^t.
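A quick way to test a closed-form solution of ~x(t + 1) = A~x(t) is to compare it against direct iteration. The matrix A itself is not printed in the solution, so this sketch rebuilds it from its eigen-decomposition (an assumption of the check, not a claim about the textbook's matrix):

```python
import numpy as np

v1 = np.array([100.0, 300.0])   # eigenvalue 1.1
v2 = np.array([200.0, 100.0])   # eigenvalue 0.9
S = np.column_stack([v1, v2])
A = S @ np.diag([1.1, 0.9]) @ np.linalg.inv(S)

# Iterate x(t+1) = A x(t) from x0 = [100; 800] = 3 v1 - v2.
x = np.array([100.0, 800.0])
t = 10
for _ in range(t):
    x = A @ x

closed_form = 3 * 1.1**t * v1 - 0.9**t * v2
```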




0
.75
c(t)
. Now we will proceed
, and A~v (t) = ~v (t + 1), where A =
51. Let ~v (t) =
1.5 2.25
r(t)
as in the example worked on Pages 292 through 295.









100
0
.75
100
150
100
a. ~v (0) =
, and we see that A~v (0) =
=
= 1.5
.
200
200
300
200
1.5 2.25



100
100
.
= (1.5)t
So, ~v (t) = At~v (0) = At
200
200
So c(t) = 100(1.5)t and r(t) = 200(1.5)t.




  


100
0
.75
100
75
100
b. ~v (0) =
, and we see that A~v (0) =
=
= .75
. So,
100
100
75
100



 1.5 2.25
100
100
~v (t) = At~v (0) = At
= (.75)t
.
100
100
So c(t) = 100(.75)t and r(t) = 100(.75)t.




500
100
c. ~v (0) =
. We can write this in terms of the previous eigenvectors as ~v (0) = 3
+

100 




 700


100
100
100
100
100
.
+ 2(1.5)t
= 3(.75)t
+ At 2
. So, ~v (t) = At~v (0) = At 3
2
200
100
200
100
200
So c(t) = 300(.75)t + 200(1.5)t and r(t) = 300(.75)t + 400(1.5)t .

53. Let ~v(t) = [a(t); b(t); c(t)] be the amount of gold each has after t days, with A~v(t) = ~v(t + 1).
Here a(t + 1) = (1/2)b(t) + (1/2)c(t), etc., so that A = (1/2)[0 1 1; 1 0 1; 1 1 0].
A[1; 1; 1] = [1; 1; 1], so [1; 1; 1] has eigenvalue λ1 = 1. A[1; −1; 0] = [−1/2; 1/2; 0], so [1; −1; 0] has
eigenvalue λ2 = −1/2. Also, A[1; 0; −1] = [−1/2; 0; 1/2], so [1; 0; −1] has eigenvalue λ3 = −1/2.

a. ~v(0) = [6; 1; 2] = 3[1; 1; 1] + 2[1; −1; 0] + [1; 0; −1].
So, ~v(t) = A^t~v(0) = 3A^t[1; 1; 1] + 2A^t[1; −1; 0] + A^t[1; 0; −1]
= 3λ1^t[1; 1; 1] + 2λ2^t[1; −1; 0] + λ3^t[1; 0; −1]
= 3[1; 1; 1] + 2(−1/2)^t[1; −1; 0] + (−1/2)^t[1; 0; −1].
So a(t) = 3 + 3(−1/2)^t, b(t) = 3 − 2(−1/2)^t, and c(t) = 3 − (−1/2)^t.

b. a(365) = 3 + 3(−1/2)^365 = 3 − 3/2^365, b(365) = 3 − 2(−1/2)^365 = 3 + 1/2^364, and
c(365) = 3 − (−1/2)^365 = 3 + 1/2^365.
So, Benjamin will have the most gold.
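Because the answer hinges on a difference of size 2⁻³⁶⁵, floating point is useless here; exact rational arithmetic confirms the closed-form formulas and the final ranking:

```python
from fractions import Fraction as F

# Each day, everyone's new amount is the average of the other two.
# Initial amounts (Anton, Benjamin, Carl) = (6, 1, 2), as in part a.
half = F(1, 2)
a, b, c = F(6), F(1), F(2)
for _ in range(365):
    a, b, c = half * (b + c), half * (a + c), half * (a + b)

q = (-half) ** 365  # (-1/2)^365, kept exact
```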

7.2
1. λ1 = 1, λ2 = 3, by Fact 7.2.2.


3. det(A − λI2) = det[5 − λ, −4; 2, −1 − λ] = λ² − 4λ + 3 = (λ − 1)(λ − 3) = 0, so λ1 = 1, λ2 = 3.


5. det(A − λI2) = det[11 − λ, −15; 6, −7 − λ] = λ² − 4λ + 13, so det(A − λI2) = 0 for no real λ.
7. λ = 1 with algebraic multiplicity 3, by Fact 7.2.2.
9. fA(λ) = (λ − 2)²(λ − 1), so
λ1 = 2 (algebraic multiplicity 2),

λ2 = 1.
11. fA(λ) = −λ³ − λ² − λ − 1 = −(λ + 1)(λ² + 1) = 0, so
λ = −1 (algebraic multiplicity 1).
13. fA(λ) = −λ³ + 1 = −(λ − 1)(λ² + λ + 1), so λ = 1 (algebraic multiplicity 1).

15. fA(λ) = λ² − 2λ + (1 − k) = 0 if λ1,2 = (2 ± √(4 − 4(1 − k)))/2 = 1 ± √k.
The matrix A has 2 distinct real eigenvalues when k > 0, and no real eigenvalues when k < 0.

17. fA(λ) = λ² − a² − b² = 0, so λ1,2 = ±√(a² + b²).
The matrix A represents a reflection about a line followed by a scaling by √(a² + b²), hence
the eigenvalues.
19. True, since fA(λ) = λ² − tr(A)λ + det(A), and the discriminant [tr(A)]² − 4 det(A) is
positive if det(A) is negative.
21. If A has n eigenvalues λ1, ..., λn, then fA(λ) = (λ1 − λ)(λ2 − λ) ··· (λn − λ) =
(−λ)^n + (λ1 + λ2 + ··· + λn)(−λ)^(n−1) + ··· + λ1λ2 ··· λn. But, by Fact 7.2.5, the
coefficient of (−λ)^(n−1) is tr(A). So tr(A) = λ1 + ··· + λn.
23. fB(λ) = det(B − λIn) = det(S⁻¹AS − λIn)
= det(S⁻¹AS − S⁻¹λInS)
= det(S⁻¹(A − λIn)S) = det(S⁻¹) det(A − λIn) det(S)
= (det S)⁻¹ det(A − λIn) det(S) = det(A − λIn) = fA(λ).
Hence, since fA(λ) = fB(λ), A and B have the same eigenvalues.
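The invariance of the characteristic polynomial under similarity is easy to confirm numerically; a sketch with an arbitrarily chosen A and invertible S (these particular matrices are illustrative, not from the text):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [4.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
S = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # det(S) = 2, so S is invertible

B = np.linalg.inv(S) @ A @ S

# np.poly gives the coefficients of det(lambda I - M);
# similar matrices must produce the same coefficients.
pA = np.poly(A)
pB = np.poly(B)
```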
 
  
 

 
25. A[b; c] = [ab + cb; cb + cd] = [(a + c)b; (b + d)c] = [b; c], since a + c = b + d = 1; therefore, [b; c] is an
eigenvector with eigenvalue λ1 = 1.
Also, A[1; −1] = [a − b; c − d] = (a − b)[1; −1], since a − b = −(c − d); therefore, [1; −1] is an
eigenvector with eigenvalue λ2 = a − b. Note that |a − b| < 1; a possible phase portrait
is shown in Figure 7.5.



 
 
27. a. We know ~v1 = [1; 2], λ1 = 1 and ~v2 = [1; −1], λ2 = 1/4. If ~x0 = [1; 0], then ~x0 = (1/3)~v1 + (2/3)~v2,
so by Fact 7.1.3,
x1(t) = 1/3 + (2/3)(1/4)^t,
x2(t) = 2/3 − (2/3)(1/4)^t.
If ~x0 = [0; 1], then ~x0 = (1/3)~v1 − (1/3)~v2, so by Fact 7.1.3,
x1(t) = 1/3 − (1/3)(1/4)^t,
x2(t) = 2/3 + (1/3)(1/4)^t. See Figure 7.6.

Figure 7.5: for Problem 7.2.25.

Figure 7.6: for Problem 7.2.27a.

b. A^t approaches (1/3)[1 1; 2 2] as t → ∞. See part c for a justification.

c. Let us think about the first column of A^t, which is A^t~e1. We can use Fact 7.1.3 to
compute A^t~e1.
Start by writing ~e1 = c1[b; c] + c2[1; −1]; a straightforward computation shows that
c1 = 1/(b + c) and c2 = c/(b + c).
Now A^t~e1 = (1/(b + c))[b; c] + (c/(b + c))(λ2)^t[1; −1], where λ2 = a − b.
Since |λ2| < 1, the second summand goes to zero, so that lim_{t→∞}(A^t~e1) = (1/(b + c))[b; c].
Likewise, lim_{t→∞}(A^t~e2) = (1/(b + c))[b; c], so that lim_{t→∞} A^t = (1/(b + c))[b b; c c].

29. The ith entry of A~e is [ai1 ai2 ··· ain]~e = Σ_{j=1}^n aij = 1, so A~e = ~e and λ = 1 is an eigenvalue
of A, corresponding to the eigenvector ~e.


31. Since A and A^T have the same eigenvalues (by Exercise 22), Exercise 29 states that λ = 1
is an eigenvalue of A, and Exercise 30 says that |λ| ≤ 1 for all eigenvalues λ. Vector ~e
need not be an eigenvector of A; consider A = [0.9 0.9; 0.1 0.1].
33. a. fA(λ) = det(A − λI3) = −λ³ + cλ² + bλ + a.

b. By part a, we have c = 17, b = −5 and a = π, so M = [0 1 0; 0 0 1; π −5 17].
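A numerical check of part b, assuming the companion-type matrix reconstructed above; note that np.poly returns the coefficients of det(λI − M), which is (−1)³fM(λ) here:

```python
import numpy as np

# Companion-type matrix whose characteristic polynomial should be
# -lambda^3 + 17 lambda^2 - 5 lambda + pi.
M = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [np.pi, -5.0, 17.0]])

# Coefficients of lambda^3 - 17 lambda^2 + 5 lambda - pi, highest degree first.
coeffs = np.poly(M)
```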

35. A = [0 −1 0 0; 1 0 0 0; 0 0 0 −1; 0 0 1 0], with fA(λ) = (λ² + 1)².

37. We can write fA(λ) = (λ − λ0)²g(λ), for some polynomial g. The product rule for
derivatives tells us that fA′(λ) = 2(λ − λ0)g(λ) + (λ − λ0)²g′(λ), so that fA′(λ0) = 0, as
claimed.

39. tr(AB) = tr([a b; c d][e f; g h]) = tr[ae + bg, af + bh; ce + dg, cf + dh] = ae + bg + cf + dh.
tr(BA) = tr([e f; g h][a b; c d]) = tr[ea + fc, eb + fd; ga + hc, gb + hd] = ea + fc + gb + hd. So they
are equal.

41. So there exists an invertible S such that B = S 1 AS, and tr(B) =tr(S 1 AS) =tr((S 1 A)S).
By Exercise 40, this equals tr(S(S 1 A)) =tr(A).
43. tr(AB − BA) = tr(AB) − tr(BA) = tr(AB) − tr(AB) = 0, but tr(In) = n, so no such A, B
exist. We have used Exercise 40.
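The identity tr(AB) = tr(BA) used here can be checked on a concrete pair (the matrices below are illustrative, not from the text):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

t1 = np.trace(A @ B)  # trace of AB
t2 = np.trace(B @ A)  # trace of BA; must agree with t1
```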
45. fA(λ) = λ² − tr(A)λ + det(A) = λ² − 2λ + (−3 − 4k). We want fA(5) = 25 − 10 − 3 − 4k = 0,
or 12 − 4k = 0, or k = 3.


47. Let M = [~v1 ~v2]. We want A[~v1 ~v2] = [~v1 ~v2][2 0; 0 3], or [A~v1 A~v2] = [2~v1 3~v2].
Since ~v1 or ~v2 must be nonzero, 2 or 3 must be an eigenvalue of A.
49. As in problem 47, such an M will exist if A has an eigenvalue 2, 3 or 4.

7.3
1. λ1 = 7, λ2 = 9, E7 = ker[0 8; 0 2] = span[1; 0], E9 = ker[−2 8; 0 0] = span[4; 1].
Eigenbasis: [1; 0], [4; 1]
3. λ1 = 4, λ2 = 9, E4 = span[1; 1], E9 = span[3; 2].
Eigenbasis: [1; 1], [3; 2]

5. No real eigenvalues, as fA(λ) = λ² − 2λ + 2.
7. λ1 = 1, λ2 = 2, λ3 = 3, eigenbasis: ~e1, ~e2, ~e3

9. λ1 = λ2 = 1, λ3 = 0, eigenbasis: [1; 0; 0], [0; 1; 0], [1; 0; 1]



11. λ1 = λ2 = 0, λ3 = 3, eigenbasis: [1; −1; 0], [1; 0; −1], [1; 1; 1]

13. λ1 = 0, λ2 = 1, λ3 = −1, eigenbasis: [0; 1; 0], [1; 3; 1], [1; 1; 2]

15. λ1 = 0, λ2 = λ3 = 1, E0 = span[0; 1; 0]. We can use Kyle Numbers to see that
E1 = ker(A − I3) is spanned by a single vector.
There is no eigenbasis, since the eigenvalue 1 has algebraic multiplicity 2, but the geometric
multiplicity is only 1.
17. λ1 = λ2 = 0, λ3 = λ4 = 1,
with eigenbasis [1; 0; 0; 0], [0; 1; 1; 0], [0; 1; 0; 0], [0; 0; 0; 1]
19. Since 1 is the only eigenvalue, with algebraic multiplicity 3, there exists an eigenbasis for
A if (and only if) the geometric multiplicity of the eigenvalue 1 is 3 as well, that is, if
E1 = R³. Now E1 = ker[0 a b; 0 0 c; 0 0 0] is R³ if (and only if) a = b = c = 0.
If a = b = c = 0, then E1 is 3-dimensional, with eigenbasis ~e1, ~e2, ~e3.
If a ≠ 0 and c ≠ 0, then E1 is 1-dimensional; otherwise E1 is 2-dimensional. The
geometric multiplicity of the eigenvalue 1 is dim(E1).

 

   
 
   
21. We want A such that A[1; 2] = [1; 2] and A[2; 3] = 2[2; 3] = [4; 6], i.e. A[1 2; 2 3] = [1 4; 2 6],
so A = [1 4; 2 6][1 2; 2 3]⁻¹ = [5 −2; 6 −2].
The answer is unique.
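The construction "prescribe the images of the eigenvectors, then multiply by the inverse" can be replayed in numpy:

```python
import numpy as np

# Columns of S: the prescribed eigenvectors [1; 2] and [2; 3].
# Columns of target: their required images [1; 2] and [4; 6].
S = np.array([[1.0, 2.0],
              [2.0, 3.0]])
target = np.array([[1.0, 4.0],
                   [2.0, 6.0]])

A = target @ np.linalg.inv(S)  # should be [[5, -2], [6, -2]]
```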
23. λ1 = λ2 = 1 and E1 = span(~e1), hence there is no eigenbasis. The matrix represents a
shear parallel to the x-axis.

25. If λ is an eigenvalue of A, then Eλ = ker(A − λI3) = ker[−λ 1 0; 0 −λ 1; a b c − λ].
The second and third columns of the above matrix aren't parallel, hence Eλ is always
1-dimensional, i.e., the geometric multiplicity of λ is 1.
27. By Fact 7.2.4, we have fA(λ) = λ² − 5λ + 6 = (λ − 3)(λ − 2), so λ1 = 2, λ2 = 3.
29. Note that r is the number of nonzero diagonal entries of A, since the nonzero columns of
A form a basis of im(A). Therefore, there are n r zeros on the diagonal, so that the
algebraic multiplicity of the eigenvalue 0 is n r. It is true for any n n matrix A that
the geometric multiplicity of the eigenvalue 0 is dim(ker(A)) = n rank(A) = n r.
31. They must be the same, for if they were not, then by Fact 7.3.7 the geometric multiplicities
would not add up to n.
33. If S⁻¹AS = B, then
S⁻¹(A − λIn)S = S⁻¹(AS − λS) = S⁻¹AS − λS⁻¹S = B − λIn.
35. No, since the two matrices have different eigenvalues (see Fact 7.3.6c).

37. a. (A~v) · ~w = (A~v)^T~w = (~v^T A^T)~w = (~v^T A)~w = ~v^T(A~w) = ~v · (A~w), since A is symmetric.

b. Assume A~v = λ~v and A~w = μ~w for λ ≠ μ. Then (A~v) · ~w = (λ~v) · ~w = λ(~v · ~w), and
~v · (A~w) = ~v · (μ~w) = μ(~v · ~w).
By part a, λ(~v · ~w) = μ(~v · ~w), i.e., (λ − μ)(~v · ~w) = 0.
Since λ ≠ μ, it must be that ~v · ~w = 0, i.e., ~v and ~w are perpendicular.
39. a. There are two eigenvalues, λ1 = 1 (with E1 = V) and λ2 = 0 (with E0 = V⊥).
Now geometric multiplicity(1) = dim(E1) = dim(V) = m, and
geometric multiplicity(0) = dim(E0) = dim(V⊥) = n − dim(V) = n − m.
Since geometric multiplicity(λ) ≤ algebraic multiplicity(λ), by Fact 7.3.7, and the
algebraic multiplicities cannot add up to more than n, the geometric and algebraic
multiplicities of the eigenvalues are the same here.

b. Analogous to part a: E1 = V and E−1 = V⊥.
geometric multiplicity(1) = algebraic multiplicity(1) = dim(V) = m, and
geometric multiplicity(−1) = algebraic multiplicity(−1) = dim(V⊥) = n − m.

41. The eigenvalues of A are 1.2, 0.8, 0.4, with eigenvectors ~v1 = [9; 6; 2], ~v2 = [2; 2; 1], ~v3 = [1; 2; 2].
Since ~x0 = 50~v1 + 50~v2 + 50~v3, we have ~x(t) = 50(1.2)^t~v1 + 50(0.8)^t~v2 +
50(0.4)^t~v3, so, as t goes to infinity, j(t) : n(t) : a(t) approaches the proportion
9 : 6 : 2.

43. a. A = (1/2)[0 1 1; 1 0 1; 1 1 0]

b. After 10 rounds, we have A^10[7; 11; 5] ≈ [7.6660156; 7.6699219; 7.6640625].
After 50 rounds, we have A^50[7; 11; 5] ≈ [7.66666666667; 7.66666666667; 7.66666666667].

c. The eigenvalues of A are 1 and −1/2, with E1 = span[1; 1; 1] and
E_{−1/2} = span([1; −1; 0], [0; 1; −1]),
so ~x(t) = (1 + c0/3)[1; 1; 1] − (c0/3)(−1/2)^t[1; −1; 0] + (1 − 2c0/3)(−1/2)^t[0; 1; −1].
After 1001 rounds, Alberich will be ahead of Brunnhilde by (1/2)^1001, so that Carl
needs to beat Alberich to win the game. A straightforward computation shows that
c(1001) − a(1001) = (1/2)^1001(1 − c0); Carl wins if this quantity is positive, which is the
case if c0 is less than 1.
Alternatively, observe that the ranking of the players is reversed in each round: whoever is first will be last after the next round. Since the total number of rounds is odd
(1001), Carl wants to be last initially to win the game; he wants to choose a smaller
number than both Alberich and Brunnhilde.
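The parity argument can be tested by exact simulation. This sketch assumes starting amounts 1 and 2 for Alberich and Brunnhilde (consistent with the differences computed above) and lets Carl pick c0 = 1/2 < 1; after any odd number of rounds Carl should then be ahead of Alberich, and Alberich ahead of Brunnhilde. Eleven rounds stand in for 1001, since only the parity matters for the ordering:

```python
from fractions import Fraction as F

# Each round, every player's new amount is the average of the other two.
half = F(1, 2)
a, b, c = F(1), F(2), F(1, 2)   # Alberich, Brunnhilde, Carl (c0 = 1/2)
for _ in range(11):             # an odd number of rounds
    a, b, c = half * (b + c), half * (a + c), half * (a + b)
```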


 
45. a. A = [0.1 0.2; 0.4 0.3], ~b = [1; 2]

b. B = [A ~b; 0 1]

c. The eigenvalues of A are 0.5 and −0.1, with associated eigenvectors [1; 2] and [1; −1].
The eigenvalues of B are 0.5, −0.1, and 1. If A~v = λ~v, then B[~v; 0] = λ[~v; 0], so [~v; 0] is
an eigenvector of B.
Furthermore, [2; 4; 1] is an eigenvector of B corresponding to the eigenvalue 1. Note that
this vector is [−(A − I2)⁻¹~b; 1].

d. Write ~y(0) = [x1(0); x2(0); 1] = c1[1; 2; 0] + c2[1; −1; 0] + c3[2; 4; 1].
Note that c3 = 1.
Now ~y(t) = c1(0.5)^t[1; 2; 0] + c2(−0.1)^t[1; −1; 0] + [2; 4; 1], so that ~x(t) → [2; 4] as t → ∞.

47. a. If ~x(t) = [r(t); p(t); w(t)], then ~x(t + 1) = A~x(t) with A = [1/2 1/4 0; 1/2 1/2 1/2; 0 1/4 1/2].
The eigenvalues of A are 0, 1/2, 1, with eigenvectors [1; −2; 1], [1; 0; −1], [1; 2; 1].
Since ~x(0) = [1; 0; 0] = (1/4)[1; −2; 1] + (1/2)[1; 0; −1] + (1/4)[1; 2; 1], we have
~x(t) = (1/2)(1/2)^t[1; 0; −1] + (1/4)[1; 2; 1] for t > 0.

b. As t → ∞ the ratio approaches 1 : 2 : 1 (since the first term of ~x(t) drops out).


49. This random matrix A = [~0 ~v2 ... ~vn] is unlikely to have any zeros above the diagonal. In this case, the columns ~v2, ..., ~vn will be linearly independent (none of them is
redundant), so that rank(A) = n − 1 and geometric multiplicity(0) = dim(ker(A)) =
n − rank(A) = 1. Alternatively, you can argue in terms of rref(A).

7.4
1. Matrix A is diagonal already, so it's certainly diagonalizable. Let S = I2.
  

1
1
. If we
,
3. Diagonalizable. The eigenvalues are 0,3, with associated eigenvectors
2
1




1 1
0 0
let S =
, then S 1 AS = D =
.
1 2
0 3
5. Fails to be diagonalizable. There is only one eigenvalue, 1, with a one-dimensional
eigenspace.
  

7. Diagonalizable. The eigenvalues are 2, 3, with associated eigenvectors [4; 1], [1; 1]. If
we let S = [4 1; 1 1], then S⁻¹AS = D = [2 0; 0 3].
9. Fails to be diagonalizable. There is only one eigenvalue, 1, with a one-dimensional
eigenspace.
11. Fails to be diagonalizable. The eigenvalues are 1, 2, 1, and the eigenspace
E1 = ker(A − I3) = span(~e1) is only one-dimensional.

13. Diagonalizable. The eigenvalues are 1, 2, 3, with associated eigenvectors [1; 0; 0], [1; 1; 0], [1; 2; 1].
If we let S = [1 1 1; 0 1 2; 0 0 1], then S⁻¹AS = D = [1 0 0; 0 2 0; 0 0 3].

15. Diagonalizable. The eigenvalues are 1, −1, 1, with associated eigenvectors [2; 1; 0], [1; 1; 0], [0; 0; 1].
If we let S = [2 1 0; 1 1 0; 0 0 1], then S⁻¹AS = D = [1 0 0; 0 −1 0; 0 0 1].

17. Diagonalizable. The eigenvalues are 0, 3, 0, with associated eigenvectors [1; −1; 0], [1; 1; 1], [1; 0; −1].
If we let S = [1 1 1; −1 1 0; 0 1 −1], then S⁻¹AS = D = [0 0 0; 0 3 0; 0 0 0].
19. Fails to be diagonalizable. The eigenvalues are 1, 0, 1, and the eigenspace E1 = ker(A − I3)
= span(~e1) is only one-dimensional.
21. Diagonalizable for all values of a, since there are always two distinct eigenvalues, 1 and
2. See Fact 7.4.3.
23. Diagonalizable for positive a. The characteristic polynomial is (λ − 1)² − a, so that the
eigenvalues are λ = 1 ± √a. If a is positive, then we have two distinct real eigenvalues,
so that the matrix is diagonalizable. If a is negative, then there are no real eigenvalues.
If a is 0, then 1 is the only eigenvalue, with a one-dimensional eigenspace.

25. Diagonalizable for all values of a, b, and c, since we have three distinct eigenvalues, 1, 2,
and 3.
27. Diagonalizable only if a = b = c = 0. Since 1 is the only eigenvalue, it is required that
E1 = R3 , that is, the matrix must be the identity matrix.
29. Not diagonalizable for any a. The characteristic polynomial is −λ³ + a, so that there is
only one real eigenvalue, λ = ∛a, for all a. Since the corresponding eigenspace isn't all of R³,
the matrix fails to be diagonalizable.


31. In Example 2 of Section 7.3 we see that the eigenvalues of A = [1 2; 4 3] are −1 and 5,
with associated eigenvectors [1; −1] and [1; 2]. If we let S = [1 1; −1 2], then S⁻¹AS =
D = [−1 0; 0 5].
Thus A = SDS⁻¹ and A^t = SD^tS⁻¹ = (1/3)[1 1; −1 2][(−1)^t 0; 0 5^t][2 −1; 1 1]
= (1/3)[2(−1)^t + 5^t, (−1)^(t+1) + 5^t; 2(5^t) − 2(−1)^t, 2(5^t) + (−1)^t].
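The formula A^t = SD^tS⁻¹ is easy to confirm against direct matrix powers:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])
S = np.array([[1.0, 1.0],
              [-1.0, 2.0]])     # eigenvectors as columns
D = np.diag([-1.0, 5.0])        # matching eigenvalues

t = 6
At = S @ np.linalg.matrix_power(D, t) @ np.linalg.inv(S)
```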




33. The eigenvalues of A = [1 2; 3 6] are 0 and 7, with associated eigenvectors [−2; 1] and [1; 3].
If we let S = [−2 1; 1 3], then S⁻¹AS = D = [0 0; 0 7]. Thus A = SDS⁻¹ and
A^t = SD^tS⁻¹ = [−2 1; 1 3][0 0; 0 7^t]S⁻¹ = (1/7)[7^t, 2(7^t); 3(7^t), 6(7^t)] = 7^(t−1)A. We can
find the same result more directly by observing that A² = 7A.


35. Matrix [−1 6; −2 6] has the eigenvalues 3 and 2. If ~v and ~w are associated eigenvectors, and
if we let S = [~v ~w], then S⁻¹[−1 6; −2 6]S = [3 0; 0 2], so that matrix [−1 6; −2 6] is indeed
similar to [3 0; 0 2].

37. Yes. Matrices A and B have the same characteristic polynomial, λ² − 7λ + 7, so that
they have the same two distinct real eigenvalues λ1,2 = (7 ± √21)/2. Thus both A and B are
similar to the diagonal matrix [λ1 0; 0 λ2], by Algorithm 7.4.4. Therefore A is similar to
B, by parts b and c of Fact 3.4.6.
39. The eigenfunctions with eigenvalue λ are the nonzero functions f(x) such that T(f(x)) =
f′(x) − f(x) = λf(x), or f′(x) = (λ + 1)f(x). From calculus we recall that those are the
exponential functions of the form f(x) = Ce^((λ+1)x), where C is a nonzero constant. Thus
all real numbers λ are eigenvalues of T, and the eigenspace Eλ is one-dimensional, spanned
by e^((λ+1)x).
41. The nonzero symmetric matrices are eigenmatrices with eigenvalue 2, since L(A) = A +
A^T = 2A in this case. The nonzero skew-symmetric matrices have eigenvalue 0, since
L(A) = A + A^T = A − A = 0. Yes, L is diagonalizable, since we have the eigenbasis
[1 0; 0 0], [0 1; 1 0], [0 0; 0 1], [0 1; −1 0] (three symmetric matrices, and one skew-symmetric
one).
43. The nonzero real numbers are eigenvectors with eigenvalue 1, and the nonzero imaginary
numbers (of the form iy) are eigenvectors with eigenvalue −1. Yes, T is diagonalizable,
since we have the eigenbasis 1, i.
45. The nonzero sequence (x0, x1, x2, ...) is an eigensequence with eigenvalue λ if
T(x0, x1, x2, ...) = (0, x0, x1, x2, ...) = λ(x0, x1, x2, ...) = (λx0, λx1, λx2, ...). This
means that 0 = λx0, x0 = λx1, x1 = λx2, ..., xn = λx_(n+1), .... If λ is nonzero, then
these equations imply that x0 = (1/λ)0 = 0, x1 = (1/λ)x0 = 0, x2 = (1/λ)x1 = 0, ..., so that there
are no eigensequences in this case. If λ = 0, then we have x0 = λx1 = 0, x1 = λx2 =
0, x2 = λx3 = 0, ..., so that there aren't any eigensequences either. In summary: there
are no eigenvalues and eigensequences for T.
47. The nonzero even functions, of the form f(x) = a + cx², are eigenfunctions with eigenvalue
1, and the nonzero odd functions, of the form f(x) = bx, have eigenvalue −1. Yes, T is
diagonalizable, since the standard basis, 1, x, x², is an eigenbasis for T.

49. The matrix of T with respect to the standard basis 1, x, x² is B = [1 −1 1; 0 3 −6; 0 0 9]. The
eigenvalues of B are 1, 3, 9, with corresponding eigenvectors [1; 0; 0], [−1; 2; 0], [1; −4; 4]. The
eigenvalues of T are 1, 3, 9, with corresponding eigenfunctions 1, 2x − 1, 4x² − 4x + 1 =
(2x − 1)². Yes, T is diagonalizable, since the functions 1, 2x − 1, (2x − 1)² form an
eigenbasis.
51. The nonzero constant functions f(x) = b are the eigenfunctions, with eigenvalue 0. If f(x)
is a polynomial of degree ≥ 1, then the degree of f(x) exceeds the degree of f′(x) by 1
(by the power rule of calculus), so that f′(x) cannot be a scalar multiple of f(x). Thus
0 is the only eigenvalue of T, and the eigenspace E0 consists of the constant functions.
53. Suppose basis D consists of f1, ..., fn. We are told that the D-matrix D of T is diagonal;
let λ1, λ2, ..., λn be the diagonal entries of D. By Fact 4.3.3, we know that [T(fi)]D =
(ith column of D) = λi~ei, for i = 1, 2, ..., n, so that T(fi) = λifi, by definition of
coordinates. Thus f1, ..., fn is an eigenbasis for T, as claimed.




55. Let A = [0 1; 0 0] and B = [1 0; 0 0], for example.

 


57. Modifying the hint in Exercise 56 slightly, we can write [AB 0; B 0][Im A; 0 In] =
[Im A; 0 In][0 0; B BA]. Thus matrix M = [AB 0; B 0] is similar to N = [0 0; B BA]. By Fact 7.3.6a,
matrices M and N have the same characteristic polynomial.
Now fM(λ) = det[AB − λIm, 0; B, −λIn] = (−λ)^n det(AB − λIm) = (−λ)^n fAB(λ). To
understand the second equality, consider Fact 6.1.8. Likewise, fN(λ)
= det[−λIm, 0; B, BA − λIn] = (−λ)^m fBA(λ).
It follows that (−λ)^n fAB(λ) = (−λ)^m fBA(λ). Thus matrices AB and BA have the same
nonzero eigenvalues, with the same algebraic multiplicities.
If mult(AB) and mult(BA) are the algebraic multiplicities of 0 as an eigenvalue of AB
and BA, respectively, then the equation (−λ)^n fAB(λ) = (−λ)^m fBA(λ) implies that
n + mult(AB) = m + mult(BA).
59. If ~v is an eigenvector with eigenvalue λ, then
fA(A)~v = ((−A)^n + a_(n−1)A^(n−1) + ··· + a1A + a0In)~v
= (−λ)^n~v + a_(n−1)λ^(n−1)~v + ··· + a1λ~v + a0~v
= ((−λ)^n + a_(n−1)λ^(n−1) + ··· + a1λ + a0)~v
= fA(λ)~v = 0~v = ~0.
Since A is diagonalizable, any vector ~x in Rⁿ can be written as a linear combination of
eigenvectors, so that fA(A)~x = ~0. Since this equation holds for all ~x in Rⁿ, we have
fA(A) = 0, as claimed.

61. a. B is diagonalizable since it has three distinct eigenvalues, so that S⁻¹BS is diagonal
for some invertible S. But S⁻¹AS = S⁻¹I3S = I3 is diagonal as well. Thus A and B
are indeed simultaneously diagonalizable.
b. There is an invertible S such that S⁻¹AS = D1 and S⁻¹BS = D2 are both diagonal. Then A = SD1S⁻¹ and B = SD2S⁻¹, so that AB = (SD1S⁻¹)(SD2S⁻¹) =
SD1D2S⁻¹ and BA = (SD2S⁻¹)(SD1S⁻¹) = SD2D1S⁻¹. These two results agree,
since D1D2 = D2D1 for the diagonal matrices D1 and D2.


c. Let A be In and B a nondiagonalizable n × n matrix; for example, A = [1 0; 0 1] and
B = [1 1; 0 1].
d. Suppose BD = DB for a diagonal D with distinct diagonal entries. The ijth entry of
the matrix BD = DB is bij·djj = dii·bij. For i ≠ j this implies that bij = 0. Thus B
must be diagonal.

e. Since A has n distinct eigenvalues, A is diagonalizable; that is, there is an invertible S
such that S⁻¹AS = D is a diagonal matrix with n distinct diagonal entries. We claim
that S⁻¹BS is diagonal as well; by part d it suffices to show that S⁻¹BS commutes
with D = S⁻¹AS. This is easy to verify:
(S⁻¹BS)D = (S⁻¹BS)(S⁻¹AS) = S⁻¹BAS = S⁻¹ABS = (S⁻¹AS)(S⁻¹BS) = D(S⁻¹BS).

63. Recall from Exercise 62 that all the eigenspaces are two-dimensional.


a. We need to solve the differential equation f″(x) = f(x). As in Example 18 of Section 4.1, we will look for exponential solutions. The function f(x) = e^(kx) is a solution
if k² = 1, or k = ±1. Thus the eigenspace E1 is the span of the functions e^x and e^(−x).
b. We need to solve the differential equation f″(x) = 0. Integration gives f′(x) = C, a
constant. If we integrate again, we find f(x) = Cx + c, where c is another arbitrary
constant. Thus E0 = span(1, x).
c. The solutions of the differential equation f″(x) = −f(x) are the functions f(x) =
a cos(x) + b sin(x), so that E−1 = span(cos x, sin x). See the introductory example of
Section 4.1 and Exercise 4.1.58.
d. Modifying part c, we see that the solutions of the differential equation f″(x) = −4f(x)
are the functions f(x) = a cos(2x) + b sin(2x), so that E−4 = span(cos(2x), sin(2x)).

65. Let's write S in terms of its columns, as S = [~v ~w].
We want A[~v ~w] = [~v ~w][5 0; 0 −1], or [A~v A~w] = [5~v −~w]; that is, we want
~v to be in the eigenspace E5, and ~w in E−1. We find that E5 = span[1; 2] and E−1 =
span[1; −1], so that S must be of the form [a b; 2a −b] = a[1 0; 2 0] + b[0 1; 0 −1].
Thus, a basis of the space V is [1 0; 2 0], [0 1; 0 −1], and dim(V) = 2.

67. Let E1 = span(~v1, ~v2, ~v3) and E2 = span(~w1, ~w2). As in Exercise 65, we can see that S
must be of the form [~x1 ~x2 ~x3 ~x4 ~x5], where ~x1, ~x2 and ~x3 are in E1 and ~x4 and ~x5
are in E2. Thus, we can write ~x1 = c1~v1 + c2~v2 + c3~v3, for example, or ~x5 = d1~w1 + d2~w2.
Using Summary 4.1.6, we find a basis: [~v1 ~0 ~0 ~0 ~0], [~v2 ~0 ~0 ~0 ~0],
[~v3 ~0 ~0 ~0 ~0], [~0 ~v1 ~0 ~0 ~0], [~0 ~v2 ~0 ~0 ~0], [~0 ~v3 ~0 ~0 ~0],
[~0 ~0 ~v1 ~0 ~0], [~0 ~0 ~v2 ~0 ~0], [~0 ~0 ~v3 ~0 ~0], [~0 ~0 ~0 ~w1 ~0],
[~0 ~0 ~0 ~w2 ~0], [~0 ~0 ~0 ~0 ~w1], [~0 ~0 ~0 ~0 ~w2].
Thus, the dimension of the space of matrices S is 3 + 3 + 3 + 2 + 2 = 13.

7.5
1. z = 3 − 3i, so |z| = √(3² + (−3)²) = √18 and arg(z) = −π/4,
so z = √18(cos(−π/4) + i sin(−π/4)).

3. If z = r(cos θ + i sin θ), then z^n = r^n(cos(nθ) + i sin(nθ)).
z^n = 1 if r = 1, cos(nθ) = 1, and sin(nθ) = 0, so nθ = 2kπ for an integer k, and θ = 2kπ/n,
i.e. z = cos(2kπ/n) + i sin(2kπ/n), k = 0, 1, 2, ..., n − 1. See Figure 7.7.

Figure 7.7: for Problem 7.5.3.
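The n solutions can be generated with cmath; each z = cos(2kπ/n) + i sin(2kπ/n) should satisfy z^n = 1 (here with n = 6 as an example):

```python
import cmath

n = 6
# The nth roots of unity: e^(2*pi*i*k/n) = cos(2k*pi/n) + i*sin(2k*pi/n).
roots = [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]

# How far each candidate is from actually solving z^n = 1.
errors = [abs(z ** n - 1) for z in roots]
```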





5. Let z = r(cos θ + i sin θ); then w = r^(1/n)(cos((θ + 2kπ)/n) + i sin((θ + 2kπ)/n)), k = 0, 1, 2, ..., n − 1.

7. |T(z)| = |z|√2 and arg(T(z)) = arg(1 − i) + arg(z) = −π/4 + arg(z), so T is a clockwise
rotation by π/4 followed by a scaling by √2.


9. |z| = √(0.8² + 0.7²) = √1.13 ≈ 1.06, arg(z) = −arctan(0.7/0.8) ≈ −0.72. See Figure 7.8.

Figure 7.8: for Problem 7.5.9.

The trajectory spirals outward, in the clockwise direction.

11. Notice that f(1) = 0, so λ = 1 is a root of f(λ). Hence f(λ) = (λ − 1)g(λ), where
g(λ) = f(λ)/(λ − 1) = λ² − 2λ + 5. Setting g(λ) = 0 we get λ = 1 ± 2i, so that f(λ) =
(λ − 1)(λ − 1 − 2i)(λ − 1 + 2i).
13. Yes, Q is a field. Check the axioms on Page 347.
15. Yes, check the axioms on Page 347. (additive identity 0 and multiplicative identity 1)
17. No, since multiplication is not commutative; Axiom 5 does not hold.

19. a. Since A has eigenvalues 1 and 0, associated with V and V⊥ respectively, and since V is
the eigenspace of λ = 1, by Fact 7.5.5, tr(A) = m, det(A) = 0.
b. Since B has eigenvalues 1 and −1, associated with V and V⊥ respectively, and since V is
the eigenspace associated with λ = 1, tr(B) = m − (n − m) = 2m − n, det(B) = (−1)^(n−m).

21. fA(λ) = (11 − λ)(−7 − λ) + 90 = λ² − 4λ + 13, so λ1,2 = 2 ± 3i.


23. fA(λ) = −λ³ + 1 = −(λ − 1)(λ² + λ + 1), so λ1 = 1, λ2,3 = (−1 ± √3 i)/2.

25. fA(λ) = λ⁴ − 1 = (λ² − 1)(λ² + 1) = (λ − 1)(λ + 1)(λ − i)(λ + i), so λ1,2 = ±1 and λ3,4 = ±i.


27. By Fact 7.5.5, tr(A) = λ1 + λ2 + λ3 and det(A) = λ1λ2λ3, but λ1 = λ2 ≠ λ3 by assumption,
so tr(A) = 1 = 2λ2 + λ3 and det(A) = 3 = λ2²λ3.
Solving for λ2, λ3 we get −1, 3; hence λ1 = λ2 = −1 and λ3 = 3. (Note that the eigenvalues
must be real; why?)
29. tr(A) = 0, so λ1 + λ2 + λ3 = 0.
Also, we can compute det(A) = bcd > 0 since b, c, d > 0. Therefore, λ1λ2λ3 > 0.
Hence two of the eigenvalues must be negative, and the largest one (in absolute value)
must be positive.
31. No matter how we choose A, (1/5)A is a regular transition matrix, so that lim_{t→∞}((1/5)A)^t is
a matrix with identical columns, by Exercise 30. Therefore, the columns of A^t become
more and more alike as t approaches infinity, in the sense that
lim_{t→∞} (ijth entry of A^t)/(ikth entry of A^t) = 1 for all i, j, k.


33. a. C is obtained from B by dividing each column of B by its first component. Thus, the
first row of C will consist of 1's.
b. We observe that the columns of C are almost identical, so that the columns of B are
almost parallel (that is, almost scalar multiples of each other).
c. Let λ1, λ2, ..., λ5 be the eigenvalues. Assume λ1 is real and positive and λ1 > |λj| for
2 ≤ j ≤ 5.
Let ~v1, ..., ~v5 be corresponding eigenvectors. For a fixed i, write ~ei = Σ_{j=1}^5 cj~vj; then
(ith column of A^t) = A^t~ei = c1λ1^t~v1 + ··· + c5λ5^t~v5.
But in the last expression, for large t, the first term is dominant, so the ith column of
A^t is almost parallel to ~v1, the eigenvector corresponding to the dominant eigenvalue.
d. By part c, the columns of B and C are almost eigenvectors of A associated with the
largest eigenvalue, λ1. Since the first row of C consists of 1's, the entries in the first
row of AC will be close to λ1.

35. We have fA(λ) = (λ1 − λ)(λ2 − λ) ··· (λn − λ)
= (−λ)^n + (λ1 + λ2 + ··· + λn)(−λ)^(n−1) + ··· + λ1λ2···λn. But, by Fact 7.2.5, the
coefficient of (−λ)^(n−1) is tr(A). So tr(A) = λ1 + ··· + λn.
37. a. Use that conj(w + z) = w̄ + z̄ and conj(wz) = w̄z̄:
[w1, −z̄1; z1, w̄1] + [w2, −z̄2; z2, w̄2] = [w1 + w2, −(z̄1 + z̄2); z1 + z2, w̄1 + w̄2] is in H, and
[w1, −z̄1; z1, w̄1] [w2, −z̄2; z2, w̄2] = [w1w2 − z̄1z2, −conj(z1w2 + w̄1z2); z1w2 + w̄1z2, conj(w1w2 − z̄1z2)] is in H.
b. If A in H is nonzero, then det(A) = ww̄ + zz̄ = |w|² + |z|² > 0, so that A is invertible.
c. Yes; if A = [w, −z̄; z, w̄], then A⁻¹ = (1/(|w|² + |z|²)) [w̄, z̄; −z, w] is in H.
d. For example, if A = [i, 0; 0, −i] and B = [0, 1; −1, 0], then AB = [0, i; i, 0] and
BA = [0, −i; −i, 0], so AB ≠ BA.
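These calculations are easy to spot-check numerically. The sketch below builds matrices of the form [w, −z̄; z, w̄] and verifies closure (part a) and the noncommuting pair from part d; the helper names are my own:

```python
import numpy as np

def h(w, z):
    """The matrix [w, -conj(z); z, conj(w)] in H."""
    return np.array([[w, -np.conj(z)], [z, np.conj(w)]])

def in_H(M, tol=1e-12):
    """Check the defining pattern of H."""
    return (abs(M[1, 1] - np.conj(M[0, 0])) < tol and
            abs(M[0, 1] + np.conj(M[1, 0])) < tol)

A = h(1 + 2j, 3 - 1j)
B = h(0.5j, -2 + 1j)
print(in_H(A + B), in_H(A @ B))   # closure under + and *, part a

# Part d: AB != BA for A = [i, 0; 0, -i], B = [0, 1; -1, 0].
A2, B2 = h(1j, 0), h(0, -1)
print(A2 @ B2)   # [[0, i], [i, 0]]
print(B2 @ A2)   # [[0, -i], [-i, 0]]
```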

Figure 7.9: for Problem 7.5.39.

39. Figure 7.9 illustrates how Cn acts on the standard basis vectors ~e1, ~e2, . . . , ~en of R^n.
a. Based on Figure 7.9, we see that Cn^k takes ~ei to ~e_{i+k} modulo n; that is, if i + k
exceeds n, then Cn^k takes ~ei to ~e_{i+k−n} (for k = 1, . . . , n − 1).
To put it differently: Cn^k is the matrix whose ith column is ~e_{i+k} if i + k ≤ n and ~e_{i+k−n}
if i + k > n (for k = 1, . . . , n − 1).
b. The characteristic polynomial is 1 − λ^n, so that the eigenvalues are the n distinct
solutions of the equation λ^n = 1 (the so-called nth roots of unity), equally spaced
points along the unit circle, λk = cos(2πk/n) + i sin(2πk/n), for k = 0, 1, . . . , n − 1 (compare
with Exercise 5 and Figure 7.7). For each eigenvalue λk,
~vk = [λk^{n−1}; . . . ; λk²; λk; 1] is an associated eigenvector.
c. The eigenbasis ~v0, ~v1, . . . , ~v_{n−1} for Cn we found in part b is in fact an eigenbasis for
all circulant n × n matrices.
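Part b can be spot-checked for a small n. The sketch below builds the cyclic permutation matrix that sends ~ei to ~e_{i+1} and verifies that the roots-of-unity vector from part b is an eigenvector (a numerical check, not a proof):

```python
import numpy as np

n = 6
# C_n sends e_i to e_(i+1), indices taken modulo n.
C = np.zeros((n, n))
for i in range(n):
    C[(i + 1) % n, i] = 1.0

k = 2
lam = np.exp(2j * np.pi * k / n)        # an n-th root of unity
v = lam ** np.arange(n - 1, -1, -1)     # (lam^(n-1), ..., lam^2, lam, 1)^T

print(np.allclose(C @ v, lam * v))      # True: v is an eigenvector for lam
print(np.allclose(np.abs(np.linalg.eigvals(C)), 1.0))  # all eigenvalues on the unit circle
```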

41. Substitute λ = 1/x into 14λ² + 12λ³ − 1 = 0:
14/x² + 12/x³ − 1 = 0
14x + 12 − x³ = 0
x³ − 14x = 12

Now use the formula derived in Exercise 40 to find x, with p = 14 and q = 12. There
is only one positive solution, x ≈ 4.114, so that λ = 1/x ≈ 0.243.
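The positive root can also be found with a numerical root finder instead of the formula from Exercise 40, as a cross-check:

```python
import numpy as np

# Roots of x^3 - 14x - 12 = 0 (coefficients of x^3, x^2, x, 1).
roots = np.roots([1.0, 0.0, -14.0, -12.0])
x = max(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0)
lam = 1 / x
print(x, lam)   # x ≈ 4.114, lambda ≈ 0.243
```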
43. Note that f(z) is not the zero polynomial, since f(i) = det(S1 + iS2) = det(S) ≠ 0, as
S is invertible. A nonzero polynomial has only finitely many zeros, so that there is a
real number x such that f(x) = det(S1 + xS2) ≠ 0, that is, S1 + xS2 is invertible. Now
SB = AS, or (S1 + iS2)B = A(S1 + iS2). Considering the real and the imaginary parts, we
can conclude that S1B = AS1 and S2B = AS2, and therefore (S1 + xS2)B = A(S1 + xS2).
Since S1 + xS2 is invertible, we have B = (S1 + xS2)⁻¹A(S1 + xS2), as claimed.

45. If a ≠ 0, then there are two distinct eigenvalues, 1 ± √a, so that the matrix is diagonalizable. If a = 0, then [1, 1; a, 1] = [1, 1; 0, 1] fails to be diagonalizable.

47. If a ≠ 0, then there are three distinct eigenvalues, 0 and ±√a, so that the matrix is diagonalizable. If a = 0, then [0, 0, 0; 1, 0, a; 0, 1, 0] = [0, 0, 0; 1, 0, 0; 0, 1, 0] fails to be diagonalizable.
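For a sample nonzero value of a, the three distinct eigenvalues 0 and ±√a are easy to confirm numerically (a = 4 here, chosen only for the check):

```python
import numpy as np

a = 4.0
A = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, a],
              [0.0, 1.0, 0.0]])
eigs = np.sort(np.linalg.eigvals(A).real)
print(eigs)   # approximately [-2, 0, 2], i.e. 0 and ±sqrt(a)
```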
49. The eigenvalues are 0, 1, and a − 1. If a is neither 1 nor 2, then there are three distinct
eigenvalues, so that the matrix is diagonalizable. Conversely, if a = 1 or a = 2, then the
matrix fails to be diagonalizable, since all the eigenspaces will be one-dimensional (verify
this!).

7.6
1. λ1 = 0.9, λ2 = 0.8, so, by Fact 7.6.2, ~0 is a stable equilibrium.

3. λ1,2 = 0.8 ± (0.7)i, so |λ1| = |λ2| = √(0.64 + 0.49) > 1, so ~0 is not a stable equilibrium.


5. λ1 = 0.8, λ2 = 1.1, so ~0 is not a stable equilibrium.

7. λ1,2 = 0.9 ± (0.5)i, so |λ1| = |λ2| = √(0.81 + 0.25) > 1, and ~0 is not a stable equilibrium.
9. λ1,2 = 0.8 ± (0.6)i, λ3 = 0.7, so |λ1| = |λ2| = 1, and ~0 is not a stable equilibrium.
11. λ1 = k, λ2 = 0.9, so ~0 is a stable equilibrium if |k| < 1.
13. Since λ1 = 0.7, λ2 = 0.9, ~0 is a stable equilibrium regardless of the value of k.
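These checks all apply Fact 7.6.2: the zero state is stable exactly when every eigenvalue has modulus less than 1. A small helper, with illustrative matrices assumed (the exercises' own matrices are not reproduced in this manual):

```python
import numpy as np

def is_stable(A):
    """Zero state of x(t+1) = A x(t) is stable iff all |lambda| < 1 (Fact 7.6.2)."""
    return bool(np.all(np.abs(np.linalg.eigvals(A)) < 1))

# Eigenvalues 0.9 and 0.8, as in Exercise 1:
print(is_stable(np.diag([0.9, 0.8])))                    # True
# Eigenvalues 0.8 ± 0.7i, modulus sqrt(0.64 + 0.49) > 1, as in Exercise 3:
print(is_stable(np.array([[0.8, -0.7], [0.7, 0.8]])))    # False
```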

15. λ1,2 = 1 ± (1/10)√k.
If k ≥ 0, then λ1 = 1 + (1/10)√k ≥ 1. If k < 0, then |λ1| = |λ2| > 1. Thus, the zero state isn't
a stable equilibrium for any real k.

17. λ1,2 = 0.6 ± (0.8)i = 1(cos θ ± i sin θ), where
θ = arctan(0.8/0.6) = arctan(4/3) ≈ 0.927.
E_{λ1} = ker [−0.8i, 0.8; −0.8, −0.8i] = span [1; i], so ~w = [0; 1] and ~v = [1; 0].
~x0 = [0; 1] = 1~w + 0~v, so a = 1 and b = 0. Now we use Fact 7.6.3:
~x(t) = [~w ~v] [cos(θt), −sin(θt); sin(θt), cos(θt)] [1; 0] = [0, 1; 1, 0] [cos(θt); sin(θt)] = [sin(θt); cos(θt)], where
θ = arctan(4/3) ≈ 0.927.
The trajectory is the circle shown in Figure 7.10.

Figure 7.10: for Problem 7.6.17.
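The closed form can be checked against direct iteration. The matrix below is an assumption on my part: a matrix with eigenvalues 0.6 ± 0.8i that is consistent with the eigenvector computation in the solution (the exercise's own statement is not reproduced here):

```python
import numpy as np

theta = np.arctan(4 / 3)                  # ≈ 0.927
A = np.array([[0.6, 0.8],
              [-0.8, 0.6]])               # assumed system matrix, eigenvalues 0.6 ± 0.8i
x = np.array([0.0, 1.0])                  # x_0

for t in range(1, 8):
    x = A @ x                             # iterate x(t+1) = A x(t)

closed = np.array([np.sin(7 * theta), np.cos(7 * theta)])
print(x, closed)   # the two agree; the trajectory stays on the unit circle
```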


19. λ1,2 = 2 ± 3i, r = √13, and θ = arctan(3/2) ≈ 0.98, so
λ1 = √13(cos(0.98) + i sin(0.98)), [~w ~v] = [0, 1; 1, 0], [a; b] = [1; 0], and
~x(t) ≈ (√13)^t [sin(0.98t); cos(0.98t)].
The trajectory spirals outwards; see Figure 7.11.


Figure 7.11: for Problem 7.6.19.

21. λ1,2 = 4 ± i, r = √17, θ = arctan(1/4) ≈ 0.245, so
λ1 = √17(cos(0.245) + i sin(0.245)), [~w ~v] = [0, 5; 1, 3], [a; b] = [1; 0],
and ~x(t) ≈ (√17)^t [5 sin(0.245t); cos(0.245t) + 3 sin(0.245t)].
The trajectory spirals outwards; see Figure 7.12.

Figure 7.12: for Problem 7.6.21.



23. λ1,2 = 0.4 ± 0.3i, r = 1/2, θ = arctan(0.3/0.4) ≈ 0.643,
[~w ~v] = [0, 5; 1, 3], [a; b] = [1; 0],
so ~x(t) = (1/2)^t [5 sin(θt); cos(θt) + 3 sin(θt)].
The trajectory spirals inwards as shown in Figure 7.13.
25. Not stable, since if λ is an eigenvalue of A, then 1/λ is an eigenvalue of A⁻¹, and
|1/λ| = 1/|λ| > 1.


Figure 7.13: for Problem 7.6.23.


27. Stable, since if λ is an eigenvalue of A, then −λ is an eigenvalue of −A, and |−λ| = |λ|.
29. Cannot tell; for example, if A = [1/2, 0; 0, 1/2], then A + I2 = [3/2, 0; 0, 3/2] and the zero state is
not stable, but if A = [−1/2, 0; 0, −1/2], then A + I2 = [1/2, 0; 0, 1/2] and the zero state is stable.
31. We need to determine for which values of det(A) and tr(A) the modulus of both eigenvalues is less than 1. We will first think about the borderline cases and examine when
one of the moduli is exactly 1: If one of the eigenvalues is 1 and the other is λ, then
tr(A) = λ + 1 and det(A) = λ, so that det(A) = tr(A) − 1. If one of the eigenvalues is −1
and the other is λ, then tr(A) = λ − 1 and det(A) = −λ, so that det(A) = −tr(A) − 1. If
the eigenvalues are complex conjugates with modulus 1, then det(A) = 1 and |tr(A)| < 2
(think about it!). It is convenient to represent these conditions in the tr-det plane, where
each 2 × 2 matrix A is represented by the point (trA, detA), as shown in Figure 7.14.
If tr(A) = det(A) = 0, then both eigenvalues of A are zero. We can conclude that
throughout the shaded triangle in Figure 7.14 the modulus of both eigenvalues will be
less than 1, since the modulus of the eigenvalues changes continuously with tr(A) and
det(A) (consider the quadratic formula!). Conversely, we can choose sample points to
show that in all the other four regions in Figure 7.14 the modulus of at least one of the
eigenvalues exceeds one; consider the matrices
[2, 0; 0, 0] in (I), [−2, 0; 0, 0] in (II), [−2, 0; 0, 2] in (III), and [0, −2; 2, 0] in (IV).

Figure 7.14: for Problem 7.6.31.


It follows that throughout these four regions, (I), (II), (III), and (IV), at least one of the
eigenvalues will have a modulus exceeding one.
The point (trA, detA) is in the shaded triangle if det(A) < 1, det(A) > tr(A) − 1, and
det(A) > −tr(A) − 1. This means that |trA| − 1 < det(A) < 1, as claimed.
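The criterion |trA| − 1 < det A < 1 can be tested against a direct eigenvalue computation on random 2 × 2 matrices, as a numerical spot check of the claim:

```python
import numpy as np

def stable_by_eigs(A):
    # Direct test: both eigenvalue moduli less than 1.
    return bool(np.all(np.abs(np.linalg.eigvals(A)) < 1))

def stable_by_trdet(A):
    # The trace-determinant criterion from this exercise.
    tr, det = np.trace(A), np.linalg.det(A)
    return abs(tr) - 1 < det < 1

rng = np.random.default_rng(1)
agree = all(stable_by_eigs(M) == stable_by_trdet(M)
            for M in (rng.uniform(-2, 2, (2, 2)) for _ in range(1000)))
print(agree)   # the two tests coincide on every sample
```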
33. Take conjugates of both sides of the equation ~x0 = c1(~v + i~w) + c2(~v − i~w):
~x0 = conj(~x0) = c̄1(~v − i~w) + c̄2(~v + i~w) = c̄2(~v + i~w) + c̄1(~v − i~w).
The claim that c2 = c̄1 now follows from the fact that the representation of ~x0 as a linear
combination of the linearly independent vectors ~v + i~w and ~v − i~w is unique.

35. a. Let ~v1, . . . , ~vn be an eigenbasis for A. Then ~x(t) = Σ_{i=1}^{n} ci λi^t ~vi and
||~x(t)|| = ||Σ_{i=1}^{n} ci λi^t ~vi|| ≤ Σ_{i=1}^{n} ||ci λi^t ~vi|| = Σ_{i=1}^{n} |λi|^t ||ci~vi|| ≤ Σ_{i=1}^{n} ||ci~vi||,
since |λi| ≤ 1. The last quantity, Σ_{i=1}^{n} ||ci~vi||, gives the desired bound M.

b. A = [1, 1; 0, 1] represents a shear parallel to the x-axis, with A [k; 1] = [k + 1; 1], so that
~x(t) = A^t [0; 1] = [t; 1] is not bounded. This does not contradict part a, since there is
no eigenbasis for A.
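Part b in a few lines of numerics, using the shear matrix from the solution:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # shear: sole eigenvalue 1, but no eigenbasis
x0 = np.array([0.0, 1.0])

t = 1000
xt = np.linalg.matrix_power(A, t) @ x0
print(xt)   # [1000.    1.] -- grows without bound although |lambda| = 1
```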

37. a. Write Y(t + 1) = Y(t) = Y, C(t + 1) = C(t) = C, I(t + 1) = I(t) = I. Then
Y = C + I + G0, C = γY, I = 0,
so that Y = γY + G0, hence Y(1 − γ) = G0 and
Y = G0/(1 − γ), C = γG0/(1 − γ), I = 0.
b. y(t) = Y(t) − G0/(1 − γ), c(t) = C(t) − γG0/(1 − γ), i(t) = I(t).
Substitute to verify the equations [c(t + 1); i(t + 1)] = A [c(t); i(t)].
c. A = [0.2, 0.2; −4, 1]; the eigenvalues are 0.6 ± 0.8i, of modulus 1, so the zero state is not stable.
d. A = [γ, γ; γ − 1, γ], trA = 2γ, detA = γ; stable (use Exercise 31).
e. A = [γ, γ; α(γ − 1), αγ], trA = γ(1 + α) > 0, detA = αγ.
Use Exercise 31: stable if det(A) = αγ < 1 and trA − 1 = γ + αγ − 1 < αγ. The second
condition is satisfied since γ < 1. Stable if αγ < 1.
The eigenvalues are real if γ ≥ 4α/(1 + α)².

39. Use Exercise 38: ~v = (I2 − A)⁻¹~b = [0.9, −0.2; −0.4, 0.7]⁻¹ [1; 2] = [2; 4].
[2; 4] is a stable equilibrium since the eigenvalues of A are 0.5 and −0.1.
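The equilibrium computation is easy to reproduce; the matrix A below is inferred from the I2 − A = [0.9, −0.2; −0.4, 0.7] used in the solution:

```python
import numpy as np

A = np.array([[0.1, 0.2],
              [0.4, 0.3]])     # inferred from I - A above
b = np.array([1.0, 2.0])

v = np.linalg.solve(np.eye(2) - A, b)   # equilibrium v = (I - A)^(-1) b
print(v)                                # [2. 4.]
print(sorted(np.linalg.eigvals(A)))     # about -0.1 and 0.5, both of modulus < 1
```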

41. Find the 2 × 2 matrix A that transforms [8; 6] into [−3; 4] and [−3; 4] into [−8; −6]:
A [8, −3; 6, 4] = [−3, −8; 4, −6] and A = [−3, −8; 4, −6] [8, −3; 6, 4]⁻¹ = (1/50) [36, −73; 52, −36].
There are many other correct answers.
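The answer is easy to verify numerically:

```python
import numpy as np

A = np.array([[36.0, -73.0],
              [52.0, -36.0]]) / 50

print(A @ [8, 6])     # [-3.  4.]
print(A @ [-3, 4])    # [-8. -6.]
print(np.trace(A), np.linalg.det(A))   # tr 0, det 1: eigenvalues ±i
```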

True or False
1. T, by Fact 7.2.2


3. F; If A = [1, 1; 0, 1], then eigenvalue 1 has geometric multiplicity 1 and algebraic multiplicity 2.
5. T; A = AIn = A[~e1 . . . ~en] = [λ1~e1 . . . λn~en] is diagonal.
7. T; Consider a diagonal 5 5 matrix with only two distinct diagonal entries.
9. T, by Summary 7.1.5


11. F; Consider A = [1, 1; 0, 1].
13. T; If A~v = 3~v, then A²~v = 9~v.
15. T, by Fact 7.5.5
17. T, by Example 6 of Section 7.5
19. T; If S⁻¹AS = D, then S^T A^T (S^T)⁻¹ = D.




21. F; Consider A = [0, 1; 0, 0], with A² = [0, 0; 0, 0].


23. F; Let A = [1, 0; 1, 1], for example.
25. T; If S⁻¹AS = D, then S⁻¹A⁻¹S = D⁻¹ is diagonal.
27. T; The sole eigenvalue, 7, must have geometric multiplicity 3.
29. F; Consider the zero matrix.
31. F; Consider the identity matrix.




 
33. F; Let A = [1, 1; 0, 1] and ~v = [1; 0], for example.
35. F; Let A = [2, 0; 0, 3], ~v = [1; 0], and ~w = [0; 1], for example.
37. T; The eigenvalues are 3 and 2.


39. T, by Fact 7.3.4
41. F; Consider a rotation through π/2.




43. F; Consider [1, 0; 0, 1] and [1, 1; 0, 1].
45. T; There is an eigenbasis ~v1, . . . , ~vn, and we can write ~v = c1~v1 + · · · + cn~vn. The vectors
ci~vi are either eigenvectors or zero.
47. T, by Fact 7.3.6a
49. T; Recall that the rank is the dimension of the image. If ~v is in the image of A, then A~v
is in the image of A as well, so that A~v is parallel to ~v .
51. T; If A~v = λ~v for a nonzero ~v, then A⁴~v = λ⁴~v = ~0, so that λ⁴ = 0 and λ = 0.
53. T; If the eigenvalue λ associated with ~v is 0, then A~v = ~0, so that ~v is in the kernel of
A; otherwise ~v = A(λ⁻¹~v), so that ~v is in the image of A.
55. T; Either A~u = 3~u or A~u = 4~u.

57. T; Suppose A~vi = αi~vi and B~vi = βi~vi, and let S = [~v1 . . . ~vn].
Then ABS = BAS = [α1β1~v1 . . . αnβn~vn], so that AB = BA.

