
Review for Final Exam

Summary of Material Covered in Course

This is only intended as an aid to studying. It may not be an exhaustive list of all we have covered. You are responsible for all definitions and statements of theorems. Many of the points below describe a routine type of problem that we have done in this course, with questions of that type listed below it. Question references are to the text, to the assignments, or to the practice midterm or practice final problems. E.g. A2:3 refers to Problem 3 on Assignment 2 and PM2:3 refers to Problem 3 on the practice problems for Midterm 2. Note: some questions are listed below multiple times in different categories. This was on purpose. I don't intend that you do all these problems, naturally. Go through the list and make sure you understand each point. If you do, go to the next point. If not, first go through the questions like it on the assignments, review problems or midterms, and then the text.

Chapter 0:

1. Know the properties of matrix addition and multiplication and be able to prove statements about the product or sum of matrices of specific types from first principles.

2. Understand summation notation and its properties and be able to prove identities with this notation.

3. Understand block multiplication of matrices and be able to work with it. Some problems of this type: A1:2,3,5; A2:5; PM1:6,8-12; 2.4:10; 2.3:14,15; 2.4:18-23

4. Know the basic properties of the transpose of a matrix as well as the definitions of symmetric and skew-symmetric matrices and be able to prove statements about them. A ∈ M_{n×n}(R) is symmetric if A^T = A; it is skew-symmetric if A^T = −A.

5. Know basic properties of the trace of a square matrix and be able to prove statements about them. Recall: tr(A) = Σ_{i=1}^n a_ii. Trace is a linear map, tr(A^T) = tr(A) and tr(AB) = tr(BA). Some problems of this type: A1:1,4; PM1:5; 1.3:3-7; 2.3:13
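The identities in items 4 and 5 are easy to sanity-check numerically before you try to prove them. Below is a minimal Python sketch (a study aid only, not course material) using NumPy; the matrix sizes and entries are arbitrary choices.

import numpy as np

# Sanity-check trace identities on random matrices (sizes chosen arbitrarily).
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(3, 3))
B = rng.integers(-5, 5, size=(3, 3))

print(np.trace(A.T) == np.trace(A))          # tr(A^T) = tr(A)
print(np.trace(A @ B) == np.trace(B @ A))    # tr(AB) = tr(BA)

# Any square matrix splits into a symmetric plus a skew-symmetric part.
S = (A + A.T) / 2      # S^T = S
K = (A - A.T) / 2      # K^T = -K
print(np.array_equal(S + K, A))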

6. Know the basic properties of complex arithmetic and complex conjugation for complex numbers and matrices and be able to prove statements about them. [See Appendix D: you should be able to add, subtract, multiply and divide complex numbers and take their complex conjugates and lengths. I assume you know how complex conjugation and absolute value interact with the arithmetic, as stated in D2 and D3 (a),(b) of Appendix D.] The complex conjugate of a matrix A ∈ M_{m×n}(C) is the matrix Ā ∈ M_{m×n}(C) whose (i,j) entry is the complex conjugate of a_ij. Some problems of this type: A8:3(a),(b); 2.1:38; 2.2:9.

Chapter 3:

7. Be able to row reduce a matrix to row echelon or reduced row echelon form. Be able to solve a system of linear equations Ax = b by row reducing the corresponding augmented matrix [A|b]. Some problems of this type: A2:1, PM1:1

8. Determine conditions on constants so that a system of linear equations has no solution, infinitely many solutions, or a unique solution. Some problems of this type: A2:4, MT1:2

9. Determine conditions on constants so that a matrix has a given rank, where the rank is the number of non-zero rows in the RREF or, equivalently, its number of pivot columns. Some problems of this type: A2:3

10. Know the 3 types of elementary row/column operations and their corresponding elementary matrices. Be able to express a series of row/column operations in terms of left and right multiplication by the corresponding elementary matrices. Some problems of this type: A3:1(b), MT1:3

11. For an m × n matrix A, know that the system Ax = 0 has a non-trivial solution iff rank(A) < n, and the system Ax = b has a solution iff rank(A) = rank[A|b]. The number of parameters in a consistent system Ax = b is n − rank(A). Note that rank(A) ≤ min{m, n}. Some problems of this type: PM1:3(b),(c).
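For items 7, 9 and 11, hand computations can be double-checked by machine. A minimal sketch using sympy; the matrix and right-hand side are arbitrary illustrations:

from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 0],
            [3, 6, 1]])
b = Matrix([1, 2, 3])

aug = A.row_join(b)              # the augmented matrix [A|b]
R, pivots = aug.rref()           # reduced row echelon form and its pivot columns
print(R, pivots)

print(A.rank(), aug.rank())      # Ax = b is consistent iff the two ranks agree
print(A.cols - A.rank())         # number of parameters in a consistent system: n - rank(A)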

12. Determine whether or not a square matrix is invertible and if so, find its inverse. Be able to express an invertible matrix and its inverse as a product of elementary matrices. Some problems of this type: PM1:3,9; 3.1:3; 3.2:7

13. Given a formula for a square matrix A and a formula for its proposed inverse B, verify that A is invertible and that its inverse is B. Some problems of this type: PM1:7, MT1:4

14. Know the properties of inverses with respect to matrix multiplication and transposition. Some problems of this type: PM1:7; 2.4:4,5,6,7,9,10

Chapter 1:

15. Know the vector space axioms and their consequences.

16. Know the 5 types of standard vector spaces, i.e. F^n, M_{m×n}(F), P_n(F), F(A, F), Seq(F), and the operations of addition and scalar multiplication in each.

17. Determine whether or not a given set with 2 operations is a vector space. If so, prove all axioms hold. If not, give an explicit numerical counterexample in which one of the axioms fails. Some problems of this type: A3:1,5; 1.2:11-21.

18. Determine whether a subset of a given vector space is a subspace. If so, prove that it is by checking the axioms. If not, provide an explicit numerical counterexample to one of the axioms. Some problems of this type: A3:2,3; PM1:4; 1.3:8,10-16,22; 2.2:15(a); MT1:1

19. Determine whether a set of vectors {v_1, . . . , v_k} in a vector space V is linearly independent or linearly dependent. If it is linearly dependent, be able to find a linear dependence relation for the set. Some problems of this type: A4:1(c); PM2:1(b),(c); 1.5:2,3,4,5,6,19,20; 2.2:13,14.
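A quick way to check the hand computations behind items 13 and 19, assuming sympy is available; the matrices and vectors below are only illustrations:

from sympy import Matrix, eye

# Item 13: verify a proposed inverse by checking AB = BA = I.
A = Matrix([[1, 1], [0, 1]])
B = Matrix([[1, -1], [0, 1]])
print(A * B == eye(2) and B * A == eye(2))

# Item 19: vectors are linearly independent iff the only solution of
# c1*v1 + ... + ck*vk = 0 is the trivial one, i.e. the matrix with the
# vectors as columns has nullspace {0}.
v1, v2, v3 = Matrix([1, 2, 3]), Matrix([1, 0, 1]), Matrix([2, 2, 4])
M = v1.row_join(v2).row_join(v3)
print(M.nullspace())   # a non-zero vector here gives a dependence relation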

20. Determine whether a vector v ∈ V is in Span{v_1, . . . , v_k}. If so, be able to find coefficients c_1, . . . , c_k such that v = c_1 v_1 + . . . + c_k v_k. Some problems of this type: 1.4:4,5,6,7,8,9,10.

21. Determine whether a set of vectors {v_1, . . . , v_k} in a vector space V spans V or not. If so, be able to prove this. If not, find a vector v ∈ V which is not in Span{v_1, . . . , v_k}. Some problems of this type: 1.4:3,4,5,6,7,8,9,10,13,14,15.

22. Be able to exhibit a basis of a subspace U of a standard vector space V and to calculate the dimension of U. Some problems of this type: A4:2-5; A5:5(c); PM2:2; MT2:1(a),4(a); 1.6:2,3,13,14,15,16,17,26,30.

23. Alternatively, given a subset β of a subspace U of V, be able to show that β is a linearly independent spanning set of U and hence a basis. If the dimension of U is already known to you and |β| = dim(U), you need only show that β is either linearly independent or spans U. If you don't already know dim(U), you must show both properties. Some problems of this type: A4:3-5; A5:5(c); A8:5(b).

24. Find bases and the dimensions of sums, intersections and quotients of subspaces. Some problems of this type: A4:4,5; 1.6:29,30,31,32,35.

25. Show that a subspace of a vector space is a direct sum of two subspaces. Some problems of this type: A2:3,4; A4:4(b); 1.3:24,25,26,27,28,29; 2.1:24-27,35.

26. Given a linearly independent set of vectors {v_1, . . . , v_k} in an n-dimensional vector space V, be able to extend this set (if necessary) to a basis for V. That is, if k < n, find vectors {v_{k+1}, . . . , v_n} such that {v_1, . . . , v_n} is a basis for V. Some problems of this type: A4:1(a); PM2:1(a); MT2:1(b); 3.4:10-13.
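Items 20 and 26 can be checked in F^n by solving linear systems, as in this minimal sympy sketch (vectors chosen arbitrarily):

from sympy import Matrix

# Item 20: v is in Span{v1, v2} iff rank[M | v] = rank(M), where M has the
# v_i as columns; the coefficients can be read off the RREF of [M | v].
v1, v2 = Matrix([1, 0, 1]), Matrix([0, 1, 1])
v = Matrix([2, 3, 5])
M = v1.row_join(v2)
print(M.rank() == M.row_join(v).rank())   # True: here v = 2*v1 + 3*v2
print(M.row_join(v).rref()[0])

# Item 26: extend {v1, v2} to a basis of R^3 by appending the standard basis
# vectors and keeping the pivot columns.
big = M.row_join(Matrix.eye(3))
pivots = big.rref()[1]
print([big.col(j) for j in pivots])       # a basis of R^3 containing v1, v2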

27. Given a spanning set of vectors {v_1, . . . , v_k} for a vector space V, be able to reduce this set (if necessary) to a basis for V. Some problems of this type: A4:1(b); 3.4:8,9; 1.6:7,8.

28. Given an m × n matrix A, determine a basis for its row space, its column space, and its null space respectively, and find the rank of the matrix. Given its RREF, be able to express columns of A as linear combinations of the basis of Col(A). Recall: To find a basis for Row(A) or Col(A), we first row reduce the matrix A to row-echelon form R. We had a theorem that said that the non-zero rows of R form a basis of Row(A) and that the columns a_{j_1}, . . . , a_{j_r} of A corresponding to the pivot columns C_{j_1}, . . . , C_{j_r} of R form a basis for Col(A). The rank of A is the common dimension of Row(A) and Col(A). It is then given by the number of non-zero rows of R or, equivalently, the number of pivots of R. To find a basis for Null(A), row reduce the matrix A to reduced row echelon form and then solve the homogeneous system. The dimension of Null(A) is the number of parameters in the homogeneous system, i.e. n − rank(A). If column k of the RREF B is Σ_{i=1}^r b_ik e_i and the pivot columns are j_1, . . . , j_r, then column k of A is Σ_{i=1}^r b_ik a_{j_i}. Some problems of this type: PM2:8; 3.4:5,6.

29. Understand the Lagrange interpolation formula and be able to apply it to find the polynomial of smallest degree through a given set of points with distinct first coordinates. [Note that for distinct scalars c_i ∈ F, i = 0, . . . , n, the polynomials f_i(x) = Π_{j≠i} (x − c_j)/(c_i − c_j), i = 0, . . . , n, satisfy f_i(c_j) = δ_ij, and so one can show that this set of polynomials is linearly independent in P_n(F) and hence a basis for this (n+1)-dimensional space, and for any g ∈ P_n(F), g = Σ_{i=0}^n g(c_i) f_i. In particular, for any scalars b_i ∈ F, i = 0, . . . , n, the unique polynomial in P_n(F) that satisfies g(c_i) = b_i is Σ_{i=0}^n b_i f_i. That is, the polynomial in P_n(F) that goes through the points (c_i, b_i), i = 0, . . . , n, with the c_i distinct, is Σ_{i=0}^n b_i f_i.] Some problems of this type: 1.6:10.
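The recipes in items 28 and 29 can be verified with sympy's built-ins, which here stand in for the hand method described above (the matrix and interpolation points are arbitrary illustrations):

from sympy import Matrix, interpolate, symbols

A = Matrix([[1, 2, 0, 1],
            [2, 4, 1, 1],
            [3, 6, 1, 2]])
R, pivots = A.rref()
print(pivots)                          # pivot columns of the RREF
print([A.col(j) for j in pivots])      # basis of Col(A): the pivot columns of A itself
print(R[:A.rank(), :])                 # non-zero rows of R: basis of Row(A)
print(A.nullspace())                   # basis of Null(A); its length is n - rank(A)

# Item 29: the Lagrange interpolating polynomial through (0,1), (1,3), (2,7).
x = symbols('x')
print(interpolate([(0, 1), (1, 3), (2, 7)], x))   # x**2 + x + 1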

Linear Transformations

30. Determine whether a given map between vector spaces is a linear transformation or not. If it is, prove this by checking the 2 axioms. If it isn't, provide an explicit numerical counterexample. Some problems of this type: MT2:4(b),(c); PM2:3; A5:1(a),(b); A5:4,5(a); 2.1:2-6,9,15.

31. Given a linear transformation T : V → W, and X a subspace of V, Y a subspace of W, find a basis for and the dimension of T(X) as a subspace of W and of T^{-1}(Y) as a subspace of V. Some problems of this type: PM2:6(b),(c).

32. Given a linear transformation T : V → W and its values on a spanning set {v_1, . . . , v_n} for V, find T(v) for a given v ∈ V. Some problems of this type: A5:1(c); 2.1:10-12.

33. Given a basis {v_1, . . . , v_n} for a vector space V, and given T(v_1) = w_1, . . . , T(v_n) = w_n (*), where w_1, . . . , w_n ∈ W, another vector space, find a linear transformation T : V → W that satisfies (*). Some problems of this type: A5:1(c); PM2:10(a); 2.1:10-12.

34. Show that two linear transformations T : V → W and S : V → W are equal by showing that they agree on a basis, i.e. by showing that T(v_i) = S(v_i) for all i = 1, . . . , n, where {v_1, . . . , v_n} is a basis for V. Some problems of this type: A5:2; A6:5(b).

35. Given a linear transformation T : V → W, find N(T), R(T), and a basis for each. Also find nullity(T) = dim(N(T)) and rank(T) = dim(R(T)). Some problems of this sort: A5:3,4,5(a),(b); MT2:2,4(b),(c); PM2:4; 2.1:2-6,13-16.

36. Given a linear transformation T : V → W, determine whether it is 1-1, onto, or an isomorphism. Use the dimension theorem: dim(V) = rank(T) + nullity(T). Some problems of this type: A5:4,5(a),(b); MT2:4(b),(c),(d); PM2:4,12.
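For a transformation given by a matrix, items 35 and 36 reduce to null space and column space computations. A minimal sketch for T = L_A (matrix chosen arbitrarily):

from sympy import Matrix

# T = L_A : R^4 -> R^3, T(x) = Ax.
A = Matrix([[1, 0, 1, 2],
            [0, 1, 1, 1],
            [1, 1, 2, 3]])
null_basis = A.nullspace()        # basis of N(T)
range_basis = A.columnspace()     # basis of R(T) = Col(A)
print(len(null_basis), len(range_basis))
print(A.cols == len(range_basis) + len(null_basis))   # dimension theorem
# T is 1-1 iff nullity(T) = 0; onto iff rank(T) equals dim(W) = number of rows.
print(len(null_basis) == 0, len(range_basis) == A.rows)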

37. Given linear transformations T : V → W and S : W → U, compute the composite S ∘ T : V → U. Some problems of this type: A6:2; 2.3:3,4.

38. Given a linear transformation T : V → W, determine whether it is an isomorphism and if so, find its inverse T^{-1} : W → V.

39. Given a linear transformation T : V → V, verify that it satisfies some polynomial equation. Use this information to show that T is invertible and find T^{-1} : V → V.

40. Given a basis β of a vector space V, find the coordinate vector [v]_β for v ∈ V. [Notation: if β = {v_1, . . . , v_n} and v = Σ_{i=1}^n c_i v_i, then [v]_β = [c_1, . . . , c_n]^T.] Some problems of this type: A6:1; 2.3:2,3.

41. Given a basis β of an n-dimensional vector space V and the coordinate vector [v]_β, find v ∈ V.

42. Find the matrix of a linear transformation T : V → W with respect to ordered bases β and γ of V and W respectively. That is, find [T]_β^γ. [Notation: [T]_β^γ = [[T(v_1)]_γ, . . . , [T(v_n)]_γ] where β = {v_1, . . . , v_n}. Recall: [T(v)]_γ = [T]_β^γ [v]_β for all v ∈ V. We write [T]_β = [T]_β^β for short if T : V → V.] Some problems of this type: A6:2-5; 2.2:2-5, 9-12, 16.

43. Given the matrix [T]_β^γ of a linear transformation T : V → W with respect to ordered bases β and γ of V and W respectively, find the linear transformation T.

44. Find the matrix of a composite of linear transformations T : V → W and S : W → U with respect to bases α, β, γ of V, W, U respectively using the formula [S∘T]_α^γ = [S]_β^γ [T]_α^β. Some problems of this type: A6:2; PM2:6

45. Find the matrix of the inverse T^{-1} : W → V of an isomorphism T : V → W with respect to bases β of V and γ of W by using the formula [T^{-1}]_γ^β = ([T]_β^γ)^{-1}. Some problems of this type: A6:3; MT2:3.
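A sketch of item 42 for one concrete example: the differentiation map T : P_2(R) → P_1(R) with the monomial bases β = {1, x, x^2} and γ = {1, x}. The helper code below is an illustration, not a required method.

from sympy import Matrix, Integer, Poly, diff, symbols

x = symbols('x')
beta = [Integer(1), x, x**2]      # ordered basis of P_2(R)
dim_P1 = 2                        # gamma = {1, x} is the chosen basis of P_1(R)

# The j-th column of [T]_beta^gamma is [T(v_j)]_gamma; here T is differentiation.
cols = []
for v in beta:
    c = Poly(diff(v, x), x).all_coeffs()[::-1]    # coefficients with respect to 1, x, ...
    c = c + [Integer(0)] * (dim_P1 - len(c))      # pad with zeros up to dim(P_1)
    cols.append(c[:dim_P1])
T_mat = Matrix(cols).T
print(T_mat)     # Matrix([[0, 1, 0], [0, 0, 2]])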

46. Show that a linear transformation T : V → W is invertible and find its inverse T^{-1} : W → V by finding the matrix [T]_β^γ of T with respect to bases β of V and γ of W, inverting it, and using the formula [T^{-1}]_γ^β = ([T]_β^γ)^{-1}. Some problems of this type: A6:3; MT2:3.

47. Given a vector space V and two bases β and β′, find the change of basis matrix [I_V]_β^{β′} and use the formula [v]_{β′} = [I_V]_β^{β′} [v]_β to find the coordinates of v ∈ V with respect to the new basis β′. Some problems of this type: PM2:5.

48. Find the change of basis matrix [I_V]_α^γ by using the formula [I_V]_α^γ = [I_V]_β^γ [I_V]_α^β, given bases α, β, γ of V and the 2 matrices [I_V]_α^β and [I_V]_β^γ. Find the change of basis matrix [I_V]_β^α given [I_V]_α^β by using the formula [I_V]_β^α = ([I_V]_α^β)^{-1}. Some problems of this type: PM2:6

49. Given two bases β_0, β of a vector space V and two bases γ_0, γ of a vector space W, find the matrix of a linear transformation T : V → W with respect to β and γ, [T]_β^γ, by first finding [T]_{β_0}^{γ_0} and then using the formula [T]_β^γ = [I_W]_{γ_0}^γ [T]_{β_0}^{γ_0} [I_V]_β^{β_0}. Some problems of this type: PM2:6.

50. Given A ∈ M_{n×n}(F) with P^{-1}AP = D, find a basis β of F^n such that [L_A]_β = D. In fact, take β to be the set of columns of P. Then P = [I_{F^n}]_β^{β_0}, where β_0 is the standard basis of F^n. Some problems of this type: PM2:7.

51. Given a linear transformation T : V → W, find a basis β of V and a basis γ of W such that [T]_β^γ has a nice form, e.g. diagonal or block. Some problems of this type: 2.2:11,12,16
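Item 47 in R^2, where the change of basis matrix from a basis β to the standard basis is just the matrix Q whose columns are the vectors of β (basis vectors chosen arbitrarily):

from sympy import Matrix

b1, b2 = Matrix([1, 1]), Matrix([1, -1])
Q = b1.row_join(b2)            # columns are the new basis vectors
v = Matrix([3, 1])
v_beta = Q.inv() * v           # coordinates of v with respect to beta are Q^{-1} v
print(v_beta)                  # Matrix([[2], [1]]): indeed v = 2*b1 + 1*b2
print(Q * v_beta == v)         # changing back recovers v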

Determinants

52. Compute the determinant of a square matrix using a cofactor expansion along a row or down a column.

Cofactor expansion along the ith row:

det(A) = Σ_{j=1}^n (−1)^{i+j} a_ij det(A_ij)

Cofactor expansion down the jth column:

det(A) = Σ_{i=1}^n (−1)^{i+j} a_ij det(A_ij)

where A ∈ M_{n×n}(F) and A_ij ∈ M_{(n−1)×(n−1)}(F) is the matrix obtained by omitting row i and column j from A. Some problems of this type: A7:1(a),2(c),5(a).

53. Compute the determinant of a square matrix by row/column reduction, keeping track of the effects of such operations on the determinant, and then computing the determinant of a simple matrix (e.g. triangular). Note:

A → B by R_i ↔ R_j implies det(B) = −det(A);
A → B by R_i → aR_i implies det(B) = a·det(A);
A → B by R_i → R_i + cR_j implies det(B) = det(A);

and the same rules hold for the corresponding column operations: C_i ↔ C_j gives det(B) = −det(A), C_i → aC_i gives det(B) = a·det(A), and C_i → C_i + cC_j gives det(B) = det(A).

det(U) = Π_{i=1}^n u_ii if U is an upper triangular matrix with diagonal entries u_11, . . . , u_nn. Combine cofactor expansions with this method. Some problems of this type: A7:1(b); 4.2:13-22, 25-30

54. Use properties of determinants of products, inverses, scalar multiples and transposes to compute determinants: for A, B n × n, det(AB) = det(A) det(B), det(A^{-1}) = det(A)^{-1} for A invertible, det(kA) = k^n det(A), and det(A^T) = det(A). Some problems of this type: A7:2,3(b); 4.3:9-13

55. Find the adjoint of an n × n matrix A via Adj(A)_ij = (−1)^{i+j} det(A_ji). Use this formula to find the inverse of an invertible matrix A, i.e. A^{-1} = Adj(A)/det(A). Some problems of this type: A7:1(c),2(c),3(c); PF 4; 4.3:26
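The determinant facts in items 52-55 can be sanity-checked with sympy (matrices chosen arbitrarily; compare A.det() with your own cofactor expansion or row reduction):

from sympy import Matrix

A = Matrix([[2, 1, 3],
            [0, -1, 4],
            [1, 2, 1]])
B = Matrix([[1, 0, 2],
            [3, 1, 0],
            [0, 1, 1]])
print(A.det())                                   # compare with a hand computation
print((A * B).det() == A.det() * B.det())        # det(AB) = det(A)det(B)
print(A.T.det() == A.det())                      # det(A^T) = det(A)
print((3 * A).det() == 3**3 * A.det())           # det(kA) = k^n det(A), here n = 3
print(A.adjugate() == A.det() * A.inv())         # Adj(A) = det(A) A^{-1} (item 55)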

56. Use the formula Adj(A)A = det(A)I_n = A Adj(A) to prove things about the adjoint. Some problems of this type: A7:2(b),3(c); PF 9; 4.3:25,27

57. Cramer's Rule: For an invertible matrix A, solve Ax = b by x = [x_1, . . . , x_n]^T, where

x_i = det[a_1, . . . , a_{i−1}, b, a_{i+1}, . . . , a_n] / det(A)

for all i = 1, . . . , n (here a_j denotes the jth column of A). Some problems of this type: A7:1(c); PF 3; 4.3:2-7

Eigenvalues and Eigenvectors:

58. Given an n × n matrix A, find its characteristic polynomial and its eigenvalues with their corresponding algebraic and geometric multiplicities. For each eigenvalue, find the corresponding eigenspace. Be able to determine whether the matrix is diagonalizable from the multiplicities of the eigenvalues and the dimensions of the corresponding eigenspaces. If the matrix is diagonalizable, find an invertible matrix P and a diagonal matrix D such that P^{-1}AP = D. Some problems of this type: A8:1; 5.1:3; 5.2:2

59. Given T ∈ L(V, V), find its characteristic polynomial and its eigenvalues with their corresponding algebraic and geometric multiplicities. For each eigenvalue, find the corresponding eigenspace. Be able to determine whether the linear transformation is diagonalisable from the multiplicities of the eigenvalues and the dimensions of the corresponding eigenspaces. If the linear transformation is diagonalisable, find a basis β such that [T]_β is diagonal. Note that most of this can be done by passing to a matrix with respect to a (standard) basis β_0: c_T(x) = c_{[T]_{β_0}}(x), so the eigenvalues of T and [T]_{β_0} are the same. Also, the restriction of the coordinate map φ_{β_0} to E_λ(T) gives an isomorphism E_λ(T) → E_λ([T]_{β_0}), as proved in class, so we can find a basis for each E_λ([T]_{β_0}) and use the isomorphism to find a basis for E_λ(T). Some problems of this type: A8:2; PF:1; 5.1:4; 5.2:3
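A sketch of item 58 for one small matrix (chosen arbitrarily), using sympy to read off algebraic and geometric multiplicities and to diagonalize:

from sympy import Matrix

A = Matrix([[2, 1],
            [1, 2]])
print(A.charpoly().as_expr())     # characteristic polynomial
print(A.eigenvals())              # {eigenvalue: algebraic multiplicity}
for lam, mult, basis in A.eigenvects():
    # geometric multiplicity = number of basis vectors of the eigenspace
    print(lam, mult, len(basis))
P, D = A.diagonalize()            # raises an error if A is not diagonalizable
print(P.inv() * A * P == D)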


60. Be able to show that 2 matrices are not similar by showing that one of the following properties fails to be equal for the 2 matrices: trace, rank, determinant, characteristic polynomial, or eigenvalues. Some problems of this type: PF:2

61. Given the eigenvalues of a square matrix A ∈ M_{n×n}(F) or of T : V → V, find those of a polynomial in A or T. Here, by a polynomial in A or T, I mean: if p(x) = Σ_{i=0}^n c_i x^i, then p(A) = Σ_{i=0}^n c_i A^i ∈ M_{n×n}(F) and p(T) = Σ_{i=0}^n c_i T^i ∈ L(V, V), where A^0 = I_n and T^0 = I_V. Show what happens to a polynomial in A under similarity. Show what happens to a polynomial in T under the map T ↦ [T]_β from L(V) to M_{n×n}(F) for a basis β of V. For example, if p(x) = x^2 − 2x + 1 then p(A) = A^2 − 2A + I_n and p(T) = T^2 − 2T + I_V. Some problems of this type: A8:4(b); 5.1:15,22; 5.2:12,13.

62. Find the characteristic polynomial c_T(x) of a linear transformation T : V → V. Find its determinant, rank, and eigenvalues. Do this by finding the corresponding property for [T]_β where β is any basis of V. (This works since these are all similarity invariants.) Some problems of this type: 5.1:7,12,16,20,21

63. If a matrix or linear transformation satisfies a polynomial equation, use this to find the eigenvalues. For a matrix similar to an upper triangular matrix (or, better yet, diagonalisable), use information on the determinant and trace to find the algebraic multiplicities of the eigenvalues. Some problems of this type: A8:4; 5.1:17

64. Diagonalize a matrix A as P^{-1}AP = D and find the nth power of A (A^n = P D^n P^{-1}). Some problems of this type: 5.2:7.

65. Diagonalize a matrix A as P^{-1}AP = D and use this to solve a differential equation x′(t) = Ax(t). Some problems of this type: A8:3; 5.2:14,15,16.
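A sketch of item 64: once A is diagonalized, A^n = P D^n P^{-1} gives a closed form for the nth power. The matrix is an arbitrary illustration; the final line checks the formula against direct multiplication.

from sympy import Matrix, symbols

A = Matrix([[1, 2],
            [2, 1]])
P, D = A.diagonalize()            # columns of P are eigenvectors, D is diagonal
n = symbols('n', positive=True, integer=True)
Dn = Matrix([[D[0, 0]**n, 0],
             [0, D[1, 1]**n]])    # D^n just raises the diagonal entries to the n
An = P * Dn * P.inv()
print(An.applyfunc(lambda e: e.simplify()))   # closed form for A^n
print(An.subs(n, 3) == A**3)                  # sanity check for n = 3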


Theoretical Problems

Not all of the problems on the exam will fit into such neat categories. There will be some theoretical problems as well. These types of problems on the exam will be similar to assignment problems of that sort. In other words, you will have to prove some things directly from definitions or from a direct application of the theorems covered in this class. These types of problems are best studied for by learning the statements of definitions and theorems and looking back to questions from assignments or midterms.

