
Algebra of linear transformations and matrices
Math 130 Linear Algebra
D Joyce, Fall 2012

We've looked at the operations of addition and scalar multiplication on linear transformations and used them to define addition and scalar multiplication on matrices. For a given basis on V and another basis on W, we have an isomorphism Hom(V, W) ≅ M_mn of vector spaces which assigns to a linear transformation T : V → W its standard matrix [T].

We also have matrix multiplication, which corresponds to composition of linear transformations. If A is the standard matrix for a transformation S, and B is the standard matrix for a transformation T, then we defined multiplication of matrices so that the product AB is the standard matrix for S ∘ T.

There are a few more things we should look at for matrix multiplication. It's not commutative. It is associative. It distributes over matrix addition. There are identity matrices I for multiplication. Cancellation doesn't work. You can compute powers of square matrices. And there are scalar matrices.

Matrix multiplication is not commutative. It shouldn't be. It corresponds to composition of linear transformations, and composition of functions is not commutative.

Example 1. Let's take a 2-dimensional geometric example. Let T be rotation 90° clockwise, and S be reflection across the x-axis. We've looked at those before. The standard matrices A for S and B for T are

    A = [ 1  0 ]        B = [  0  1 ]
        [ 0 -1 ],           [ -1  0 ].

Then the two compositions are

    AB = [ 1  0 ] [  0  1 ] = [ 0  1 ]
         [ 0 -1 ] [ -1  0 ]   [ 1  0 ]

and

    BA = [  0  1 ] [ 1  0 ] = [  0 -1 ]
         [ -1  0 ] [ 0 -1 ]   [ -1  0 ].

The products aren't the same. You can perform these on physical objects. Take a book. First rotate it 90°, then flip it over. Start again, but flip first, then rotate 90°. The book ends up in different orientations.

Matrix multiplication is associative. Although it's not commutative, it is associative. That's because it corresponds to composition of functions, and that's associative. Given any three functions f, g, and h, we'll show (f ∘ g) ∘ h = f ∘ (g ∘ h) by showing the two sides have the same values for all x:

    ((f ∘ g) ∘ h)(x) = (f ∘ g)(h(x)) = f(g(h(x)))

while

    (f ∘ (g ∘ h))(x) = f((g ∘ h)(x)) = f(g(h(x))).

They're the same. Since composition of functions is associative, and linear transformations are special kinds of functions, composition of linear transformations is associative. Since matrix multiplication corresponds to composition of linear transformations, matrix multiplication is associative.

An alternative proof would actually involve computations, probably with summation notation, something like

    (A(BC))_il = Σ_j a_ij ( Σ_k b_jk c_kl )
               = Σ_{j,k} a_ij b_jk c_kl
               = Σ_k ( Σ_j a_ij b_jk ) c_kl = ((AB)C)_il.

Matrix multiplication distributes over matrix addition. When A, B, and C are matrices of the right shapes so that the operations can be performed, the following are always identities:

    A(B + C) = AB + AC
    (A + B)C = AC + BC

Why does it work? It suffices to show that it works for linear transformations. Suppose that R, S, and T are the corresponding linear transformations. The corresponding identities are

    R ∘ (S + T) = (R ∘ S) + (R ∘ T)
    (R + S) ∘ T = (R ∘ T) + (S ∘ T).

Simply evaluate them at a vector v and see that you get the same thing. Here's the first identity; you'll need to use linearity of R at one point:

    (R ∘ (S + T))(v) = R((S + T)(v)) = R(S(v) + T(v)) = R(S(v)) + R(T(v))

while

    ((R ∘ S) + (R ∘ T))(v) = (R ∘ S)(v) + (R ∘ T)(v) = R(S(v)) + R(T(v)).

Cancellation doesn't work for matrix multiplication! Not only is matrix multiplication noncommutative, but the cancellation law doesn't hold for it either. You're familiar with cancellation for numbers: if xy = xz and x ≠ 0, then y = z. But we can come up with matrices so that AB = AC and A ≠ 0, but B ≠ C. For example,

    A = [ 1 0 ]      B = [ 1 0 ]      C = [ 1 0 ]
        [ 0 0 ],         [ 0 3 ],         [ 0 4 ].

Powers of matrices. Frequently we'll multiply square matrices by themselves (you can only multiply square matrices by themselves), and we'll use the standard notation for powers. The expression A^p stands for the product of p copies of A. Since matrix multiplication is associative, this definition works so long as p is a positive integer. But we can extend the definition to p = 0 by making A^0 = I, and the usual properties will still hold. That is, A^p A^q = A^(p+q) and (A^p)^q = A^(pq). Later, we'll extend powers to the case when A is an invertible matrix and the power p is a negative integer. Warning: because matrix multiplication is not commutative in general, it is usually the case that (AB)^p ≠ A^p B^p.
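Distributivity, the cancellation counterexample, and the warning about powers can all be checked with a few lines of plain Python. This is only an illustrative sketch; the helper names matmul and matadd are my own, and P, Q reuse the matrices of Example 1:

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][j] * Y[j][k] for j in range(len(Y)))
             for k in range(len(Y[0]))]
            for i in range(len(X))]

def matadd(X, Y):
    """Entrywise sum of two same-shape matrices."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

# The cancellation counterexample from the text.
A = [[1, 0], [0, 0]]
B = [[1, 0], [0, 3]]
C = [[1, 0], [0, 4]]

# Distributivity: A(B + C) = AB + AC.
print(matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C)))  # True

# Cancellation fails: AB = AC even though A != 0 and B != C.
print(matmul(A, B) == matmul(A, C))  # True
print(B == C)                        # False

# Powers warning: (PQ)^2 != P^2 Q^2 for the non-commuting pair of Example 1.
P = [[1, 0], [0, -1]]
Q = [[0, 1], [-1, 0]]
PQ = matmul(P, Q)
print(matmul(PQ, PQ) == matmul(matmul(P, P), matmul(Q, Q)))  # False
```

Here (PQ)^2 is the identity matrix while P^2 Q^2 is its negative, so the two power expressions differ even for this small example.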

The identity matrices. Just as there are matrices that work as additive identities (we denoted them all 0, as described above), there are matrices that work as multiplicative identities; we'll denote them all I and call them identity matrices. An identity matrix is a square n by n matrix with 1's down the diagonal and 0's elsewhere. You could denote them I_n to emphasize their sizes, but you can always tell the size from the context, so we'll leave out the index n. By the way, whenever you've got a square n by n matrix, you can say the order of the matrix is n. Anyway, I acts like an identity for matrix multiplication: AI = A = IA.

Note that if A is not a square matrix, then the orders of the two identity matrices I in the identity AI = A = IA are different. For example,

    [ 4 5 6 ] [ 1 0 0 ]
    [ 3 1 0 ] [ 0 1 0 ]  =  [ 4 5 6 ]  =  [ 1 0 ] [ 4 5 6 ]
              [ 0 0 1 ]     [ 3 1 0 ]     [ 0 1 ] [ 3 1 0 ].
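A quick numerical check of AI = A = IA for a non-square A, using the 2×3 matrix from the example. The helpers matmul and identity are illustrative names of my own, not notation from the notes:

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][j] * Y[j][k] for j in range(len(Y)))
             for k in range(len(Y[0]))]
            for i in range(len(X))]

def identity(n):
    """The n-by-n identity matrix: 1's down the diagonal, 0's elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[4, 5, 6], [3, 1, 0]]          # the 2x3 matrix from the example
print(matmul(A, identity(3)) == A)  # True: on the right, I must be 3x3
print(matmul(identity(2), A) == A)  # True: on the left, I must be 2x2
```

Swapping the two identity matrices would not even produce well-defined products, which is exactly the point about their orders being different.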

Scalar matrices. A scalar matrix is a square matrix with a scalar r down the diagonal and 0's elsewhere. That's the same thing as the scalar r times the identity matrix. For instance,

    [ 4 0 0 ]       [ 1 0 0 ]
    [ 0 4 0 ]  =  4 [ 0 1 0 ]  =  4I.
    [ 0 0 4 ]       [ 0 0 1 ]
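The point that a scalar matrix acts exactly like its scalar can be checked directly. A sketch with illustrative helper names (matmul, scalar_matrix) and an arbitrary sample matrix of my own choosing:

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][j] * Y[j][k] for j in range(len(Y)))
             for k in range(len(Y[0]))]
            for i in range(len(X))]

def scalar_matrix(r, n):
    """The n-by-n matrix with r down the diagonal, i.e. r times I_n."""
    return [[r if i == j else 0 for j in range(n)] for i in range(n)]

A = [[4, 5, 6], [3, 1, 0]]                 # an arbitrary 2x3 sample matrix
lhs = matmul(scalar_matrix(4, 2), A)       # multiply by 4I on the left
rhs = [[4 * x for x in row] for row in A]  # scalar-multiply every entry by 4
print(lhs == rhs)  # True: 4I acts on A exactly like the scalar 4
```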

Among other things, that means that we can identify a scalar matrix with the scalar.

Math 130 Home Page: http://math.clarku.edu/~djoyce/ma130/
