
Matrix Multiplication

Hyun Lee, Eun Kim, Jedd Hakimi

2.1 Matrix Operations


Key Idea
Matrix multiplication corresponds to composition of linear transformations. The definition of AB is critical for the development of both the theory and the applications. So what is the definition of AB?

What is AB?
The subscripts tell the location of an entry: if A is an m x n matrix, then m is the number of rows and n is the number of columns. In the product AB, left-multiplication by A acts on the columns of B, while right-multiplication by B acts on the rows of A. In other words...

Definition of AB, continued...

(column j of AB) = A x (column j of B)

Also, the following is true:

(row i of AB) = (row i of A) x B
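If you want to see this definition in action, here is a quick numerical check. (It uses Python with NumPy, which is not part of these slides; it is just a convenient way to experiment.)

import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 5, size=(3, 2))   # a 3 x 2 matrix
B = rng.integers(0, 5, size=(2, 4))   # a 2 x 4 matrix
AB = A @ B

# (column j of AB) = A x (column j of B)
print(np.array_equal(AB[:, 1], A @ B[:, 1]))   # True
# (row i of AB) = (row i of A) x B
print(np.array_equal(AB[0, :], A[0, :] @ B))   # True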

How do we add matrices?


Let me show you an example. Suppose

A = [2 3]      B = [4 7]
    [7 8]          [6 5]

Then what is A + B? Add the corresponding entries:

A + B = [2+4  3+7]   =   [ 6 10]
        [7+6  8+5]       [13 13]

Example of 4x

Suppose

x = [1 1 1]
    [3 2 5]

Then what is 4x? Multiply every entry of x by 4:

4x = [ 4  4  4]
     [12  8 20]
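A quick way to double-check both of these small examples (again using Python with NumPy as an illustration, not as part of the slides):

import numpy as np

A = np.array([[2, 3], [7, 8]])
B = np.array([[4, 7], [6, 5]])
print(A + B)        # [[ 6 10]
                    #  [13 13]]

x = np.array([[1, 1, 1], [3, 2, 5]])
print(4 * x)        # [[ 4  4  4]
                    #  [12  8 20]]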

Matrix Multiplication

How do we multiply matrices? When a vector x is multiplied by B, we transform x into the vector Bx. If we then multiply Bx by A, we create A(Bx). This is the key concept for the next steps!

We define the product AB so that this composite mapping is carried out by a single matrix: (AB)x = A(Bx) for every vector x.
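To see that multiplying by the single matrix AB really is the same as applying B and then A, here is a small NumPy check (NumPy and these particular numbers are assumptions made only for illustration):

import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [5, 6]])
x = np.array([7, 8])

print(A @ (B @ x))        # apply B first, then A
print((A @ B) @ x)        # multiply by the single matrix AB
print(np.array_equal(A @ (B @ x), (A @ B) @ x))   # True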

Example

If A is a 4 x 5 matrix and B is a 5 x 3 matrix, what are the sizes of AB and BA, if they are defined? Since A has 5 columns and B has 5 rows, AB is defined, and it is a 4 x 3 matrix. BA is not defined, because B has 3 columns but A has 4 rows.

Remember! You can't multiply just any two matrices. There is a condition:

The number of columns of the first matrix has to equal the number of rows of the second matrix.
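The size question above is also easy to check by computer. A small NumPy sketch (the zero matrices are just placeholders of the right sizes):

import numpy as np

A = np.zeros((4, 5))   # 4 x 5
B = np.zeros((5, 3))   # 5 x 3

print((A @ B).shape)   # (4, 3): AB is defined, since columns of A match rows of B
try:
    B @ A              # 5 x 3 times 4 x 5: 3 columns vs 4 rows, not allowed
except ValueError:
    print("BA is not defined")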

Theorem 1
Let A, B, and C be matrices of the same size, and let r and s be scalars.

(1.1) A + B = B + A
(1.2) (A + B) + C = A + (B + C)
(1.3) A + 0 = A
(1.4) r(A + B) = rA + rB
(1.5) (r + s)A = rA + sA
(1.6) r(sA) = (rs)A

Let's prove it!


Note that the columns of A, B, and C all have the same size. Let the second columns of A and B be A2 and B2, respectively. Then, taking r = 3 for example,

3(A2 + B2) = 3A2 + 3B2

because scalar multiplication distributes over vector addition. The matrix on the left has the same size as the matrix on the right, and their corresponding columns are equal, which proves 1.4. Each of 1.1-1.6 can be proved column by column using the same logic.

A x B

A = [3 0]      B = [4 7]
    [1 1]          [6 8]
    [5 2]

A x B = [3(4)+0(6)  3(7)+0(8)]       [12 21]
        [1(4)+1(6)  1(7)+1(8)]   =   [10 15]
        [5(4)+2(6)  5(7)+2(8)]       [32 51]
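Here is the same product reproduced with NumPy (not part of the slides, just a check of the arithmetic):

import numpy as np

A = np.array([[3, 0], [1, 1], [5, 2]])   # 3 x 2
B = np.array([[4, 7], [6, 8]])            # 2 x 2
print(A @ B)
# [[12 21]
#  [10 15]
#  [32 51]]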

Row-Column Rule
If we think of the ith row of A and the jth column of B as vectors, then the entry in the ith row and jth column of AB is the scalar (dot) product of the ith row of A with the jth column of B:

(AB)ij = Ai1B1j + Ai2B2j + ... + AinBnj

and, row by row,

row i of (AB) = (row i of A) x B
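The Row-Column Rule is exactly what a naive matrix-multiplication routine computes. A minimal sketch in Python/NumPy (assumed here purely for illustration; any language would do):

import numpy as np

def matmul_row_column(A, B):
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "columns of A must equal rows of B"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            # (AB)_ij = A_i1 B_1j + A_i2 B_2j + ... + A_in B_nj
            C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))
    return C

A = np.array([[3, 0], [1, 1], [5, 2]])
B = np.array([[4, 7], [6, 8]])
print(np.array_equal(matmul_row_column(A, B), A @ B))   # True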

Theorem 2
Let A, B, and C be matrices with sizes for which the indicated sums and products are defined, and let r be a scalar.

(2.1) A(BC) = (AB)C
(2.2) A(B + C) = AB + AC
(2.3) (B + C)A = BA + CA
(2.4) r(AB) = (rA)B = A(rB)
(2.5) ImA = A = AIn   (where Im and In are the m x m and n x n identity matrices)

Let's prove it
A(BC) = (AB)C ------ Associative law
First observe that both A(BC) and (AB)C are defined and have the same size. Let Cj denote column j of C. Then column j of BC is BCj, so column j of A(BC) is A(BCj). On the other hand, column j of (AB)C is (AB)Cj, and (AB)Cj = A(BCj) by the definition of AB. It follows that the corresponding columns of A(BC) and (AB)C are equal, so A(BC) = (AB)C.

Example of Associative Law


A = (1, 2)      B = [3 4]      C = [3 0 2]
                    [2 1]          [5 1 0]

AB = (1, 2) [3 4]  =  (7, 6)
            [2 1]

(AB)C = (7, 6) [3 0 2]  =  (51, 6, 14)
               [5 1 0]

A(BC) = (1, 2) [29 4 6]  =  (51, 6, 14)
               [11 1 4]
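And the same example verified numerically (NumPy is assumed; the row A is written as a 1 x 2 array):

import numpy as np

A = np.array([[1, 2]])                    # 1 x 2
B = np.array([[3, 4], [2, 1]])            # 2 x 2
C = np.array([[3, 0, 2], [5, 1, 0]])      # 2 x 3

print(A @ B)             # [[7 6]]
print((A @ B) @ C)       # [[51  6 14]]
print(A @ (B @ C))       # [[51  6 14]]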

Proof of Distributive Law


Both (A + B)C and AC + BC are m x n matrices, so we compare the corresponding columns of each matrix. For any j, the jth column of (A + B)C is (A + B)Cj = ACj + BCj, because matrix-vector multiplication distributes over the sum A + B. But ACj and BCj are the jth columns of AC and BC, respectively, so ACj + BCj is also the jth column of AC + BC. The corresponding columns are equal, so (A + B)C = AC + BC. The same column-by-column argument proves the other distributive law, A(B + C) = AB + AC.

Example of Distributive Law


A = (1, 2)      B = [3 4]      C = [4 2]
                    [0 5]          [1 7]

B + C = [7  6]
        [1 12]

A(B + C) = (9, 30)

AB = (3, 14)    AC = (6, 16)    AB + AC = (9, 30)
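The same example, checked with NumPy (illustration only):

import numpy as np

A = np.array([[1, 2]])
B = np.array([[3, 4], [0, 5]])
C = np.array([[4, 2], [1, 7]])

print(A @ (B + C))          # [[ 9 30]]
print(A @ B + A @ C)        # [[ 9 30]]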

AB is not equal to BA
A = [1]       B = (3, 4, 1, 5)
    [2]
    [0]
    [1]

BA = (3, 4, 1, 5) [1]   =  (3 + 8 + 0 + 5)  =  (16)
                  [2]
                  [0]
                  [1]

AB = [1] (3, 4, 1, 5)  =  [3  4  1  5]
     [2]                  [6  8  2 10]
     [0]                  [0  0  0  0]
     [1]                  [3  4  1  5]

So BA is a 1 x 1 matrix while AB is 4 x 4: AB and BA are not even the same size, let alone equal.
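A quick NumPy illustration of how different AB and BA are here (A is a 4 x 1 column, B a 1 x 4 row):

import numpy as np

A = np.array([[1], [2], [0], [1]])    # 4 x 1
B = np.array([[3, 4, 1, 5]])          # 1 x 4

print(B @ A)    # [[16]]   a 1 x 1 matrix
print(A @ B)    # a 4 x 4 matrix:
# [[ 3  4  1  5]
#  [ 6  8  2 10]
#  [ 0  0  0  0]
#  [ 3  4  1  5]]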

The Transpose of a Matrix


Sometimes it is of interest to interchange the rows and columns of a matrix. The transpose of a matrix A = [Aij] is the matrix formed from A by interchanging rows and columns, so that row i of A becomes column i of the transposed matrix. The transpose is denoted by At, and (At)ij = Aji when A = [Aij].

Example of the Transpose


A = [1 3]      At = [1 2]
    [2 5]           [3 5]

A = [1 3 4]    At = [1 0]
    [0 1 0]         [3 1]
                    [4 0]

It will be observed that if A is m x n, then At is n x m.
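In NumPy (assumed here just for illustration) the transpose is written A.T:

import numpy as np

A = np.array([[1, 3, 4], [0, 1, 0]])   # 2 x 3
print(A.T)                              # 3 x 2
# [[1 0]
#  [3 1]
#  [4 0]]
print(A.shape, A.T.shape)               # (2, 3) (3, 2)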

Matrix Powers
If a square matrix A is multiplied by itself, the product is denoted A2, and in general

Ak = A x A x ... x A   (k factors)

A0 = I, the identity matrix (it's the convention, right??). Therefore A0x = x (itself). We can also apply the rule (Ap)(Aq) = A(p+q).
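A small NumPy sketch of matrix powers (numpy.linalg.matrix_power handles the A0 = I convention automatically; the matrix here is made up for illustration):

import numpy as np
from numpy.linalg import matrix_power

A = np.array([[1, 1], [0, 1]])

print(matrix_power(A, 0))    # the 2 x 2 identity matrix
print(matrix_power(A, 3))    # [[1 3]
                             #  [0 1]]
# (A^p)(A^q) = A^(p+q)
print(np.array_equal(matrix_power(A, 2) @ matrix_power(A, 3), matrix_power(A, 5)))   # True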

Theorem 3
Suppose A and B represent matrices of appropriate sizes for the following sums and products. Then:

(a) (At)t = A
(b) (A + B)t = At + Bt
(c) For any scalar r, (rA)t = r(At)
(d) (AB)t = BtAt

Parts (a)-(c) follow straight from the definition of the transpose, so the proofs are not written out. For part (d), note that the (i, j)-entry of (AB)t is the (j, i)-entry of AB, which by the Row-Column Rule is (row j of A) x (column i of B). The (i, j)-entry of BtAt is (row i of Bt) x (column j of At), which is the same scalar product of (column i of B) with (row j of A). The corresponding entries agree, so (AB)t = BtAt.
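Part (d) is also easy to check numerically; a quick NumPy sketch (the specific matrices are made up for illustration):

import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])     # 3 x 2
B = np.array([[7, 8, 9], [0, 1, 2]])       # 2 x 3

print(np.array_equal((A @ B).T, B.T @ A.T))   # True
print(np.array_equal((A @ B).T, A.T @ B.T))   # False: the order must be reversed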

2.4 Partitioned Matrices


I see you've chosen the Red Pill. Welcome to Partitioned Matrices - not as exciting as an action movie, but it might still make you say "Whoa."

By now you're teeming with anticipation, eager to expand your mind. Let's start by clearing a few things up and understanding what we are working with. Why don't I ask your first question for you (because you might feel silly talking to a computer screen):

Q. What is a Partitioned Matrix and what does it have to do with me? A. Ah, good question. Well, a Partitioned Matrix is a matrix that has been broken down into several smaller matrices. But why tell you when I can show you a picture?

Let's say I have a 5x4 matrix called G.

And now a partitioned version (with the partition lines in red):

And now we name the individual parts (AKA: Blocks or Submatrices):

Now we can rewrite G as a 3x2 block matrix:

Now doesn't that look a lot nicer than our original G? Of course it does. Now, to address the second part of your question: partitioning lets a big matrix computation be broken into smaller block computations, which is how high-performance software (yes, even supercomputers) handles matrices too large to work with all at once. Isn't that exciting? Don't answer that. On to more questions...

Q. Can I add Partitioned Matrices to each other? A. Sure, as long as the matrices being added are the same size and are partitioned in exactly the same way. Then each block can be added to its corresponding block. Too easy. How about another question?

Q. Can I multiply Partitioned Matrices by a scalar? A. Sure, just multiply each block by the scalar, one block at a time.

Come on, give me a harder one

Q. How can I multiply Partitioned Matrices by each other? A. Okay, now you've asked a tough one. The best way to explain this is through an example. In the following example, uppercase letters will represent blocks. First, matrix J is partitioned like so:

J = [A B]
    [C D]

And matrix K is partitioned like so:

K = [E]
    [F]

First of all, partitioned matrices can only be multiplied when the column partition of the first matrix matches the row partition of the second matrix, so that every block product (such as AE) is defined. When that holds, the blocks behave just like ordinary entries, and the product is:

J K = [AE + BF]
      [CE + DF]

Now we expand each one of the Blocks.
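Here is a sketch of that block multiplication in NumPy (the block names A-F follow the slide, but the actual numbers are made up for illustration):

import numpy as np

# Blocks of J (top row A, B; bottom row C, D) and of K (E stacked on F)
A = np.array([[1, 2], [3, 4]]);  B = np.array([[5], [6]])
C = np.array([[7, 8]]);          D = np.array([[9]])
E = np.array([[1, 0, 2], [0, 1, 3]])
F = np.array([[4, 5, 6]])

J = np.block([[A, B], [C, D]])   # 3 x 3
K = np.block([[E], [F]])         # 3 x 3

# Block formula: JK = [AE + BF]
#                     [CE + DF]
blockwise = np.block([[A @ E + B @ F], [C @ E + D @ F]])
print(np.array_equal(J @ K, blockwise))   # True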

Q. What theorems can we get from this method of multiplying Partitioned Matrices? A. Well, you're a curious one, aren't you?
Well, in fact there is a theorem. It's called the Column-Row Expansion Theorem, and it says that because blocks multiply like entries, the columns of the first matrix pair up with the rows of the second matrix: the product AB can be computed as a sum of (column of A) times (row of B) products,

AB = col1(A) row1(B) + col2(A) row2(B) + ... + coln(A) rown(B)     (1)

Here's the proof:

For each row index i and column index j, the (i, j)-entry of colk(A) rowk(B) is the product of a(i, k) from column k of A and b(k, j) from row k of B. Hence the (i, j)-entry of the sum shown in (1) is

a(i, 1)b(1, j) + a(i, 2)b(2, j) + ... + a(i, n)b(n, j)

This sum is also the (i, j)-entry of AB, by the Row-Column Rule.
Or you can just take my word for it.
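Or, if you'd rather not take my word for it, here is the Column-Row Expansion checked in NumPy (illustration only; the matrices are made up):

import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])   # 3 x 2
B = np.array([[7, 8, 9], [0, 1, 2]])     # 2 x 3

# AB as a sum of outer products: col_k(A) times row_k(B)
expansion = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
print(np.array_equal(expansion, A @ B))   # True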

The End
Produced by NYU Math Masters
