Matrix
From Math Images
===Matrix Multiplication: Proofs of Properties===
For matrices ''A'' (<math>m \times n</math>), ''B'' (<math>n \times p</math>), and ''C'' (<math>p \times q</math>), recall that the product ''AB'' is the <math>m \times p</math> matrix with entries
:::<math>(AB)_{ij}= \sum_{k=1}^n A_{ik}B_{kj}</math>

1. '''Proof that Matrix Multiplication is Associative:''' <math>(AB)C=A(BC)</math>
By the definition of matrix equality, we want to show that the corresponding entries of the two grouped products agree:
:::::<math>((AB)C)_{ij}=(A(BC))_{ij}</math>
Working from the left-hand side, we apply the definition of matrix multiplication to the matrices ''(AB)'' and ''C'':
:::::<math>((AB)C)_{ij} = \sum_{k=1}^p (AB)_{ik}C_{kj}</math>
Applying the definition again, we break ''(AB)'' down to the real-number entries of ''A'' and ''B'':
:::::<math>\sum_{k=1}^p (AB)_{ik}C_{kj} = \sum_{k=1}^p\left(\sum_{l=1}^n A_{il}B_{lk}\right)C_{kj}</math>
Because we have reduced the matrices down to their real-number entries, properties of real numbers apply. Switching the order of summation by the commutative and associative properties of real numbers:
:::::<math>\sum_{k=1}^p\left(\sum_{l=1}^n A_{il}B_{lk}\right)C_{kj} =\sum_{l=1}^n A_{il}\left(\sum_{k=1}^p B_{lk}C_{kj}\right)</math>
By the definition of matrix multiplication applied to the matrix ''BC'':
:::::<math>\sum_{l=1}^n A_{il}\left(\sum_{k=1}^p B_{lk}C_{kj}\right) = \sum_{l=1}^n A_{il}(BC)_{lj}</math>
Applying the definition once more, now to the product of ''A'' and ''(BC)'':
:::::<math>\sum_{l=1}^n A_{il}(BC)_{lj} =(A(BC))_{ij}</math>
Thus:
:::::<math>((AB)C)_{ij}=(A(BC))_{ij}</math>
as desired; matrix multiplication is associative.

2. '''Proof that Matrix Multiplication is Distributive over Addition:''' <math>A(B+C)=AB+AC</math>
Here ''B'' and ''C'' must be the same size, <math>n \times p</math>, so that the sum ''B'' + ''C'' is defined. By the definition of matrix equality, we want to show:
:::::<math>(A(B+C))_{ij} = (AB+AC)_{ij}</math>
Working from the left-hand side, we apply the definition of matrix multiplication to the composite matrix ''A(B+C)'':
:::::<math>(A(B+C))_{ij}=\sum_{k=1}^n A_{ik}(B+C)_{kj}</math>
By the definition of matrix addition, we break the entries of ''(B+C)'' down to the real-number entries of ''B'' and ''C'':
:::::<math>\sum_{k=1}^n A_{ik}(B+C)_{kj}=\sum_{k=1}^n A_{ik}(B_{kj}+C_{kj})</math>
Because we are now dealing with real-number entries, the distributive property of real numbers applies:
:::::<math>\sum_{k=1}^n A_{ik}(B_{kj}+C_{kj})=\sum_{k=1}^n(A_{ik}B_{kj} + A_{ik}C_{kj})</math>
Splitting the sum by the commutativity and associativity of real-number addition:
:::::<math>\sum_{k=1}^n(A_{ik}B_{kj} + A_{ik}C_{kj})=\sum_{k=1}^n A_{ik}B_{kj} + \sum_{k=1}^n A_{ik}C_{kj}</math>
By the definition of matrix multiplication applied to ''(AB)'' and ''(AC)'', followed by the definition of matrix addition:
:::::<math>\sum_{k=1}^n A_{ik}B_{kj} + \sum_{k=1}^n A_{ik}C_{kj} = (AB)_{ij} + (AC)_{ij} = (AB+AC)_{ij}</math>
Thus the left-hand side equals the right-hand side of our original claim; matrix multiplication distributes over addition.
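The two identities proved above can be spot-checked numerically. Below is a minimal sketch in Python using list-of-lists matrices; the helper names `mat_mul` and `mat_add` and the sample entries are illustrative, not from the article:

```python
def mat_mul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B (lists of lists)."""
    m, n, p = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

def mat_add(A, B):
    """Entrywise sum of two same-size matrices."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 3, 4], [2, 0, 1]]      # 2 x 3
B = [[2, 1], [0, 3], [1, 1]]    # 3 x 2
C = [[1, 0], [2, 1]]            # 2 x 2
D = [[1, 1], [1, 0], [0, 2]]    # 3 x 2, same size as B

# Associativity: (AB)C = A(BC)
assert mat_mul(mat_mul(A, B), C) == mat_mul(A, mat_mul(B, C))
# Distributivity: A(B + D) = AB + AD
assert mat_mul(A, mat_add(B, D)) == mat_add(mat_mul(A, B), mat_mul(A, D))
```

Of course, checking one example is not a proof; the derivations above are what establish the identities for all compatible matrices.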
Revision as of 14:31, 26 July 2013
This is a Helper Page for:


Blue Fern 
Summation Notation 
Change of Coordinate Systems 
Math for Computer Graphics and Computer Vision 
A matrix is a rectangular array of numbers that can be used to store values for later access. In this helper page, we will discuss the mathematical properties of matrices.
Size
A matrix is typically described in terms of its size. If a matrix M has m rows and n columns, we say that M has size m x n, or more simply, that M is m x n. The numbers m and n are sometimes called the dimensions of M.
Matrix Notation
Matrices and their entries are closely related. To avoid confusion, separate notation exists for both matrices and entries.
Before we begin, we must introduce the general notation used to indicate the location of an entry in a matrix. For a matrix A, a_{ij} indicates the entry located in the ith row and jth column. Below is matrix A with entries whose row and column locations are explicitly shown.
1. Notation for a Matrix:
 A = [a_{ij}]; a matrix is the same thing as its general entry enclosed in brackets.
 In other words, a matrix can be defined by just a capital letter or by the general entry inside brackets.
2. Notation for an Entry:
 (A)_{ij} and a_{ij} are two ways of saying the same thing.
 In other words, an entry, typically represented as a lowercase letter with a subscript, can also be written by enclosing the name of the matrix in parentheses. This convention applies even to composite matrices such as C(BA), whose typical entry is denoted (C(BA))_{ij}.
Matrix Equality
In order to prove identities about matrices, we need to first define what it means for two matrices to be equal!
Definition: Two matrices A and B are equal, and we write A = B, if they are the same size and a_{ij} = b_{ij} for all rows i and columns j.
In other words, matrices are equal if they are of the same size and have the same corresponding entries.
In proving identities and properties of matrices, this definition comes in handy because the proofs rely on showing that the general entry of one matrix equals the general entry of the other (a_{ij} = b_{ij}); if the sizes also match, the two matrices are equal.
Matrix Operations
There are three fundamental matrix operations: addition, scalar multiplication, and matrix multiplication. Properties for each operation are given below along with proofs.
Matrix Addition
For matrix addition, the two matrices A and B must have the same size, in which case their matrix sum is defined entrywise by (A+B)_{ij} = a_{ij} + b_{ij}. That is, each entry in the sum is the sum of the corresponding entries in the separate matrices.
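The entrywise definition translates directly to code. A minimal sketch in Python, with the helper name `mat_add` chosen for illustration:

```python
def mat_add(A, B):
    # Addition is only defined for matrices of the same size.
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "sizes must match"
    # (A + B)_ij = a_ij + b_ij: add corresponding entries.
    return [[a + b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(A, B)]

print(mat_add([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # → [[6, 8], [10, 12]]
```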
Properties of Matrix Addition
 Addition is Commutative: A + B = B + A
 Addition is Associative: (A + B) + C = A + (B + C)
 Identity of Addition: A + Z = Z + A = A, where Z is the zero matrix of the same size as A
Proofs for properties of Matrix Addition
Scalar Multiplication
Definition: For a matrix A and a scalar value k, the scalar product kA is the matrix of the same size as A defined entrywise by:
(kA)_{ij} = k a_{ij}
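Scalar multiplication is equally direct in code. A minimal sketch, with the helper name `scalar_mul` chosen for illustration:

```python
def scalar_mul(k, A):
    # (kA)_ij = k * a_ij: every entry is scaled by k; the size is unchanged.
    return [[k * entry for entry in row] for row in A]

print(scalar_mul(3, [[1, 2], [0, -1]]))  # → [[3, 6], [0, -3]]
```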
Properties of Scalar Multiplication
 k(A + B) = kA + kB, (k + l)A = kA + lA, and (kl)A = k(lA), where k and l are real numbers.
Proofs of Scalar Multiplication Properties
Matrix Multiplication
If matrix A located on the left has the same number of columns as the number of rows in matrix B located on the right, then we can multiply A by B. To be mathematically precise, if A is and B is , then the matrix product AB exists and is an matrix. This matrix product is defined by each of its entries, , which is the Dot Product of the i^{th} row of A and the j^{th} column of B. Because of the requirement on the dimensions of the matrices, these two vectors have the same size, so the dot products make sense.
The following is an example of matrix multiplication:
Now A has size 2x3 and B has size 3x2, so the product AB will have size 2x2. The entries of AB are the dot products of the two rows of A and the two columns of B. Slicing the A matrix in terms of rows and the B matrix in terms of columns like so:
We get product AB equal to:
But numbers make more sense, so let’s consider two more specific matrices.
so the final product is
This animation illustrates the process of matrix multiplication:
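In code, the same row-by-column process looks like the following sketch; the helper names and the sample entries are illustrative rather than taken from the article:

```python
def dot(u, v):
    # Dot product of two equal-length vectors.
    return sum(x * y for x, y in zip(u, v))

def mat_mul(A, B):
    # Entry (i, j) of AB is the dot product of row i of A and column j of B.
    cols_B = list(zip(*B))            # transpose B to walk its columns
    return [[dot(row, col) for col in cols_B] for row in A]

A = [[1, 3, 4],
     [2, 0, 1]]        # 2 x 3 (entries chosen for illustration)
B = [[1, 2],
     [0, 1],
     [2, 0]]           # 3 x 2

print(mat_mul(A, B))   # → [[9, 5], [4, 4]], a 2 x 2 matrix
```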
There are two special matrices that may appear when making basic operations. The zero matrix is a matrix whose entries are all zero. If we call this Z, then clearly Z+A = A+Z = A for all matrices A for which this makes sense, so the zero matrix behaves like the number 0 for addition. The identity matrix is a square matrix that has a_{ii}=1, but a_{ij}=0 if i≠j. That is, the identity matrix has ones down the main diagonal and zeros everywhere else.
Matrix multiplication has one other interesting property: it is not commutative. Generally, AB ≠ BA for matrices. You can quickly see this for the matrices A and B above: while AB is a 2x2 matrix, BA is a 3x3 matrix.
Properties of Matrix Multiplication
Let A be m x n and B be n x p. Then the product AB is an m x p matrix where:
(AB)_{ij} = Σ_{k=1}^{n} A_{ik} B_{kj}
The above summation formula is a reasonable representation of matrix multiplication because the sum is the dot product of A_{i} (the ith row of A) and B_{j} (the jth column of B), for any matrices A and B of compatible sizes.
For matrices A, m x n, B, n x p, and C, p x q:
 Matrix Multiplication is Associative: (AB)C = A(BC)
 Matrix Multiplication is Distributive over Addition: A(B+C) = AB + AC
 Identity for Multiplication: AI = IA = A, where I is the identity matrix of the appropriate size
Proofs of Matrix Multiplication Properties
Matrix Transposition
Another matrix operation you might see frequently is transposition. Matrix transposition replaces each row of a matrix with the corresponding column of the same matrix. The transpose is written as A^{T} and is defined by (A^{T})_{ij} = a_{ji}. In other words, if A is m x n then A^{T} is n x m; A doesn't have to be a square matrix in order to have a transpose. To make all this a little clearer, let's look at the example for a nonsquare matrix A:
If we take the transpose of the above A^{T}, we see that:
We are back to the original matrix, A, which we started with. This leads us to one of the properties of matrix transposition, stated formally as #1 in the Properties below. Intuitively, transposition reflects the entries over the main diagonal, as shown in the picture below.
Properties of Matrix Transposition
 Self-inverse of the transpose: (A^{T})^{T} = A
 Transposes preserve addition: (A + B)^{T} = A^{T} + B^{T}
 Transposes reverse the ordering of the matrices: (AB)^{T} = B^{T}A^{T}
 Transposes preserve scalar multiplication: (kA)^{T} = k(A^{T}), where k is a scalar multiple.
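These transposition properties can be spot-checked numerically. A short sketch with illustrative helper names (`transpose`, `mat_mul`), not from the article:

```python
def transpose(A):
    # (A^T)_ij = a_ji: rows become columns.
    return [list(col) for col in zip(*A)]

def mat_mul(A, B):
    # Standard row-by-column matrix product.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2, 3], [4, 5, 6]]    # 2 x 3
B = [[1, 0], [0, 1], [2, 2]]  # 3 x 2

# Property 1: transposing twice returns the original matrix.
assert transpose(transpose(A)) == A
# Property 3: (AB)^T = B^T A^T -- note the order reverses.
assert transpose(mat_mul(A, B)) == mat_mul(transpose(B), transpose(A))
```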
Proofs of Matrix Transposition Properties
Matrices Are Functions
One of the most common ways to work with matrices is as functions from one Cartesian space to another. We can see this if we think of a point as a vector, and then of a vector as a matrix: the vector V = (x, y, z) can be written as a column of its three coordinates, and in this form it is a 3x1 matrix. Then we can multiply a 3x3 matrix A by V and get another 3x1 matrix as a result.
So the matrix A represents a function on 3D Cartesian space. Matrix multiplication has the properties we would expect of a set of functions: it is associative, it is linear (that is, if V = V_{1} + V_{2}, then AV = AV_{1} + AV_{2}), and it has an identity I. In the later pages on transformations we will see exactly how we use matrices as functions in computer graphics. One such page is Math for Computer Graphics and Computer Vision.
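The matrix-as-function view can be sketched in code. A minimal illustration with list-based vectors; the function name `apply_matrix` and the sample matrix are assumptions made for the example:

```python
def apply_matrix(A, v):
    # Treat the 3-vector v as a 3 x 1 column matrix and compute A v.
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]          # doubles the x-coordinate (illustrative choice)
v1, v2 = [1, 2, 3], [4, 5, 6]

print(apply_matrix(A, v1))  # → [2, 2, 3]

# Linearity: A(v1 + v2) = A v1 + A v2
lhs = apply_matrix(A, [a + b for a, b in zip(v1, v2)])
rhs = [a + b for a, b in zip(apply_matrix(A, v1), apply_matrix(A, v2))]
assert lhs == rhs
```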
References
This page was originally written by Steve Cunningham.
Source for transpose figure: http://www.katjaas.nl/transpose/transpose.html
Additional Information
Another explanation of matrix operations: http://www.miislita.com/informationretrievaltutorial/matrixtutorial2matrixoperations.html
http://www.millersville.edu/~bikenaga/linearalgebra/matrixproperties/matrixproperties.html