Lecture 8
Recall
Matrix Algebra: addition, subtraction, scalar multiplication, and matrix multiplication.
Matrix multiplication can be thought of in several ways: entry by entry, column by column (each column of $AB$ is $A$ applied to the corresponding column of $B$), or row by row.
For $AB$ to be defined, the number of columns of $A$ must equal the number of rows of $B$. If $A$ is $m \times n$ and $B$ is $n \times p$, then their product $AB$ is defined and is an $m \times p$ matrix, with entries
$$(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}.$$
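A quick sketch in NumPy makes the dimension rule concrete (the sizes here are arbitrary illustrations, not from the lecture):

```python
import numpy as np

A = np.ones((2, 3))  # a 2 x 3 matrix
B = np.ones((3, 4))  # a 3 x 4 matrix

C = A @ B            # defined: inner dimensions (3 and 3) match
print(C.shape)       # (2, 4) -- the outer dimensions give the product's size

# B @ A would raise a ValueError: B has 4 columns but A has only 2 rows
```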
It is important to note a few points:
1. $BA$ need not even be defined when $AB$ is. Thus $AB \neq BA$ in general. Consider for example $A$ of size $2 \times 3$ and $B$ of size $3 \times 3$. Then $AB$ is defined (and is $2 \times 3$), but $BA$ is not defined, since $B$ has $3$ columns while $A$ has only $2$ rows.
2. In some cases, both $AB$ and $BA$ are defined. However, it is still not the case that they are equal. Suppose we have two matrices, $A$ of size $m \times n$ and $B$ of size $n \times m$ with $m \neq n$. Then the dimensions of their products will be given by $AB$ being $m \times m$ and $BA$ being $n \times n$; since the dimensions do not match, the matrices cannot be equal.
3. Finally, even if $AB$ and $BA$ are of the same size, they still may be unequal. Suppose we have two $n \times n$ matrices $A$ and $B$. Both of their products will be of size $n \times n$, yet they may not be equal. For example, with
$$A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix},$$
we find
$$AB = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}, \qquad BA = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}.$$
This is most easily seen by comparing the top-left entry of both products: $2$ for the first product and $1$ for the second.
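As a quick numerical check, here is a minimal NumPy sketch using the same illustrative matrices:

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
B = np.array([[1, 0],
              [1, 1]])

print(A @ B)                          # [[2 1], [1 1]]
print(B @ A)                          # [[1 1], [1 2]]
print(np.array_equal(A @ B, B @ A))   # False: the two products differ
```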
We can see then that matrix multiplication is fundamentally a non-commutative operation. This is very different from the kind of algebra that we are familiar with. There are several other instances where matrix multiplication leads to counterintuitive results.
1. Consider $(A + B)^2$. We normally write $(a + b)^2 = a^2 + 2ab + b^2$ for numbers, but note that for matrices,
$$(A + B)^2 = (A + B)(A + B) = A^2 + AB + BA + B^2.$$
As we have just found, $AB$ need not be equal to $BA$, so we cannot write that $(A + B)^2 = A^2 + 2AB + B^2$.
2. Consider also the normally valid identity
$$(a + b)(a - b) = a^2 - b^2.$$
This identity need not hold for matrices, since
$$(A + B)(A - B) = A^2 - AB + BA - B^2,$$
which equals $A^2 - B^2$ only when $AB = BA$.
3. We regularly make use of the fact that if $ab = 0$, then one of $a$ or $b$ (or both) is zero. This is a fundamental property of what are called integral domains. Does this hold for matrices? Consider
$$\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$
The two matrices on the left are non-zero matrices, yet their product is the zero matrix. Thus $AB = 0$ does not imply that one of $A$ or $B$ is zero.
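The same phenomenon can be checked numerically (a sketch; these particular matrices are one choice among many):

```python
import numpy as np

A = np.array([[1, 0],
              [0, 0]])
B = np.array([[0, 0],
              [0, 1]])

print(A @ B)                  # [[0 0], [0 0]] -- the zero matrix
print(np.any(A), np.any(B))   # True True: neither factor is the zero matrix
```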
Matrix Equations
We have discussed previously, and become comfortable with, the notion that a system of linear equations can be expressed as a so-called augmented matrix $[A \mid \mathbf{b}]$. Now, however, we can also write this system as
$$A\mathbf{x} = \mathbf{b},$$
where $A$ is the coefficient matrix, $\mathbf{x}$ is the column vector of unknowns, and $\mathbf{b}$ is the column vector of constants. This form should look very familiar: $A\mathbf{x} = \mathbf{b}$ looks just like the ordinary linear equation $ax = b$.
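As a sketch of this viewpoint in code (the system below is an invented example, not the one from the lecture; `np.linalg.solve` is NumPy's standard routine for such equations):

```python
import numpy as np

# The system  x + 2y = 5,  3x + 4y = 6  written as A @ x = b
A = np.array([[1, 2],
              [3, 4]], dtype=float)
b = np.array([5, 6], dtype=float)

x = np.linalg.solve(A, b)     # solves the matrix equation A x = b
print(x)                      # [-4.   4.5]
print(np.allclose(A @ x, b))  # True: the solution satisfies the system
```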
Special Types of Matrices
There are many special types of matrices.
Square Matrices:
1. Diagonal matrix: a diagonal matrix has non-zero entries only on its main diagonal, for example
$$D = \begin{pmatrix} d_1 & 0 & 0 \\ 0 & d_2 & 0 \\ 0 & 0 & d_3 \end{pmatrix}.$$
2. Upper triangular matrix: an upper triangular matrix has only zero entries below its main diagonal, for example
$$U = \begin{pmatrix} a & b & c \\ 0 & d & e \\ 0 & 0 & f \end{pmatrix}.$$
3. Lower triangular matrix: a lower triangular matrix has only zero entries above its main diagonal, for example
$$L = \begin{pmatrix} a & 0 & 0 \\ b & c & 0 \\ d & e & f \end{pmatrix}.$$
4. Identity matrix ($I$): the identity matrix is a diagonal matrix with $1$s on all of its diagonal entries, for example in the $3 \times 3$ case
$$I = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.$$
The identity matrix is very important because multiplying any square matrix $A$ by the identity matrix will yield $A$:
$$AI = IA = A.$$
In this way, $I$ is to matrices what the number $1$ is to the real numbers.
5. Permutation matrix: a permutation matrix is formed by permuting (swapping) the rows of the identity matrix.

Remark: The set of all $n \times n$ permutation matrices is called $S_n$. For example:
$$S_2 = \left\{ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \right\}.$$
Note that taking any two elements in $S_n$ and multiplying them together yields another element in $S_n$. That is, $S_n$ is closed under multiplication. The name for this type of structure is a group. One feature is that every element has an inverse: for a permutation matrix $P$, the inverse $P^{-1} = P^T$ is again a permutation matrix. There are $n!$ permutation matrices of a given size $n$.
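Here is a small computational sketch of these facts (the function name `permutation_matrices` and the closure check are my own illustration, not notation from the lecture):

```python
import itertools
import numpy as np

def permutation_matrices(n):
    """All n x n permutation matrices: the identity with its rows permuted."""
    I = np.eye(n, dtype=int)
    return [I[list(p)] for p in itertools.permutations(range(n))]

S3 = permutation_matrices(3)
print(len(S3))  # 6 == 3!  (there are n! permutation matrices of size n)

# Closure: the product of any two permutation matrices is again one of them
closed = all(
    any(np.array_equal(P @ Q, R) for R in S3)
    for P in S3 for Q in S3
)
print(closed)   # True
```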
Transpose:
Transposition is an operation unique to matrices. Let $A$ be an $m \times n$ matrix (not necessarily square). Then we define the transpose of $A$, written $A^T$, to be the $n \times m$ matrix whose columns are the rows of $A$.
For example,
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}, \qquad A^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}.$$
Some properties of the transpose include:
1. $(A^T)^T = A$
2. $(A + B)^T = A^T + B^T$
3. $(cA)^T = cA^T$ for any scalar $c$
4. $(AB)^T = B^T A^T$ (note the reversal of order)
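These properties can be spot-checked numerically (a sketch; random integer matrices stand in for a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(2, 3))
B = rng.integers(-5, 5, size=(2, 3))  # same size as A, for property 2
C = rng.integers(-5, 5, size=(3, 4))  # compatible with A, for property 4

print(np.array_equal(A.T.T, A))              # 1: (A^T)^T = A
print(np.array_equal((A + B).T, A.T + B.T))  # 2: (A+B)^T = A^T + B^T
print(np.array_equal((3 * A).T, 3 * A.T))    # 3: (cA)^T = c A^T
print(np.array_equal((A @ C).T, C.T @ A.T))  # 4: (AB)^T = B^T A^T
```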
Symmetric Matrices
Transposition allows us to define two important types of matrices:
- $A$ is symmetric if $A^T = A$.
- $A$ is skew-symmetric if $A^T = -A$.
Examining these definitions, it is easy to see that a symmetric or skew-symmetric matrix must be an $n \times n$ (square) matrix, even though transposition is defined for all matrices.
Let us examine some features of symmetric matrices. Consider $A^T A$ with $A$ an $m \times n$ matrix. Is $A^T A$ symmetric? We compute
$$(A^T A)^T = A^T (A^T)^T = A^T A.$$
Therefore $A^T A$ is symmetric.
Consider $B = A - A^T$ for a square matrix $A$. Is $B$ symmetric or skew-symmetric? We compute
$$B^T = (A - A^T)^T = A^T - A = -(A - A^T) = -B.$$
Thus $B = A - A^T$ is skew-symmetric.
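Both facts can be spot-checked numerically (a sketch with random matrices of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-5, 5, size=(2, 3))  # rectangular is fine for A^T A
B = rng.integers(-5, 5, size=(3, 3))  # square, for B - B^T

S = A.T @ A   # should be symmetric
K = B - B.T   # should be skew-symmetric

print(np.array_equal(S.T, S))   # True: (A^T A)^T = A^T A
print(np.array_equal(K.T, -K))  # True: (B - B^T)^T = -(B - B^T)
```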
It is natural to wonder what types of matrices are symmetric or skew-symmetric. Consider for example
$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad A^T = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}.$$
Note that $A^T \neq A$ and $A^T \neq -A$. Thus $A$ is neither symmetric nor skew-symmetric.
An important observation can be made by asking: what kinds of matrices can in general be skew-symmetric? Write a general $2 \times 2$ matrix as
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad \text{so that} \qquad A^T = \begin{pmatrix} a & c \\ b & d \end{pmatrix}, \qquad -A = \begin{pmatrix} -a & -b \\ -c & -d \end{pmatrix}.$$
For $A$ to be skew-symmetric, we need $A^T = -A$. Thus $a = -a$ and $d = -d$ (and $c = -b$). The diagonal conditions can only be true when $a = d = 0$. This can be generalized to any size $n$: for a skew-symmetric matrix, each diagonal entry satisfies $a_{ii} = -a_{ii}$, which forces $a_{ii} = 0$.

Proposition: If $A$ is a skew-symmetric matrix, then its diagonal entries must all be $0$.
Here is a question to consider. If you recall, for any function $f(x)$, it is possible to write $f$ as the sum of an even and an odd function. Is a similar result true for matrices? If it were true that this is the case, then we would want
$$A = S + K,$$
where $S$ is symmetric and $K$ is skew-symmetric. Taking transposes gives
$$A^T = S^T + K^T = S - K.$$
Adding and subtracting these two equations, we obtain
$$S = \frac{1}{2}\left(A + A^T\right), \qquad K = \frac{1}{2}\left(A - A^T\right).$$
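A small sketch of the decomposition in NumPy (the input matrix is an arbitrary choice):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])

S = (A + A.T) / 2  # symmetric part
K = (A - A.T) / 2  # skew-symmetric part

print(np.allclose(S.T, S))    # True: S is symmetric
print(np.allclose(K.T, -K))   # True: K is skew-symmetric
print(np.allclose(S + K, A))  # True: A = S + K
```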
What is the use of this result? Apart from it being interesting that such an expression of an arbitrary square matrix is always possible, the advantage of the decomposition is this: any insight we gain into the features of symmetric and skew-symmetric matrices can then be applied to arbitrary matrices through it.