Lecture 8

A. Agarwal
February 14, 2012

Recall

Matrix Algebra: addition, subtraction, scalar multiplication, and matrix multiplication.

Matrix multiplication can be thought of in several ways:

$$\mathbf{A}\mathbf{B} = \begin{bmatrix} \leftarrow \; i\text{th row} \; \rightarrow \end{bmatrix} \begin{bmatrix} \uparrow \\ j\text{th column} \\ \downarrow \end{bmatrix}$$

Each column of $\mathbf{AB}$ is a linear combination of the columns of $\mathbf{A}$; each row of $\mathbf{AB}$ is a linear combination of the rows of $\mathbf{B}$.

For ๐€๐ to be defined, the number of columns of ๐€ must equal the number of rows of ๐. If ๐€mร—n and ๐nร—p, then their product ๐‚=๐€๐ is defined and

๐‚=(๐€๐)mร—p

It is important to note a few points:

  1. $\mathbf{BA}$ need not even be defined, so in general $\mathbf{AB} \neq \mathbf{BA}$.

     Consider for example

     $$\mathbf{A}_{3 \times 4} \qquad \mathbf{B}_{4 \times 5} \qquad (\mathbf{AB})_{3 \times 5}$$

     but $\mathbf{BA}$ is not defined.

  2. In some cases both $\mathbf{AB}$ and $\mathbf{BA}$ are defined. However, it is still not the case that they are equal. Suppose we have two matrices $\mathbf{A}_{3 \times 2}$ and $\mathbf{B}_{2 \times 3}$. Then the dimensions of their products are

     $$(\mathbf{AB})_{3 \times 3} \neq (\mathbf{BA})_{2 \times 2}$$

     Since the dimensions do not match, the matrices cannot be equal.

  3. Finally, even when $\mathbf{AB}$ and $\mathbf{BA}$ are the same size, they may still be unequal. Suppose we have two matrices $\mathbf{A}_{2 \times 2}$ and $\mathbf{B}_{2 \times 2}$. Both products are of size $2 \times 2$, yet

     $$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 0 & 7 \\ 1 & 2 \end{pmatrix} \neq \begin{pmatrix} 0 & 7 \\ 1 & 2 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$$

     This is most easily seen by comparing the top-left entries of the two products: $2$ for the first, and $21$ for the second.

We can see, then, that matrix multiplication is fundamentally a non-commutative operation. This is very different from the algebra of ordinary numbers that we are familiar with. There are several other instances where matrix multiplication leads to counterintuitive results.
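As a quick numerical check (a numpy sketch, not part of the lecture itself), we can multiply the $2 \times 2$ pair from the example above in both orders and compare:

```python
import numpy as np

# The 2x2 pair from the example above.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 7],
              [1, 2]])

AB = A @ B
BA = B @ A

print(AB[0, 0])                   # 2: top-left entry of AB
print(BA[0, 0])                   # 21: top-left entry of BA
print(np.array_equal(AB, BA))     # False: the products differ
```

A single differing entry is already enough to conclude $\mathbf{AB} \neq \mathbf{BA}$.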

  1. Consider $(\mathbf{A}+\mathbf{B})^2$. For numbers we normally write $(a+b)^2 = a^2 + 2ab + b^2$, but note that for matrices,

     $$\begin{aligned}
     (\mathbf{A}+\mathbf{B})^2 &= (\mathbf{A}+\mathbf{B})(\mathbf{A}+\mathbf{B}) \\
     &= \mathbf{A}(\mathbf{A}+\mathbf{B}) + \mathbf{B}(\mathbf{A}+\mathbf{B}) \\
     &= \mathbf{A}\mathbf{A} + \mathbf{A}\mathbf{B} + \mathbf{B}\mathbf{A} + \mathbf{B}\mathbf{B} \\
     &= \mathbf{A}^2 + \mathbf{A}\mathbf{B} + \mathbf{B}\mathbf{A} + \mathbf{B}^2
     \end{aligned}$$

     As we have just found, $\mathbf{AB}$ need not equal $\mathbf{BA}$, so we cannot write $\mathbf{AB} + \mathbf{BA} = 2\mathbf{AB}$.

  2. Consider also the normally valid identity

     $$\mathbf{A}^2 - \mathbf{B}^2 = (\mathbf{A}-\mathbf{B})(\mathbf{A}+\mathbf{B})$$

     Expanding the right-hand side gives $\mathbf{A}^2 + \mathbf{A}\mathbf{B} - \mathbf{B}\mathbf{A} - \mathbf{B}^2$, so the identity holds only when $\mathbf{AB} = \mathbf{BA}$; it need not hold for matrices.

  3. For numbers we regularly use the fact that if $ab = 0$, then $a$ or $b$ (or both) is zero. This is a fundamental property of what are called integral domains. Does it hold for matrices?

     $$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 0 \\ 0 & 2 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$$

     The two matrices on the left are non-zero, yet their product is the zero matrix. Thus $\mathbf{AB} = \mathbf{0}$ does not imply that $\mathbf{A}$ or $\mathbf{B}$ is zero.
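Both surprises above are easy to reproduce numerically. The sketch below (a numpy illustration, not from the lecture) checks the zero-divisor product and then verifies that the four-term expansion of $(\mathbf{A}+\mathbf{B})^2$ holds while the two-term "$2\mathbf{AB}$" version fails for the non-commuting pair used earlier:

```python
import numpy as np

# Zero divisors: two nonzero matrices whose product is the zero matrix.
A = np.array([[1, 0],
              [0, 0]])
B = np.array([[0, 0],
              [0, 2]])
print(A @ B)  # the 2x2 zero matrix

# (A+B)^2 expands to A^2 + AB + BA + B^2; with a non-commuting pair,
# the familiar A^2 + 2AB + B^2 formula fails.
A2 = np.array([[1, 2], [3, 4]])
B2 = np.array([[0, 7], [1, 2]])
lhs = (A2 + B2) @ (A2 + B2)
print(np.array_equal(lhs, A2 @ A2 + A2 @ B2 + B2 @ A2 + B2 @ B2))  # True
print(np.array_equal(lhs, A2 @ A2 + 2 * (A2 @ B2) + B2 @ B2))      # False
```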

Matrix Equations

We have previously discussed, and become comfortable with, the idea that a system of linear equations can be expressed as a so-called augmented matrix:

$$\left.\begin{aligned} x + 3y &= 5 \\ 2x - 3y &= 7 \end{aligned}\right\} \iff \left[\begin{array}{cc|c} 1 & 3 & 5 \\ 2 & -3 & 7 \end{array}\right]$$

Now, however, we can also write this system as

$$\left.\begin{aligned} x + 3y &= 5 \\ 2x - 3y &= 7 \end{aligned}\right\} \implies \begin{bmatrix} 1 & 3 \\ 2 & -3 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 5 \\ 7 \end{bmatrix}$$

This form should look very familiar: $\mathbf{A}\vec{x} = \vec{b}$. That is, it looks like an ordinary linear equation.
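The $\mathbf{A}\vec{x} = \vec{b}$ form is exactly what numerical solvers consume. As a small illustration (a numpy sketch, not part of the lecture), the system above can be solved directly:

```python
import numpy as np

# The system  x + 3y = 5,  2x - 3y = 7  in the form A x = b.
A = np.array([[1.0, 3.0],
              [2.0, -3.0]])
b = np.array([5.0, 7.0])

x = np.linalg.solve(A, b)
print(x)                       # solution vector: x = 4, y = 1/3
print(np.allclose(A @ x, b))   # True: the solution satisfies A x = b
```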

Special Types of Matrices

There are many special types of matrices.

Square Matrices ($n \times n$)

  1. Diagonal matrix

     A diagonal matrix has non-zero entries only on its main diagonal.

     $$\begin{bmatrix} d_1 & 0 & \cdots & 0 \\ 0 & d_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n \end{bmatrix} \qquad a_{i,j} = 0 \text{ if } i \neq j$$
  2. Upper triangular matrix

     An upper triangular matrix has only zero entries below its main diagonal.

     $$\begin{bmatrix} u_{1,1} & u_{1,2} & \cdots & u_{1,n} \\ 0 & u_{2,2} & \cdots & u_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & u_{n,n} \end{bmatrix} \qquad u_{i,j} = 0 \text{ if } i > j$$
  3. Lower triangular matrix

     A lower triangular matrix has only zero entries above its main diagonal.

     $$\begin{bmatrix} l_{1,1} & 0 & \cdots & 0 \\ l_{2,1} & l_{2,2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ l_{n,1} & l_{n,2} & \cdots & l_{n,n} \end{bmatrix} \qquad l_{i,j} = 0 \text{ if } i < j$$
  4. Identity matrix ($\mathbf{I}_n$)

     The identity matrix is a diagonal matrix with $1$'s on all of its diagonal entries.

     $$\mathbf{I}_n = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}$$

     The identity matrix is very important because multiplying any $n \times n$ matrix $\mathbf{A}$ by $\mathbf{I}_n$ yields $\mathbf{A}$:

     $$\mathbf{A}\mathbf{I}_n = \mathbf{I}_n\mathbf{A} = \mathbf{A}$$

     In this way, $\mathbf{I}_n$ plays the same role among $n \times n$ matrices that the number $1$ plays among the real numbers.

  5. Permutation matrix

     A permutation matrix is formed by permuting (swapping) the rows of the identity matrix. Multiplying by one permutes the entries of a vector:

     $$\begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} z \\ x \\ y \end{bmatrix}$$

     Remark: The set of all permutation matrices of a given size $n$ is called $S_n$. For example,

     $$S_3 = \{\mathbf{A}_{3 \times 3} \mid \mathbf{A} \text{ is a permutation matrix}\}$$

     Note that multiplying any two elements of $S_3$ yields another element of $S_3$; that is, $S_3$ is closed under multiplication. The name for this type of structure is a group. One feature is that $|S_3| = 3! = 6$; in general, there are $n!$ permutation matrices of size $n$.
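The closure claim for $S_3$ can be verified exhaustively, since there are only $6$ matrices to check. The sketch below (a numpy illustration with an arbitrarily chosen test vector; not from the lecture) builds all of $S_3$ by permuting the rows of $\mathbf{I}_3$:

```python
import numpy as np
from itertools import permutations

n = 3
# Build all permutation matrices of size n by permuting the rows of I_n.
I = np.eye(n, dtype=int)
S = [I[list(p)] for p in permutations(range(n))]
print(len(S))  # 6 = 3!

# Closure: the product of any two permutation matrices is again one.
closed = all(
    any(np.array_equal(P @ Q, R) for R in S)
    for P in S for Q in S
)
print(closed)  # True

# The example from the notes: this matrix sends (x, y, z) to (z, x, y).
P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])
print(P @ np.array([4, 5, 6]))  # [6 4 5]
```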

Transpose: $\mathbf{A}^T$

Transposition is an operation unique to matrices. Let $\mathbf{A}_{m \times n}$ be a matrix (not necessarily square). Then we define the transpose of $\mathbf{A}$, written $\mathbf{A}^T$, by letting its columns be the rows of $\mathbf{A}$:

$$\mathbf{A}_{m \times n} \Rightarrow \mathbf{A}^T_{n \times m}$$

For example,

$$\mathbf{A} = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix} \qquad \mathbf{A}^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}$$

Some properties of the transpose include:

  1. $(\mathbf{A}+\mathbf{B})^T = \mathbf{A}^T + \mathbf{B}^T$

  2. $(k\mathbf{A})^T = k\mathbf{A}^T$

  3. $(\mathbf{A}\mathbf{B})^T = \mathbf{B}^T\mathbf{A}^T$ (note the reversed order)

  4. $(\mathbf{A}^T)^T = \mathbf{A}$

Symmetric Matrices

Transposition allows us to define two important types of matrices:

  • $\mathbf{A}$ is symmetric if $\mathbf{A}^T = \mathbf{A}$.

  • $\mathbf{A}$ is skew-symmetric if $\mathbf{A}^T = -\mathbf{A}$.

Examining these definitions, it is easy to see that a symmetric or skew-symmetric matrix must be square: if $\mathbf{A}$ is $m \times n$, then $\mathbf{A}^T$ is $n \times m$, so equality forces $m = n$, even though transposition itself is defined for all matrices.

Let us examine some features of symmetric matrices. Consider $\mathbf{B} = \mathbf{A} + \mathbf{A}^T$ with $\mathbf{A}$ an $n \times n$ matrix. Is $\mathbf{B}$ symmetric?

$$\begin{aligned}
\mathbf{B}^T &= (\mathbf{A}+\mathbf{A}^T)^T \\
&= \mathbf{A}^T + (\mathbf{A}^T)^T \\
&= \mathbf{A}^T + \mathbf{A} \\
&= \mathbf{B}
\end{aligned}$$

Therefore $\mathbf{B}$ is symmetric.

Consider ๐‚=๐€-๐€T. Is ๐‚ symmetric / skew-symmetric?

๐‚T =(๐€-๐€T)T
=๐€T-(๐€T)T
=๐€T-๐€
=-๐‚

Thus ๐‚ is skew-symmetric.

It is natural to wonder which matrices are symmetric or skew-symmetric. Consider for example:

$$\mathbf{A} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \qquad \mathbf{A}^T = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$$

Note that $\mathbf{A}^T \neq \mathbf{A}$ and $\mathbf{A}^T \neq -\mathbf{A}$. Thus $\mathbf{A}$ is neither symmetric nor skew-symmetric.

An important observation can be made by asking: what kinds of matrices can, in general, be skew-symmetric?

$$\mathbf{A} = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \qquad \mathbf{A}^T = \begin{bmatrix} a & c \\ b & d \end{bmatrix}$$

For $\mathbf{A}$ to be skew-symmetric,

$$\mathbf{A}^T = -\mathbf{A} \qquad \begin{bmatrix} a & c \\ b & d \end{bmatrix} = \begin{bmatrix} -a & -b \\ -c & -d \end{bmatrix}$$

Thus $a = -a$ and $d = -d$ (and $c = -b$), which can only be true when $a = d = 0$. This generalizes to any size $n$:

Proposition: If $\mathbf{A}$ is a skew-symmetric matrix, then its diagonal entries must be $0$.

Here is a question to consider. Recall that any function $f(x)$ can be written as the sum of an even function and an odd function. Is a similar result true for matrices?

$$\begin{aligned}
f(x) &= \text{even function} + \text{odd function} \\
\mathbf{A}_{n \times n} &= \text{symmetric matrix} + \text{skew-symmetric matrix}
\end{aligned}$$

If this were the case, then we would want

$$\mathbf{A} = \mathbf{B} + \mathbf{C}$$

where $\mathbf{B}$ is symmetric and $\mathbf{C}$ is skew-symmetric. Then

$$\mathbf{A}^T = \mathbf{B}^T + \mathbf{C}^T = \mathbf{B} - \mathbf{C}$$

Adding and subtracting these two equations, we obtain

$$\frac{\mathbf{A}+\mathbf{A}^T}{2} = \mathbf{B} \qquad \frac{\mathbf{A}-\mathbf{A}^T}{2} = \mathbf{C}$$

What is the use of this result? Apart from it being interesting that such an expression of an arbitrary matrix is always possible, the advantage of this decomposition is that any insight we gain into symmetric and skew-symmetric matrices can then be applied, via the decomposition, to arbitrary matrices.
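The decomposition can be checked on any concrete matrix. The sketch below (a numpy illustration with an arbitrarily chosen $3 \times 3$ matrix; not from the lecture) computes the symmetric and skew-symmetric parts and verifies the claims above, including the proposition about zero diagonal entries:

```python
import numpy as np

# An arbitrary (non-symmetric) 3x3 matrix for illustration.
A = np.array([[1.0, 2.0, 0.0],
              [5.0, 3.0, -1.0],
              [4.0, 7.0, 2.0]])

B = (A + A.T) / 2   # symmetric part
C = (A - A.T) / 2   # skew-symmetric part

print(np.array_equal(B, B.T))    # True: B is symmetric
print(np.array_equal(C, -C.T))   # True: C is skew-symmetric
print(np.array_equal(B + C, A))  # True: A = B + C
print(np.diagonal(C))            # all zeros, as the proposition requires
```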