10

In linear algebra, a matrix $B$ is said to be "similar" to $A$ if $B=C^{-1}AC$ for some invertible matrix $C$; that is, $B$ is the matrix $A$ multiplied on the right by a third matrix $C$ and on the left by its inverse $C^{-1}$.

In ordinary algebra, if I take a number $x$ and multiply it by $\frac{1}{2}$ and then by $2$, those two factors cancel and I get back $x$ itself, not merely a "similar" number. Wouldn't the same cancellation happen in linear algebra? What am I missing?
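
In symbols, the computation I have in mind for numbers is
$$\tfrac{1}{2}\cdot x\cdot 2=\left(\tfrac{1}{2}\cdot 2\right)x=x.$$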

5 Answers

0

I believe the matrices are called similar because they are in the same conjugacy class. Similarity is an equivalence relation and thus partitions the space of $n\times n$ matrices into separate equivalence classes. Those matrices which are similar are considered equivalent. Equivalent in this case means (as user1729 already said here) that they are really the same linear transformation viewed from two different choices of basis.
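
To make the "same linear transformation, different basis" statement concrete (a quick sketch in standard notation): if a linear map $T$ has matrix $A$ with respect to a basis $\mathcal{B}$, and $C$ is the invertible matrix that converts coordinates with respect to a new basis $\mathcal{B}'$ into $\mathcal{B}$-coordinates, then computing $T$ on a vector with $\mathcal{B}'$-coordinates $y$ goes in three steps:
$$y\;\longmapsto\;Cy\;\longmapsto\;A(Cy)\;\longmapsto\;C^{-1}ACy,$$
so the matrix of $T$ with respect to $\mathcal{B}'$ is $C^{-1}AC$: the conjugating matrix is exactly the change of coordinates.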

  • 0
    Well, yes, similarity is an equivalence relation, but in itself that doesn't really answer the question. There are many equivalence relations (=partitions) of $M_n(k)$, but not all of them are useful. Conjugacy is useful, because conjugate matrices are 'similar' in the sense that - as you and others have remarked - they represent the same linear map. 2011-06-29
14

Matrix multiplication is not commutative in general. It corresponds to function composition, which is clearly not commutative in general.
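
For instance, with $f(x)=2x$ and $g(x)=x+1$ one has $(f\circ g)(x)=2x+2$ but $(g\circ f)(x)=2x+1$; likewise for matrices,
$$\begin{bmatrix}1 & 1\\0 & 1\end{bmatrix}\begin{bmatrix}1 & 0\\1 & 1\end{bmatrix}=\begin{bmatrix}2 & 1\\1 & 1\end{bmatrix}\neq\begin{bmatrix}1 & 1\\1 & 2\end{bmatrix}=\begin{bmatrix}1 & 0\\1 & 1\end{bmatrix}\begin{bmatrix}1 & 1\\0 & 1\end{bmatrix}.$$
That is exactly why $C^{-1}AC$ cannot in general be rearranged to $C^{-1}CA=A$: the rearrangement would require $A$ and $C$ to commute.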

  • 2
    @Zev, oh, you meant $n=1$. I see. 2011-06-27
12

The following exercises are relevant to your question (all matrices are assumed to be $n\times n$ square matrices and $V$ is a vector space of dimension $n$):

Exercise 1: Let $A=\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix}$ and let $B=\begin{bmatrix}0 & 1 \\0 & 0\end{bmatrix}$. Prove that:

(a) $AB\neq BA$ and

(b) $A^{-1}BA\neq B$.

Exercise 2: Prove that if $A$ and $B$ are arbitrary matrices and $B$ is invertible, then $AB=BA$ if and only if $B^{-1}AB=A$.
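
A hint for Exercise 2: left-multiplication by $B^{-1}$ is reversible, so
$$AB=BA\;\Longleftrightarrow\;B^{-1}(AB)=B^{-1}(BA)\;\Longleftrightarrow\;B^{-1}AB=A.$$
In other words, the cancellation asked about in the question happens precisely when $A$ and $B$ commute.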

Exercise 3: Let $I$ denote the identity matrix. If $A$ is similar to $I$, then prove that $A=I$.

Exercise 4: Prove that the relation $\equiv$ on the set of all matrices defined by the rule $A\equiv B$ if and only if $A$ is similar to $B$ is an equivalence relation.

Exercise 5: Describe the matrices that constitute an equivalence class consisting of exactly one element. Does there exist an equivalence class consisting of exactly two elements? Prove or give a counterexample. More generally, if $n$ is a positive integer, does there exist an equivalence class consisting of exactly $n$ elements? Prove or give a counterexample. Finally, does there exist an equivalence class consisting of a countably infinite number of elements? Prove or give a counterexample.

Exercise 6: Prove that if $A$ is a matrix such that $AB=BA$ for all invertible matrices $B$, then $A=cI$ for some scalar $c$ where $I$ is the identity matrix. Deduce that if $A\neq cI$ for any scalar $c$, then there exists a matrix $B\neq A$ such that $B$ is similar to $A$. Hence similar matrices that are not equal exist in abundance.

Exercise 7: Let $A$ be a matrix and suppose that $A$ is similar to a diagonal matrix $B$ where all the diagonal entries of $B$ are equal. What can you deduce about $A$?

Exercise 8: If $A$ and $B$ are diagonal matrices, then prove that $AB=BA$. If $B$ is invertible, deduce that $B^{-1}AB=A$.

Exercise 9: Let $p$ be a polynomial. Prove that if $B=C^{-1}AC$ for some invertible matrix $C$, then $p(B)=C^{-1}p(A)C$.
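
A hint for Exercise 9: the inner factors telescope, for example
$$B^2=(C^{-1}AC)(C^{-1}AC)=C^{-1}A(CC^{-1})AC=C^{-1}A^2C,$$
and by induction $B^k=C^{-1}A^kC$ for every $k\geq 0$; taking linear combinations of these powers gives $p(B)=C^{-1}p(A)C$ for any polynomial $p$.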

Exercise 10: Let $A$ be a matrix and suppose that $A$ is similar to a diagonal matrix. Let $\lambda_1,\dots,\lambda_n$ be the diagonal entries of this diagonal matrix (repeated according to multiplicity). If $p$ is the polynomial defined by the rule $p(x)=(x-\lambda_1)\cdots (x-\lambda_n)$, then prove that $p(A)=0$.

Challenging Exercises:

Exercise A: Let $A$ be a matrix with complex entries. Prove that there is an invertible matrix $C$ with complex entries such that $C^{-1}AC$ is an upper triangular matrix.

Exercise B: A matrix $A$ with real or complex entries is said to be self-adjoint if $A$ is equal to its conjugate transpose. (The conjugate transpose of $A$ is the matrix obtained by taking the transpose of $A$ and then taking the complex conjugate of each of the entries of the transpose of $A$.) Prove the spectral theorem; that is, prove that if $A$ is self-adjoint, then there is a unitary matrix $U$ such that $U^{-1}AU$ is a diagonal matrix. (Let us recall that a matrix is unitary if its columns form an orthonormal basis of $V$.)

Exercise C: A matrix $A$ is said to be normal if $AA^{*}=A^{*}A$ where $A^{*}$ denotes the conjugate transpose of $A$. Prove the complex spectral theorem; that is, prove that if $A$ is a normal matrix with complex entries, then there exists a unitary matrix $U$ such that $U^{-1}AU$ is a diagonal matrix.

  • 0
    Dear Qiaochu, this is true; thanks for pointing it out! I have corrected ***Exercise 1***. 2011-06-28
11

Two matrices are similar if they represent the same linear transformation, just viewed with respect to different bases. For example, if I take a linear transformation $T: \mathbb{R}^3\rightarrow \mathbb{R}^3$ and look at the images of, say, $(1, 0, 0)$, $(0, 1, 0)$ and $(0, 0, 1)$, then I will get a matrix, $M_1$. However, if I look at the images of another basis, say $(1, 2, 3)$, $(0, 1, 0)$ and $(0, 0, 1)$, I will get a different matrix, $M_2$.

Crucially, $M_1$ and $M_2$ are similar - one can be obtained from the other via conjugation. So, interestingly, they are the same linear transformation, but they are different matrices.
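
Explicitly (with the usual convention that $M_2$ records the images of the second basis in coordinates with respect to that same basis): if $C$ is the invertible matrix whose columns are the second basis,
$$C=\begin{bmatrix}1 & 0 & 0\\2 & 1 & 0\\3 & 0 & 1\end{bmatrix},$$
then $M_2=C^{-1}M_1C$, so the conjugating matrix is precisely the change of basis.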

However, the problem is that matrix multiplication is not commutative in general. Take any two $2\times 2$ matrices $A$ and $B$ and, chances are, $AB\neq BA$.

3

As explained above, the key point is that the usual matrix product is not commutative.

The Hadmard (or entrywise) product, which is also very useful in matrix theory, gives you an example that similar matrices are the same. In that case, the invertible matrix is a matrix with no zero entry. The inverse matrix is entrywise the inverse of the original matrix.