5
$\begingroup$

Is there only one identity matrix $$\begin{pmatrix} 1&0&\cdots&0\\0&1&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&1\end{pmatrix},$$ or are there different identity matrices for other bases?

A textbook example asks: if $[T]_{\beta} = I$ (the $n\times n$ identity matrix) for some basis $\beta$, is $T$ the identity operator?

  • 0
    Do you know a formula for how matrices change when you change the basis?2012-10-11
  • 0
    Yes: you write the original matrix as a linear combination of the basis matrices, then change the basis and set the result equal to the original matrix. The new coefficients give the changed-basis matrix.2012-10-12
  • 0
    So far I'm the only person who's up-voted this question.2012-10-12
  • 0
    Imray, that's a *procedure*; I asked whether you know a *formula*. That is, if you have an operator $T$, and a basis $\alpha$, and another basis $\beta$, do you know a formula relating $[T]_{\alpha}$ and $[T]_{\beta}$?2012-10-12

2 Answers

4

Suppose we want $$ \begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} p & q \\ r & s \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} $$ to be true regardless of which matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is, so that $\begin{bmatrix} p & q \\ r & s \end{bmatrix}$ is an identity matrix.

Since it's true regardless of which matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is, it must be true in particular when $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$, so we have $$ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} =\begin{bmatrix} p & q \\ r & s \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}. $$ This last equality clearly implies that $\begin{bmatrix} p & q \\ r & s \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$.

Conclusion: if $\begin{bmatrix} p & q \\ r & s \end{bmatrix}$ is an identity matrix, then $\begin{bmatrix} p & q \\ r & s \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$. Therefore there is only one $2\times2$ identity matrix, and the same argument works for bigger matrices.
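The argument above can be checked numerically; here is a small sketch using NumPy (the matrices $P$, $Q$, and the random test matrices are illustrative choices, not from the answer itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# If P @ A = A for *every* 2x2 matrix A, then in particular P @ I = I,
# which forces P = I. So the only candidate identity is np.eye(2).
P = np.eye(2)

# Sanity check: P really does act as an identity on arbitrary matrices.
for _ in range(5):
    A = rng.standard_normal((2, 2))
    assert np.allclose(P @ A, A)

# Any other candidate Q already fails on the single test matrix A = I,
# since Q @ I = Q, which is not I.
Q = np.array([[1.0, 0.5],
              [0.0, 1.0]])
assert not np.allclose(Q @ np.eye(2), np.eye(2))
```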

3

By definition, $[Tv]_\beta = [T]_\beta [v]_\beta = [v]_\beta$ for all $v$. Therefore $Tv=v$ for all $v$, i.e. $T$ is the identity on $V$. But to address your real question, here are some more questions for you:

  • What happens if I try to perform a "change of basis" on the identity matrix?
  • What happens if I try to represent the identity operator (on some $V$) in some basis? What does the identity operator do to the basis vectors?
  • What is the definition of the identity matrix, and what should this have to do with bases for vector spaces?
  • 0
    is $\beta$ the standard ordered basis in your answer, or any basis?2012-10-11
  • 0
    @Imray: $\beta$ is the basis in your textbook example. But if $T$ is the identity, then $[T]_\beta=I$ for any basis $\beta$.2012-10-11
  • 0
    You ask, "What happens if I try to perform a 'change of basis' on the identity matrix?" If I change the basis, say from the standard one to something different, the identity matrix changes, but the non-zero entries are still only on the diagonal. Is that what you mean? I think I'm getting it then... the identity matrix for a basis is simply a matrix with non-zero entries only on its diagonal, and those entries depend on the basis...2012-10-11
  • 1
    @Imray: Here's what I meant. Suppose $B$ is a change of basis matrix (i.e. an invertible matrix). Then the matrix for $I$ under the new basis is given by $B^{-1} I B$. But that is just $I$ again. Can you explain what you mean by the identity matrix "changing"?2012-10-11
  • 0
    Yes, I mean that if I have a $2 \times 2$ identity matrix, meaning it was obtained by taking coefficient 1 on the first and fourth elements of the standard ordered basis for $2 \times 2$ matrices, and I change from the standard basis to something funnier, then the identity matrix changes.2012-10-12
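The change-of-basis computation stated in the comments, $B^{-1} I B = I$, can be verified directly; here is a sketch in NumPy (the invertible matrix $B$ is an arbitrary example chosen for illustration):

```python
import numpy as np

# An arbitrary invertible change-of-basis matrix B (illustrative choice).
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])

I = np.eye(2)

# Representing the identity operator in the new basis: B^{-1} I B.
new_rep = np.linalg.inv(B) @ I @ B

# The result is I again -- the identity matrix is the same in every basis.
assert np.allclose(new_rep, I)
```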