
Let $p$ be a prime number. Let $M$ denote the ring of $2 \times 2$ matrices over the field $F$ of $p$ elements. For $A \in M$, let $C(A)$ denote the set of those matrices $B \in M$ such that $AB = BA$. What are the possible values of the dimension of $C(A)$ over $F$ for $A \in M$?

What I know is this: Given a square $n \times n$ matrix $A$ over a field $F$, it is always true that $A$ commutes with any $B = p(A)$, where $p(x)$ is a polynomial with coefficients in $F$. Now, by the Cayley–Hamilton theorem, $A$ is a root of its characteristic polynomial (which has degree $n$), so every power of $A$ of degree $n$ or higher can be written as a linear combination of the powers up to $n-1$. In our case ($n = 2$), that means every matrix of the form $B = p_0 I + p_1 A$, where $p_0, p_1 \in F$, lies in $C(A)$.

So, does that mean $2 \le \dim(C(A)) \le 4$ (depending on the cases $p_0 = 0$ and $p_1 = 0$)? Is this correct? And what about $p$ being prime: what does that have to do with anything?
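Before the proofs below, one can sanity-check the claim computationally. This brute-force sketch (my own addition, not part of the original question) enumerates every $2 \times 2$ matrix $A$ over $F_2$, counts the matrices commuting with it, and reads off the dimension of $C(A)$ from the count (the centralizer is an $F_p$-subspace, so its size is $p^{\dim}$):

```python
from itertools import product

def matmul(A, B, p):
    """Multiply two 2x2 matrices with entries taken mod p."""
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p
                       for j in range(2)) for i in range(2))

def dim_centralizer(A, p):
    """Dimension of C(A) over F_p, by brute-force counting.

    C(A) is a subspace of the 4-dimensional space of 2x2 matrices,
    so its size is p**dim; we recover dim from the count."""
    count = sum(1 for e in product(range(p), repeat=4)
                for B in [((e[0], e[1]), (e[2], e[3]))]
                if matmul(A, B, p) == matmul(B, A, p))
    d = 0
    while p ** d < count:
        d += 1
    return d

p = 2
dims = {dim_centralizer(((e[0], e[1]), (e[2], e[3])), p)
        for e in product(range(p), repeat=4)}
print(sorted(dims))  # only 2 and 4 occur
```

Over $F_2$ the only dimensions that show up are $2$ (non-scalar $A$) and $4$ (scalar $A$), matching the answer below.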

1 Answer


$\newcommand{\Set}[1]{\left\{ #1 \right\}}$The only cases that occur are $\dim(C(A)) = 2$ and $\dim(C(A)) = 4$.

You can start by looking at the roots of the characteristic polynomial $f$ of $A$ over $F$.

  • If the two roots are in $F$, and coincide, then there are two cases.

    • $\dim(C(A)) = 4$ if $A$ is a scalar matrix,
    • $\dim(C(A)) = 2$ if it is not. (I may expand if needed.)
  • If the two roots are in $F$, but are distinct, then it is not difficult to see that $\dim(C(A)) = 2$. (I may expand if needed.)

If the two roots are not in $F$, then $f$ is irreducible in $F[x]$, and this implies that $$ K = F[A] = \Set{ a_{0} I + a_{1} A : a_{i} \in F } \cong F[x]/(f) $$ is a field with $p^{2}$ elements.

Let $V$ be the underlying vector space. We have $\dim_{F}(V) = 2$, but $\dim_{K}(V) = 1$. Now note that $C(A) \supseteq K$ is nothing other than the ring of endomorphisms of the $1$-dimensional $K$-vector space $V$. As such, $C(A)$ has dimension $1$ over $K$, so that $C(A) = K$, and $C(A)$ has dimension $2$ over $F$.

I have written the latter part so that it generalizes easily to an $n \times n$ matrix $A$ whose characteristic polynomial $f$ is irreducible. Then $C(A) = \Set{ a_{0} I + a_{1} A + \dots + a_{n-1} A^{n-1} : a_{i} \in F} = F[A]$ is a field.
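The irreducible case can be verified numerically. Here is a small sketch (my addition) for $p = 3$ and $A = \begin{pmatrix}0 & 2\\1 & 0\end{pmatrix}$, whose characteristic polynomial $x^2 + 1$ is irreducible over $F_3$: it checks that $C(A)$ coincides with $F[A] = \{a_0 I + a_1 A\}$, has $p^2 = 9$ elements (dimension $2$), and that every nonzero element is invertible, i.e. $F[A]$ really is a field:

```python
from itertools import product

p = 3
A = ((0, 2), (1, 0))   # char poly x^2 + 1, irreducible over F_3

def matmul(X, Y):
    """Multiply two 2x2 matrices with entries taken mod p."""
    return tuple(tuple(sum(X[i][k] * Y[k][j] for k in range(2)) % p
                       for j in range(2)) for i in range(2))

def det(X):
    return (X[0][0] * X[1][1] - X[0][1] * X[1][0]) % p

# The centralizer C(A), computed by brute force.
cent = [((a, b), (c, d)) for a, b, c, d in product(range(p), repeat=4)
        if matmul(A, ((a, b), (c, d))) == matmul(((a, b), (c, d)), A)]

# F[A] = { a0*I + a1*A : a0, a1 in F_3 }.
FA = {tuple(tuple((a0 * (i == j) + a1 * A[i][j]) % p for j in range(2))
            for i in range(2)) for a0 in range(p) for a1 in range(p)}

zero = ((0, 0), (0, 0))
print(len(cent))                                     # 9 = p^2, dimension 2
print(set(cent) == FA)                               # C(A) = F[A]
print(all(det(B) != 0 for B in cent if B != zero))   # nonzeros invertible
```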

  • Thanks for the solution! Some questions: Why are we looking at the roots of the characteristic polynomial? Does that come from theory, or is it just handy here? Are all the matrices that commute with $A$ of the form $p_0 I + p_1 A$, or are there others too? Also, when the two roots coincide, don't we have a characteristic polynomial of the form $(\lambda - \lambda_0)^2 = 0$, and if we plug in $A$ we get $A = \lambda_0 I$, which would mean that $A$ can only be a scalar matrix? Can you please write out the "expands"? (2017-02-12)
  • Also, I don't understand the isomorphism you use: which space is $V$ exactly, and why is $\dim_F(V) = 2$ and $\dim_K(V) = 1$? Can you elaborate? (2017-02-12)
  • Thanks. I will try to address your questions, but I'm unable to do it until tomorrow. (2017-02-12)
  • To your first batch of questions. It just comes in handy. Yes, except when $A$ is scalar, $C(A)$ is what you name, but of course that has to be proved. When the two roots are the same, you could still have $A = \begin{bmatrix}\lambda & 1\\0 & \lambda\end{bmatrix}$. However, it is not difficult to show that if $A' = A - \lambda I = \begin{bmatrix}0 & 1\\0 & 0\end{bmatrix}$, then the elements of $C(A) = C(A')$ are of the form $p_{0} I + p_{1} A'$, or equivalently $q_{0} I + q_{1} A$. (2017-02-13)
  • To your second question. $V$ is the space of column vectors of length $2$ over $F$, on which $M$ operates naturally. It is a space of dimension $2$ over $F$, but over the bigger field $K$ it has dimension $1$. This is similar to the fact that the complex numbers have dimension $2$ as a vector space over the reals, but only dimension $1$ as a vector space over the complex numbers themselves. (2017-02-13)
  • In your first comment-answer, doesn't having $B = p_0 I + p_1 A'$ make $\dim(C(A)) = 3$? Can you also provide the proof for $\dim(C(A))$ when the roots are distinct? (2017-02-13)
  • As to your first question: why? Just compute and you'll see. Second question: if $A = \begin{bmatrix}a & 0\\0 & b\end{bmatrix}$, with $a \ne b$, then any matrix that commutes with $A$ must preserve the eigenspaces (direct calculation), and thus be diagonal. (2017-02-13)
  • Isn't $B = \begin{pmatrix} p_0 & p_1 \\ 0 & p_0 \end{pmatrix}$? Doesn't that require the three matrices $\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$, $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, $\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$ as building blocks (so the dimension equals 3)? In the second answer, why does $A$ have to be diagonal and not of the general form $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$? (2017-02-13)
  • @JohnZobolas, until you try to do the calculations yourself, you won't be convinced. Please go ahead and compute. (2017-02-13)
  • The only way I see that $\dim(C(A)) = 2$ in the first case is if I take as basis $\left\{ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \right\}$, but I don't understand, in the case of matrices, which basis I should prefer: the one with the minimum dimension? (2017-02-13)
  • Regarding the calculations (if we mean the same ones): I took the general form of the matrix $A$, namely $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, put that into the characteristic equation, used the Cayley–Hamilton theorem, and then just arrived at an equation that is trivially true (I didn't find that $b = c = 0$ as you suggested). I am sure I am missing something here based on your findings... (2017-02-13)
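The two facts disputed in the comments, namely that the centralizer of a non-scalar matrix with a repeated eigenvalue has dimension $2$ (not $3$), and that any matrix commuting with $\operatorname{diag}(a, b)$, $a \ne b$, must be diagonal, can be checked by direct enumeration. A small brute-force sketch (my addition, with arbitrary choices of $p = 5$ and $p = 7$ and sample matrices):

```python
from itertools import product

def matmul(X, Y, p):
    """Multiply two 2x2 matrices with entries taken mod p."""
    return tuple(tuple(sum(X[i][k] * Y[k][j] for k in range(2)) % p
                       for j in range(2)) for i in range(2))

def centralizer(A, p):
    """All 2x2 matrices over F_p commuting with A, by brute force."""
    return [B for e in product(range(p), repeat=4)
            for B in [((e[0], e[1]), (e[2], e[3]))]
            if matmul(A, B, p) == matmul(B, A, p)]

# Repeated eigenvalue, non-scalar: A = [[2, 1], [0, 2]] over F_5.
p1 = 5
jordan = centralizer(((2, 1), (0, 2)), p1)
print(len(jordan) == p1 ** 2)   # p^2 elements, so dimension 2, not 3
# Every commuting B is q0*I + q1*A: upper triangular, equal diagonal.
print(all(B[1][0] == 0 and B[0][0] == B[1][1] for B in jordan))

# Distinct eigenvalues: A = diag(1, 3) over F_7.
p2 = 7
diag = centralizer(((1, 0), (0, 3)), p2)
print(len(diag) == p2 ** 2)     # again dimension 2
print(all(B[0][1] == 0 and B[1][0] == 0 for B in diag))  # all diagonal
```

In the Jordan-block case the three matrices listed in the comment are not independent constraints: $B$ ranges only over $q_0 I + q_1 A$, a two-parameter family, so the dimension is $2$.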