
Let $K$ be a field, suppose that $D\colon M_{n\times n}(K) \to K$ is a function such that $D(AB)=D(A)\cdot D(B)$ and $D(I) \neq D(0)$, where $0$ is the zero matrix. Show that if $\operatorname{rank}(A) < n$, then $D(A)=0$.

My attempt so far: first, from $D(I)=D(I)^2$ and $D(0)=D(0)^2$ together with $D(0)\neq D(I)$, I can show $D(0)=0$ and $D(I)=1$. Then I want to show $D(I_k)=0$, where $I_k$ is the $n\times n$ diagonal matrix with $k$ diagonal entries equal to $1$ and the rest $0$. Then $D(A)=D(P^{-1}I_kP)=0$. However, I fail to prove $D(I_k)=0$.
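For completeness, the first reduction can be written out in full (it is just the idempotence argument spelled out):

```latex
\begin{align*}
D(I) &= D(I\cdot I) = D(I)^2 &&\Longrightarrow\ D(I)\in\{0,1\};\\
D(A) &= D(AI) = D(A)D(I)   &&\text{so } D(I)=0 \text{ would force } D\equiv D(0),
                             \text{ contradicting } D(I)\neq D(0),\ \text{hence } D(I)=1;\\
D(0) &= D(0\cdot 0) = D(0)^2 &&\Longrightarrow\ D(0)\in\{0,1\},\ \text{and } D(0)\neq D(I)=1
                             \text{ gives } D(0)=0.
\end{align*}
```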

Any suggestions?

Thanks a lot

  • Writing $D(A)=D(P^{-1}I_kP)$ suggests that you write $A=P^{-1}I_kP$, hence in particular $A$ is diagonalizable, which need not be true (for example, $A=\begin{pmatrix}1&1\\0&1\end{pmatrix}$). Maybe you used another argument that you should specify. (2012-04-28)

3 Answers

2

Since $I_k^2=I_k$, we have $D(I_k)=D(I_k)^2$, so $D(I_k)$ is either $1$ or $0$. Note also that it is enough to show that $D(I_{n-1})=0$: for $k\leq n-1$ we have $I_kI_{n-1}=I_k$, so $D(I_k)=D(I_kI_{n-1})=D(I_k)D(I_{n-1})$.

So suppose that $D(I_{n-1})=1$. For convenience I will write $I_{n-1}=\sum_{j=1}^{n-1}E_{jj}$, where $\{E_{kj}\}$ are the canonical matrix units. First note that similar matrices take the same value under $D$: if $P$ is invertible, then $D(P)D(P^{-1})=D(I)=1$, so $D(P^{-1}AP)=D(P^{-1})D(A)D(P)=D(A)$. Now $\sum_{j=2}^nE_{jj}$ is obtained from $I_{n-1}$ by conjugating with a permutation matrix, so $D(\sum_{j=2}^nE_{jj})=1$. But then
$$D\Big(\sum_{j=2}^{n-1}E_{jj}\Big)=D\Big(\sum_{j=1}^{n-1}E_{jj}\sum_{j=2}^{n}E_{jj}\Big)=D\Big(\sum_{j=1}^{n-1}E_{jj}\Big)D\Big(\sum_{j=2}^{n}E_{jj}\Big)=1.$$
As $\sum_{j=2}^{n-1}E_{jj}$ is similar to $I_{n-2}$, we conclude that $D(I_{n-2})=1$. Repeating the argument, we eventually get $D(E_{jj})=1$ for all $j$. But then
$$0=D(0)=D(I_{n-1}E_{nn})=D(I_{n-1})D(E_{nn})=1,$$
a contradiction. So $D(I_{n-1})=0$, and we are done.
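As a sanity check, here is the $n=3$ instance of the two matrix identities the argument rests on, computed in plain Python (the helper names are mine, not from the answer):

```python
# Verify, for n = 3, the products used in the descent argument above:
#   I_{n-1} * (E_22 + E_33) = E_22  (similar to I_{n-2}), and
#   I_{n-1} * E_33 = 0              (the final contradiction).

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def diag(*entries):
    n = len(entries)
    return [[entries[i] if i == j else 0 for j in range(n)] for i in range(n)]

I2  = diag(1, 1, 0)   # I_{n-1} = E_11 + E_22
S23 = diag(0, 1, 1)   # E_22 + E_33, a permutation conjugate of I2
E33 = diag(0, 0, 1)

print(matmul(I2, S23))   # E_22, which is similar to I_{n-2} = I_1
print(matmul(I2, E33))   # the zero matrix
```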

1

There is a theorem (valid over any field, not just $\mathbb{R}$) which says that for any matrix $A$ of rank $k$ there exist invertible matrices $P,Q$ such that $I_k=PAQ$ (where $P,Q$ are products of elementary row and column operation matrices).

If $P$ is invertible, then $D(P)D(P^{-1})=D(PP^{-1})=D(I)=1$, so $D(P)\neq 0$. Since $A=P^{-1}I_kQ^{-1}$ gives $D(A)=D(P^{-1})D(I_k)D(Q^{-1})$ with $D(P^{-1}),D(Q^{-1})\neq 0$, you can indeed reduce the problem to proving that $D(I_k)=0$.

First, $D(I_1)=0$: there exist invertible matrices $P,Q$ such that $PI_1Q$ has only the entry in position $(2,2)$ equal to $1$ and the rest zero, and then $I_1\cdot(PI_1Q)=0$. If $D(I_1)\neq 0$, then $0=D(0)=D(I_1\cdot PI_1Q)=D(I_1)^2D(P)D(Q)\neq 0$, since $D(P),D(Q)\neq 0$. Contradiction.

Suppose now that $D(I_k)\neq 0$. Then $D(I_k)=1$, since $D(I_k)=(D(I_k))^2$. If $k<n$, you can pick $P,Q$ invertible such that $PI_kQ$ has the diagonal entries in positions $2,\ldots,k+1$ equal to $1$ and the rest $0$. The product $I_k\cdot PI_kQ$ then has exactly $k-1$ diagonal entries equal to $1$ (those in positions $2,\ldots,k$), so it equals $P'I_{k-1}Q'$ for suitable invertible $P',Q'$. Hence $0\neq D(I_k)D(P)D(I_k)D(Q)=D(P')D(I_{k-1})D(Q')$, which forces $D(I_{k-1})\neq 0$. Inductively you reach $D(I_1)\neq 0$, which is a contradiction.

It's not a very pretty solution, but I guess it works.
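To make the first step concrete: permutation matrices already suffice as the $P,Q$ that move the single $1$ of $I_1$ from position $(1,1)$ to $(2,2)$. A quick check for $n=3$ (helper names are ad hoc):

```python
# P swaps rows 1 and 2, Q swaps columns 1 and 2; then P*I_1*Q = E_22,
# and I_1 * (P*I_1*Q) = E_11 * E_22 = 0, the product used in the proof.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I1 = [[1, 0, 0], [0, 0, 0], [0, 0, 0]]   # I_1 = E_11
P  = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]   # row swap (invertible)
Q  = P                                    # column swap (invertible)

PI1Q = matmul(matmul(P, I1), Q)
print(PI1Q)                # E_22: single 1 in position (2,2)
print(matmul(I1, PI1Q))    # the zero matrix
```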

0

I will use the result mentioned by Beni Bogossel: for a matrix $A$ of rank $r$ we can find invertible matrices $P$ and $Q$ such that $A=PA'Q$, where $A'$ is a diagonal matrix with $r$ ones and $n-r$ zeros on the diagonal.

For $S\subset\{1,\ldots,n\}$, denote by $M_S$ the diagonal matrix with $M_S(i,i)=1$ if $i\in S$ and $0$ otherwise; note that $M_SM_T=M_{S\cap T}$, and that $D(M_S)\in\{0,1\}$ since $M_S$ is idempotent. If $S_1$ and $S_2$ are two subsets of $\{1,\ldots,n\}$ with the same cardinality, then $M_{S_1}$ and $M_{S_2}$ are similar via a permutation matrix $P$, and $D(M_{S_2})=D(P^{-1}M_{S_1}P)=D(M_{S_1})$, because $D(P^{-1})D(P)=D(I)=1$. In particular, if $S$ and $S'$ are disjoint with the same cardinality, then $0=D(M_SM_{S'})=D(M_S)D(M_{S'})=D(M_S)^2$, so $D(M_S)=0$; this already handles $|S|=1$ when $n\geq 2$.

For a general proper subset a disjoint copy need not exist, so instead let $S\subsetneq\{1,\ldots,n\}$ and let $\sigma$ be the cyclic shift $i\mapsto i+1 \pmod n$. The sets $S,\sigma(S),\ldots,\sigma^{n-1}(S)$ all have the same cardinality, so $D(M_{\sigma^i(S)})=D(M_S)$ for every $i$; and since $S$ is proper, $\bigcap_{i=0}^{n-1}\sigma^i(S)=\emptyset$. Hence $M_SM_{\sigma(S)}\cdots M_{\sigma^{n-1}(S)}=M_\emptyset=0$, so $D(M_S)^n=D(0)=0$ and therefore $D(M_S)=0$, and we are done (without induction).
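The two set-theoretic facts this argument leans on ($M_SM_T=M_{S\cap T}$, and the product over all cyclic shifts of a proper subset vanishing) are easy to check numerically; here is the $n=4$ case in plain Python (names are ad hoc for the check):

```python
# M(S) is the 0/1 diagonal matrix supported on S; such matrices
# multiply like set intersections: M(S) * M(T) == M(S & T).

n = 4

def M(S):
    return [[1 if (i == j and i in S) else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

S, T = {0, 1, 2}, {1, 2, 3}
assert matmul(M(S), M(T)) == M(S & T)   # M_S M_T = M_{S∩T}

# Product of M over all cyclic shifts of a *proper* subset S is zero,
# since the intersection of all shifts of a proper subset is empty.
prod = M(S)
for i in range(1, n):
    shifted = {(x + i) % n for x in S}
    prod = matmul(prod, M(shifted))
assert prod == M(set())                  # the zero matrix
print("identities verified")
```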