
Let $\varphi:\mathbb C^{2\times 2}\to\mathbb C$ be a function with the following properties:

  1. It is linear in the columns: $$\left\{\begin{aligned} \varphi\left(\left[\begin{matrix}a_1+a_2&b\\c_1+c_2&d\\\end{matrix}\right]\right) &= \varphi\left(\left[\begin{matrix}a_1&b\\c_1&d\\\end{matrix}\right]\right) + \varphi\left(\left[\begin{matrix}a_2&b\\c_2&d\\\end{matrix}\right]\right)\\ \varphi\left(\left[\begin{matrix}a&b_1+b_2\\c&d_1+d_2\\\end{matrix}\right]\right) &= \varphi\left(\left[\begin{matrix}a&b_1\\c&d_1\\\end{matrix}\right]\right) + \varphi\left(\left[\begin{matrix}a&b_2\\c&d_2\\\end{matrix}\right]\right)\\ \varphi\left(\left[\begin{matrix}ka&b\\kc&d\\\end{matrix}\right]\right) &= \varphi\left(\left[\begin{matrix}a&kb\\c&kd\\\end{matrix}\right]\right) = k\,\varphi\left(\left[\begin{matrix}a&b\\c&d\\\end{matrix}\right]\right) \end{aligned}\right.$$

  2. It is antisymmetric in the columns: $$\varphi\left(\left[\begin{matrix}b&a\\d&c\\\end{matrix}\right]\right) = -\,\varphi\left(\left[\begin{matrix}a&b\\c&d\\\end{matrix}\right]\right).$$

  3. It maps the identity to $1$: $$\varphi\left(\left[\begin{matrix}1&0\\0&1\\\end{matrix}\right]\right)=1.$$

I was told that, given these properties, $\varphi\small\left(\left[\begin{matrix}a&b\\c&d\\\end{matrix}\right]\right)$ has to be $ad-bc\,$, good old $\det\small\left[\begin{matrix}a&b\\c&d\\\end{matrix}\right]$, and that the same holds for any $\mathbb C^{n\times n}$. Is this true?
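As a quick sanity check (my own addition, not part of the question), one can verify symbolically with `sympy` that the usual $2\times 2$ determinant does satisfy all three properties:

```python
import sympy as sp

a, a1, a2, b, c, c1, c2, d, k = sp.symbols('a a1 a2 b c c1 c2 d k')

def det2(rows):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return sp.Matrix(rows).det()

# Property 1: additivity in the first column ...
additivity = sp.expand(
    det2([[a1 + a2, b], [c1 + c2, d]])
    - det2([[a1, b], [c1, d]]) - det2([[a2, b], [c2, d]])
)

# ... and homogeneity (pulling a scalar out of a column)
homogeneity = sp.expand(
    det2([[k*a, b], [k*c, d]]) - k * det2([[a, b], [c, d]])
)

# Property 2: swapping the columns flips the sign
antisymmetry = sp.expand(
    det2([[b, a], [d, c]]) + det2([[a, b], [c, d]])
)

# Property 3: the identity maps to 1
identity_value = det2([[1, 0], [0, 1]])

assert additivity == 0 and homogeneity == 0 and antisymmetry == 0
assert identity_value == 1
```

The question is whether the determinant is the *only* such function.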

2 Answers


Yes. You can see this by hand in the two-by-two case, though I think you kind of have to know what you're looking for to stumble across an argument.

First of all, convince yourself that $\varphi\left(\pmatrix{ka & la \\ kb & lb}\right) = 0$ for any $k, l, a, b$. That's because you can pull out the $kl$, and then $\varphi\left(\pmatrix{a & a \\ b & b}\right) = 0$: switching the two columns negates the value of $\varphi$ on one hand, and leaves the matrix unchanged on the other. This means, in particular, that if $ad - bc = 0$, then the columns are proportional, so $\varphi\left(\pmatrix{a & b \\ c & d}\right) = 0$ as well.
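This vanishing is easy to confirm for the determinant itself (a small `sympy` check of my own):

```python
import sympy as sp

a, b, k, l = sp.symbols('a b k l')

# Both columns are multiples of the same vector (a, b)
M = sp.Matrix([[k*a, l*a], [k*b, l*b]])

vanishing = sp.expand(M.det())  # k*a*l*b - l*a*k*b
assert vanishing == 0
```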

Now, let $X = \varphi\left(\pmatrix{a & b \\ c & d}\right)$. We want to use your properties to show that $X = ad - bc$ when $ad - bc \neq 0$ as well. We have

$$\begin{align*} (ad-bc) X &= da X + 0 - (-c)(-b) X + 0 \\ & = \varphi\left(\pmatrix{da & ab \\ dc & ad}\right) + \varphi\left(\pmatrix{ (-c)b & ab \\ (-c)d & ad}\right) \\& \qquad \qquad + \varphi\left(\pmatrix{(-c)b & (-b)a \\ (-c)d & (-b)c}\right) + \varphi\left(\pmatrix{da & (-b)a \\ dc & (-b)c}\right) \\ & = \varphi\left(\pmatrix{ad - bc & ab \\ 0 & ad}\right) + \varphi\left(\pmatrix{ad - bc & -ab \\ 0 & -bc}\right) \\ & = \varphi\left(\pmatrix{ad - bc & 0 \\ 0 & ad - bc}\right) \\ & = (ad - bc)^2. \end{align*}$$

So if $ad - bc \neq 0$, then $X = ad - bc$, as desired.
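The chain of equalities above can be checked mechanically. Here is a `sympy` verification (my own sketch, specializing $\varphi = \det$) that the four terms really add up to $(ad-bc)^2$:

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')

def det2(rows):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return sp.Matrix(rows).det()

# The four matrices from the expansion of (ad - bc) X
total = (det2([[d*a, a*b], [d*c, a*d]])        # = da * X
         + det2([[-c*b, a*b], [-c*d, a*d]])    # = 0 (proportional columns)
         + det2([[-c*b, -b*a], [-c*d, -b*c]])  # = -(-c)(-b) * X
         + det2([[d*a, -b*a], [d*c, -b*c]]))   # = 0 (proportional columns)

assert sp.expand(total) == sp.expand((a*d - b*c)**2)
```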

In general you can play the same game. Your first two properties tell you how $\varphi$ changes under column operations -- i.e., under multiplication by another matrix on the right. But multiplying by the adjugate matrix on the right gives you the determinant times the identity, whence you can deduce that $\varphi$ of a matrix is just its determinant.
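The adjugate identity underlying that argument, $A\,\operatorname{adj}(A) = \det(A)\,I$, can be sketched in the $2\times 2$ case like so (again with `sympy`; the bookkeeping for general $n$ is omitted):

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])

# Multiplying by the adjugate on the right is a sequence of column
# operations that turns A into det(A) times the identity matrix.
product = (A * A.adjugate()).expand()
assert product == (A.det() * sp.eye(2)).expand()
```

Since the first two properties prescribe exactly how $\varphi$ responds to those column operations, $\varphi(A)$ is pinned down to $\det(A)\,\varphi(I) = \det(A)$.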


Yes, and if you're only interested in the two-by-two case then you can quite easily see it by expanding your function $\varphi$ in terms of these properties. What I mean is, using only the first two properties, you should be able to reduce $\varphi\left(\left[\begin{matrix}a&b\\c&d\\\end{matrix}\right]\right)$ to a multiple of $\varphi\left(\left[\begin{matrix}1&0\\0&1\\\end{matrix}\right]\right)$, and then using property three you get the result.
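Spelling that reduction out (my own filling-in of the steps): expand each column over the standard basis using linearity, then use antisymmetry to kill the terms with repeated columns and to flip the swapped one:

$$\begin{aligned} \varphi\left(\left[\begin{matrix}a&b\\c&d\\\end{matrix}\right]\right) &= ab\,\varphi\left(\left[\begin{matrix}1&1\\0&0\\\end{matrix}\right]\right) + ad\,\varphi\left(\left[\begin{matrix}1&0\\0&1\\\end{matrix}\right]\right) + cb\,\varphi\left(\left[\begin{matrix}0&1\\1&0\\\end{matrix}\right]\right) + cd\,\varphi\left(\left[\begin{matrix}0&0\\1&1\\\end{matrix}\right]\right) \\ &= (ad-bc)\,\varphi\left(\left[\begin{matrix}1&0\\0&1\\\end{matrix}\right]\right), \end{aligned}$$

and property three turns the last expression into $ad-bc$.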

I hope that was clear enough.