
When calculating determinants it can be nice to multiply a row by a number or to add one row to another (your basic row operations). Each has an easy-to-understand effect on the determinant. Today I ran across a type of row operation whose effect on the determinant is a lot less clear to me.

Take an $n \times n$ matrix $M$ and an $n \times n$ matrix $A$. While calculating the determinant of $M$, I might mess around with the rows of $M$. Every time I mess with the first row of $M$, I would like to be able to change my mind and mess with the second row of $M$ instead.

If I multiply the first row of $M$ by $2$ to get $M'$, or if I multiply the second row of $M$ by $2$ to get $M''$, I get the same determinant: $\det(M') = \det(M'')$. This does not depend on the matrix $M$; multiplying the first row by $2$ always has the same effect on the determinant as multiplying the second row by $2$.

For instance, if $M = \left(\begin{smallmatrix}a& b \\ c& d \end{smallmatrix}\right)$ then $M' = \left(\begin{smallmatrix}2a& 2b \\ c& d \end{smallmatrix}\right)$ and $M'' = \left(\begin{smallmatrix}a& b \\ 2c& 2d \end{smallmatrix}\right)$ and $\det(M') = \det(M'') = 2\det(M) = 2ad - 2bc$.
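For a quick sanity check, here is a minimal numerical sketch (assuming NumPy; the random $4 \times 4$ matrix is just an arbitrary example):

```python
import numpy as np

# Scaling the first row by 2 and scaling the second row by 2 change the
# determinant by the same factor, regardless of the matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))

M1 = M.copy(); M1[0] *= 2   # M': first row doubled
M2 = M.copy(); M2[1] *= 2   # M'': second row doubled

print(np.linalg.det(M1), np.linalg.det(M2), 2 * np.linalg.det(M))  # all (numerically) equal
```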

Can I do something similar with "$A$" instead of "$2$"?

If I multiply the first row of $M$ by $A$ to get $M'$, what matrix $B$ do I need to multiply the second row of $M$ by, to get an $M''$ with $\det(M') = \det(M'')$?

I would like the answer to depend only on $A$, not on $M$. I calculated an answer for $2 \times 2$ matrices, but I'm not happy with it.

For instance, if $A = \left(\begin{smallmatrix}2& 0 \\ 0& 1 \end{smallmatrix}\right)$ and $M = \left(\begin{smallmatrix}a& b \\ c& d \end{smallmatrix}\right)$, then $M' = \left(\begin{smallmatrix}2a& b \\ c& d \end{smallmatrix}\right)$ and $\det(M') = 2ad - bc$. I can choose $B = \left(\begin{smallmatrix}1& 0 \\ 0& 2 \end{smallmatrix}\right)$ so that $M'' = \left(\begin{smallmatrix}a& b \\ c& 2d \end{smallmatrix}\right)$ and $\det(M'') = 2ad - bc = \det(M')$.
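Here is the same example checked symbolically (a small sketch, assuming SymPy is available; rows are multiplied by the matrix on the right):

```python
import sympy as sp

# A = diag(2, 1) acting on the first row versus B = diag(1, 2) acting on the
# second row (row vectors times the matrix on the right).
a, b, c, d = sp.symbols('a b c d')
M = sp.Matrix([[a, b], [c, d]])
A = sp.diag(2, 1)
B = sp.diag(1, 2)

Mp  = M.copy(); Mp[0, :]  = M[0, :] * A   # M'
Mpp = M.copy(); Mpp[1, :] = M[1, :] * B   # M''
print(Mp.det(), Mpp.det())                # both 2*a*d - b*c
```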

However, I'd like to understand more clearly the relationship between $A$ and $B$. Perhaps there is a fair amount of freedom in choosing $B$ and I have chosen a bad one in my work. What is a natural choice of $B$ (for general $A$)?

  • @Arturo: I replace the first row of $M$ with $M_1 \cdot A$ (no transposes). If possible, I'd like to use row vectors. If you want to stick with column vectors, feel free to switch *all* my rows to columns, and replace the first column of $M$ by $A \cdot M_1$. (2011-03-14)

2 Answers


Okay, I think I see what you want, and I claim that it is impossible to find an answer in the general case.

If you consider the $n\times n$ matrix $M$ as a collection of $n$ row vectors, $m_1\ldots m_n$ in $\mathbb{R}^n$, computing the determinant is equivalent to computing the exterior product

$ m_1 \wedge m_2 \wedge \ldots \wedge m_n $

Your operation seems to be trying to compare the above to

$ (Am_1) \wedge m_2 \wedge \ldots \wedge m_n $

and you are asking whether there exists a linear transformation $B$ such that

$ (Am_1) \wedge \ldots \wedge m_n = m_1\wedge (Bm_2) \wedge m_3 \wedge \ldots \wedge m_n $

for all sets of vectors $\{m_1,\ldots,m_n\}$. I claim that this is not possible.


Fix a linearly independent set $\{ m_1, m_2, \ldots, m_n\}$ and let $m'_1 = m_1 + m_3$. (Assume $n \geq 3$.)

Consider a linear transformation $A$ with $Am_1 = m_2$ and $Am'_1 = m_1$. This set of linear conditions has a (possibly non-unique) solution, so such an $A$ exists.

Now, acting on the matrix $M$ corresponding to $\{m_1, \ldots, m_n\}$, the condition $Am_1 = m_2$ means that applying $A$ to $m_1$ makes the transformed matrix have determinant zero, whereas the original matrix, being full rank, has non-zero determinant. So if $B$ were a linear transformation to act on $m_2$ such that

$ 0 = (Am_1) \wedge m_2 \wedge \ldots \wedge m_n = m_1 \wedge (Bm_2) \wedge m_3 \wedge \ldots \wedge m_n $

one must have that $Bm_2$ is in the span of $m_1, m_3, \ldots m_n$. By construction, this implies that

$ m'_1 \wedge (Bm_2) \wedge m_3 \wedge \ldots \wedge m_n = 0 $

On the other hand, we have that

$ 0 \neq m_1 \wedge m_2 \wedge m_3 \wedge \ldots \wedge m_n = (Am'_1) \wedge m_2 \wedge m_3 \wedge \ldots \wedge m_n $

So the required equality fails for the matrix with rows $m'_1, m_2, \ldots, m_n$, and we have shown that in dimension $n \geq 3$ what you want to do is impossible (there are too many degrees of freedom).
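To make the obstruction concrete, here is a small symbolic check (a sketch, assuming SymPy; it instantiates the construction above with $m_i = e_i$ in $\mathbb{R}^3$ and one particular $A$, then asks a solver for any $B$ satisfying both required equalities):

```python
import sympy as sp

# Instantiate the construction for n = 3: m1, m2, m3 = e1, e2, e3 and m1' = m1 + m3.
e1, e2, e3 = sp.eye(3).col(0), sp.eye(3).col(1), sp.eye(3).col(2)
m1, m2, m3 = e1, e2, e3
m1p = m1 + m3

# One A with A m1 = m2 and A m1' = m1 (hence A e3 = e1 - e2); A e2 is chosen freely.
A = sp.Matrix.hstack(e2, e3, e1 - e2)       # columns are A e1, A e2, A e3

B = sp.Matrix(3, 3, sp.symbols('b0:9'))     # a would-be compensating matrix

def det_of_rows(r1, r2, r3):
    return sp.Matrix.vstack(r1.T, r2.T, r3.T).det()

# The identity must hold both for rows (m1, m2, m3) and for rows (m1', m2, m3).
eqs = [
    sp.Eq(det_of_rows(A * m1,  m2, m3), det_of_rows(m1,  B * m2, m3)),
    sp.Eq(det_of_rows(A * m1p, m2, m3), det_of_rows(m1p, B * m2, m3)),
]
print(sp.solve(eqs, list(B.free_symbols)))  # [] -- no B satisfies both
```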


For dimension 2, on the other hand, things are just right.

$\det M = M_{11} M_{22} - M_{12} M_{21} $

Acting on the first row by $A$, you have

$ \det M' = M_{11}A_{11}M_{22} + M_{12} A_{21} M_{22} - M_{12}A_{22} M_{21} - M_{11} A_{12} M_{21} $

which we regroup as

$ \det M' = M_{11} ( A_{11} M_{22} - A_{12} M_{21} ) - M_{12} (A_{22} M_{21} - A_{21} M_{22} ) $

which you can solve to get

$ B = \begin{pmatrix} A_{22} & - A_{12} \\ - A_{21} & A_{11} \end{pmatrix} $

or that $B$ is the adjugate matrix of $A$.

(Here we actually geometrically used the fact that the Hodge star operator $*$ sends vectors to vectors in 2 dimensions, and so $B = *A^T*$ is what we want.)
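A quick symbolic verification of the two-dimensional claim (a sketch, assuming SymPy; it uses the questioner's row-vector convention, multiplying each row by its matrix on the right):

```python
import sympy as sp

# Replace row 1 of M by m1*A, and row 2 by m2*adj(A); the determinants agree
# identically in the entries of M and A.
a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix(2, 2, sp.symbols('A11 A12 A21 A22'))
M = sp.Matrix([[a, b], [c, d]])
B = A.adjugate()                            # [[A22, -A12], [-A21, A11]]

Mp  = M.copy(); Mp[0, :]  = M[0, :] * A     # M'
Mpp = M.copy(); Mpp[1, :] = M[1, :] * B     # M''
print(sp.simplify(Mp.det() - Mpp.det()))    # 0
```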

  • @Jack: "does that wedge/dot/Hodge thing work in general?" Yes. Except on a general inner product space, you'll have to work over its exterior algebra, which means dealing with linear maps of alternating $k$-vectors instead of just linear maps of vectors. (2011-03-14)

The case of only two rows is special. I don't think there is an answer independent of $M$ when $M$ has more than two rows.

If the first row of $M$ happens to be zero, it doesn't matter what we do; the modified $M$ will still have determinant zero.

Multiplying the first row of $M$ by $A$ (from the right) amounts to an arbitrary substitution of one row for another. If we assume $A$ nonsingular, which is essential even to your two-row solution, then we are only restricting ourselves to replacing one nonzero row by any other.

What you are asking is therefore to replace the first row of $M$ (assumed nonzero) by another nonzero row, and compensate for the effect on $\det M$ by doing something to the second row of $M$, without benefit of knowing anything about the rest of the matrix $M$.

All that matrix $A$ gives us is the replacement for the first row of $M$. All that matrix $B$ defines for us is the replacement for the second row of $M$.

In the $2 \times 2$ case we can take $B$ to be $A$ scaled by the reciprocal of $\det A$. If $M$ has three or more rows, there is no choice of $B$ that would work for all matrices $M$.

An easy way to show this is to construct nonsingular $A$ and $M$ such that replacing the first row of $M$ by its product with $A$ produces a singular result regardless of what is done to the second row of $M$. For example, take $A$ to be any nonsingular matrix other than a multiple of the identity matrix, so that for some row vector $u$, $u$ and $uA$ will be linearly independent. Then take $v$ to be any row vector linearly independent of $u$ and $uA$, and define $M$ to be the three rows $u$, $v$, and $uA$.

Then if you perform your operation on the first two rows of $M$, regardless of the choice of $B$, the result will be a singular matrix $M''$, since the first and third rows of $M''$ are equal. Hence $\det M''$ cannot equal $\det M$, since we constructed a nonsingular $M$.
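A concrete numerical instance of this construction (a sketch, assuming NumPy; the particular $A$, $u$, and $v$ below are just one choice satisfying the conditions):

```python
import numpy as np

# A nonsingular and not a multiple of I; u with u and uA independent;
# v independent of both; M has rows u, v, uA.
A = np.array([[1., 1., 0.],
              [0., 1., 0.],
              [0., 0., 1.]])
u = np.array([1., 0., 0.])
uA = u @ A                                  # (1, 1, 0)
v = np.array([0., 0., 1.])
M = np.vstack([u, v, uA])
print(np.linalg.det(M))                     # nonzero: M is nonsingular

# Replace row 1 by uA and row 2 by v@B for an arbitrary B: rows 1 and 3 coincide.
B = np.random.default_rng(1).standard_normal((3, 3))
Mpp = np.vstack([uA, v @ B, uA])
print(np.linalg.det(Mpp))                   # 0 (up to rounding), for every B
```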

Added: In the $2 \times 2$ case with $A$ nonsingular, we know that multiplying both rows of $M$ by $A$ amounts to forming the matrix product $MA$, whose determinant is $(\det A)(\det M)$. So if we use $B = (1/\det A)\,A$, it amounts to multiplying $M$ on the right by $A$ and then factoring out $\det A$ from the second row, which gets us back to a $2 \times 2$ matrix with the same determinant as before.
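A symbolic check of this remark, under that reading (comparing $\det M''$ with $\det M$; a sketch, assuming SymPy):

```python
import sympy as sp

# Replace row 1 by m1*A and row 2 by m2*(A/det A): the determinant of the
# result equals det M.
a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix(2, 2, sp.symbols('A11 A12 A21 A22'))
M = sp.Matrix([[a, b], [c, d]])

Mpp = sp.Matrix.vstack(M[0, :] * A, M[1, :] * (A / A.det()))
print(sp.simplify(Mpp.det() - M.det()))     # 0
```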

Correcting my misunderstanding: given a matrix $A$, Jack wants a matrix $B$ such that for any matrix $M = \left( \begin{array}{c} u \\ v \end{array} \right)$, with

$ M' = \left( \begin{array}{c} uA \\ v \end{array} \right) $

$ M'' = \left( \begin{array}{c} u \\ vB \end{array} \right) $

it is true that $\det M' = \det M''$. When my analysis is applied "backwards" to go from $M'$ to $M$ to $M''$, the result is the same as Willie Wong's, namely that $B$ is the adjugate of $A$. For if we apply my recipe, changing $A$ to $A^{-1}$ for going from $M'$ to $M$, then:

$ B = \frac{1}{\det A^{-1}}\, A^{-1} = (\det A)\, A^{-1} = \operatorname{adj}(A), $

which, as Willie pointed out, is defined even for singular $A$.
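The identity in the last display can be checked directly for invertible $A$ (a sketch, assuming SymPy):

```python
import sympy as sp

# (1/det(A^{-1})) * A^{-1} equals the adjugate of A, for invertible A.
A = sp.Matrix(2, 2, sp.symbols('A11 A12 A21 A22'))
lhs = (1 / A.inv().det()) * A.inv()
print(sp.simplify(lhs - A.adjugate()))      # zero matrix
```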

  • @Jack Schmidt: That explains it. I thought $\det M = \det M''$ was the goal, while you want $\det M' = \det M''$. The discussion in terms of $A^{-1}$ rather than $A$ would clear things up, since the relationship of $M'$ to $M$ is that the first row of $M$ is the first row of $M'$ multiplied by $A^{-1}$. (2011-03-14)