
I am having problems with this linear algebra proof:

Let $ A $ be a square matrix of order $ n $ that has exactly one nonzero entry in each row and each column. Let $ D $ be the diagonal matrix whose $ i^{th} $ diagonal entry is the nonzero entry in the $i^{th}$ row of $A$.

For example:

$A = \begin{bmatrix}0 & 0 & a_1 & 0\\a_2 & 0 & 0 & 0\\0 & 0 & 0 & a_3 \\0 & a_4 & 0 & 0 \end{bmatrix} \quad $ $D = \begin{bmatrix}a_1 & 0 & 0 & 0\\0 & a_2 & 0 & 0\\0 & 0 & a_3 & 0\\0 & 0 & 0 & a_4 \end{bmatrix}$

A permutation matrix, $P$, is defined as a square matrix that has exactly one entry equal to $1$ in each row and each column, with all other entries $0$.

Please prove that:

  1. $ A = DP $ for a permutation matrix $ P $
  2. $ A^{-1} = A^{T}D^{-2} $
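As a numerical sanity check (not part of the proof), both claims can be verified with NumPy for the example matrix above, substituting arbitrary nonzero values for the $a_i$:

```python
import numpy as np

# Arbitrary nonzero values for a_1..a_4; any nonzero choices work.
a1, a2, a3, a4 = 2.0, 3.0, 5.0, 7.0

A = np.array([[0,  0,  a1, 0],
              [a2, 0,  0,  0],
              [0,  0,  0,  a3],
              [0,  a4, 0,  0]])
D = np.diag([a1, a2, a3, a4])

# P has a 1 exactly where A has its nonzero entry.
P = (A != 0).astype(float)

print(np.allclose(A, D @ P))            # claim 1: A = DP
print(np.allclose(np.linalg.inv(A),
                  A.T @ np.linalg.inv(D @ D)))  # claim 2: A^{-1} = A^T D^{-2}
```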

My attempt:

For 1, I tried post-multiplying $ D $ by elementary matrices to transform it into $ A $:

$$ A = D E_1 E_2 \cdots E_k $$

Since I am post-multiplying by elementary matrices, the effect is a column-wise operation on $D$. But I can't see how this rearranges the entries of $ D $ to form $A$, and I also cannot prove that the product of the elementary matrices is a permutation matrix.

For 2, my attempt is as follows (using a hint that $PP^{T} = I$):

$$ \begin{aligned} A^{T}D^{-2} &= (DP)^{T}D^{-2} \\ &= (P^{T})(D^{T})(D^{-1})(D^{-1}) \\ &= (P^{-1})(D^{T})(D^{-1})(D^{-1}) \end{aligned} $$

I am not sure how to complete the proof since I cannot get rid of the term $D^{T}$.

Could someone please advise me on how to solve this problem?

2 Answers


Hint: For $(1)$, find a matrix $P(i,j)$ that swaps columns $i$ and $j$. Your permutation matrix will be a product of $P(i,j)$'s.

For $(2)$, try to convince yourself that when $D$ is diagonal, $D^{T}=D$. It's not too hard!
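With those two facts ($D^{T} = D$ since $D$ is diagonal, and $P^{T} = P^{-1}$ since $PP^{T} = I$), the computation started in the question finishes as follows:

$$ \begin{aligned} A^{T}D^{-2} &= (DP)^{T}D^{-2} \\ &= P^{T} D D^{-1} D^{-1} \\ &= P^{-1} D^{-1} \\ &= (DP)^{-1} \\ &= A^{-1}. \end{aligned} $$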

  • How do I prove that the product of $P(i,j)$'s is still a permutation matrix? – 2017-02-09
  • The product of permutation matrices corresponds to the composition of permutations. The composition of permutations is a permutation. – 2017-02-09
  • So that means it is a known fact and I don't have to prove it? – 2017-02-09
  • I mean, that depends on your teacher/professor. That said, that the composition of permutations is itself a permutation *is* a pretty basic and well-known fact, and you should be able to see for yourself! Spare it a moment's thought and I'm sure you will. – 2017-02-09
  • Got it. Is it always possible to transform an arbitrary matrix $D$ into $A$ with a series of permutation matrices, given the definitions of $A$ and $D$ in the question? – 2017-02-09
  • Yes! – 2017-02-09

For part 1, you have the right idea. Try multiplying $A$ on the right by various permutation matrices, and just see what happens. For example, multiplying on the right by this matrix swaps columns 1 and 3. \begin{equation} \begin{bmatrix} 0 & 0 & a_1 & 0 \\ a_2 & 0 & 0 & 0 \\ 0 & 0 & 0 & a_3 \\ 0 & a_4 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} =\begin{bmatrix} a_1 & 0 & 0 & 0 \\ 0 & 0 & a_2 & 0 \\ 0 & 0 & 0 & a_3 \\ 0 & a_4 & 0 & 0 \end{bmatrix} \end{equation} You can find similar matrices to do more column swaps to get to $D$. You don't need to prove in general that these column-swap matrices are necessarily permutation matrices (although they are), since all this problem asks you to do is write down a matrix $P$.
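Concretely, here is a hedged NumPy sketch (with arbitrary nonzero values for the $a_i$) that composes the column swaps. Since each swap matrix is its own inverse, multiplying $A$ on the right by swaps until it equals $D$ immediately yields the $P$ with $A = DP$:

```python
import numpy as np

def swap(n, i, j):
    """n x n permutation matrix that swaps columns i and j (0-based)
    when used as a right factor."""
    S = np.eye(n)
    S[[i, j]] = S[[j, i]]
    return S

a1, a2, a3, a4 = 2.0, 3.0, 5.0, 7.0  # arbitrary nonzero values
A = np.array([[0,  0,  a1, 0],
              [a2, 0,  0,  0],
              [0,  0,  0,  a3],
              [0,  a4, 0,  0]])
D = np.diag([a1, a2, a3, a4])

# A * S(0,2) * S(1,2) * S(2,3) = D; each swap is its own inverse, so
# P = S(2,3) * S(1,2) * S(0,2) satisfies A = D P.
assert np.allclose(A @ swap(4, 0, 2) @ swap(4, 1, 2) @ swap(4, 2, 3), D)
P = swap(4, 2, 3) @ swap(4, 1, 2) @ swap(4, 0, 2)
print(np.allclose(A, D @ P))
```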