
This is a question from the free Harvard online abstract algebra lectures. I'm posting my solutions here to get some feedback on them. For a fuller explanation, see this post.

This problem is from assignment 4.

Prove that the transpose of a permutation matrix $P$ is its inverse.

A permutation matrix $P$ has a single 1 in each row and a single 1 in each column, all other entries being 0. So the 1 in column $j$ sits in some row $i_j$, i.e., at entry $(i_j, j)$. Left multiplication by $P$ moves row $j$ to row $i_j$ for each $j$. Taking the transpose of $P$ moves each 1 from entry $(i_j, j)$ to entry $(j, i_j)$, so $P^t$ acts by moving row $i_j$ back to row $j$. Since this is the inverse operation, $P^t=P^{-1}$.
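As a quick numerical sanity check of the argument above, here is a minimal NumPy sketch; the permutation $j \mapsto i_j$ is an arbitrary example:

```python
import numpy as np

# Arbitrary example permutation: i_of[j] = i_j, so column j has its 1 in row i_j.
i_of = [2, 0, 1]
n = len(i_of)

P = np.zeros((n, n), dtype=int)
for j, i_j in enumerate(i_of):
    P[i_j, j] = 1                    # single 1 at entry (i_j, j)

A = np.arange(n * n).reshape(n, n)   # any matrix to act on
moved = P @ A                        # row j of A ends up as row i_j
restored = P.T @ moved               # P^T moves row i_j back to row j

assert np.array_equal(restored, A)
assert np.array_equal(P.T @ P, np.eye(n, dtype=int))
```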

Again, I welcome any critique of my reasoning and/or my style as well as alternative solutions to the problem.

Thanks.

  • @jobrien929: I suspect that trying to write it out carefully would just lead precisely to my suggestion of considering transpositions or products of transpositions. Otherwise, keeping track of all the row shuffles is going to be a pain. *Clarification*: you may want to show only that if $\tau$ is a *transposition* and $\sigma$ a *permutation*, then $P_{\tau}P_{\sigma} = P_{\tau\sigma}$, rather than trying to prove it for any two permutations. That should be enough.

5 Answers


A direct computation is also fine: $(PP^T)_{ij} = \sum_{k=1}^n P_{ik} P^T_{kj} = \sum_{k=1}^n P_{ik} P_{jk}$, but $P_{ik}$ is usually 0, and so $P_{ik} P_{jk}$ is usually 0. The only time $P_{ik}$ is nonzero is when it is 1, and then there is no other $i' \neq i$ such that $P_{i'k}$ is nonzero ($i$ is the only row with a 1 in column $k$). In other words, $\sum_{k=1}^n P_{ik} P_{jk} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{otherwise} \end{cases}$ and this is exactly the formula for the entries of the identity matrix, so $PP^T = I$.
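The entrywise sum in this answer can be checked directly in NumPy; the sketch below computes $(PP^T)_{ij} = \sum_k P_{ik}P_{jk}$ term by term for a randomly chosen permutation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
P = np.eye(n, dtype=int)[rng.permutation(n)]   # a random permutation matrix

# Compute (P P^T)_{ij} = sum_k P_{ik} P_{jk} entry by entry.
PPt = np.zeros((n, n), dtype=int)
for i in range(n):
    for j in range(n):
        PPt[i, j] = sum(P[i, k] * P[j, k] for k in range(n))

# Each diagonal entry is 1, each off-diagonal entry is 0.
assert np.array_equal(PPt, np.eye(n, dtype=int))
```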

  • Funny that we independently came up with almost identical answers. Since it seems you beat me to it, I can delete mine if you want.

Another way to prove it is to realize that any permutation matrix is a product of elementary permutation matrices, where by elementary I mean one that swaps two rows. Since swapping rows $i$ and $j$ of the identity matrix gives the same result as swapping columns $i$ and $j$, such a matrix is symmetric, and it clearly coincides with its inverse. Then, assuming $P=P_1\cdots P_k$, with $P_1,\ldots,P_k$ elementary, we have

$ P^{-1} = (P_1\cdots P_k)^{-1}=P_k^{-1}\cdots P_1^{-1}=P_k\cdots P_1=P_k^t\cdots P_1^t = (P_1\cdots P_k)^t=P^t $
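This factorization argument can be illustrated with a short NumPy sketch; the particular sequence of swaps below is an arbitrary example:

```python
import numpy as np

def transposition(n, i, j):
    """Elementary permutation matrix: the identity with rows i and j swapped."""
    T = np.eye(n, dtype=int)
    T[[i, j]] = T[[j, i]]
    return T

n = 4
swaps = [(0, 2), (1, 3), (2, 3)]    # an arbitrary sequence of swaps
factors = [transposition(n, i, j) for i, j in swaps]

# Each elementary factor is symmetric and its own inverse.
for T in factors:
    assert np.array_equal(T, T.T)
    assert np.array_equal(T @ T, np.eye(n, dtype=int))

# Hence the product P = P_1 P_2 P_3 satisfies P^{-1} = P^T.
P = factors[0] @ factors[1] @ factors[2]
assert np.allclose(np.linalg.inv(P), P.T)
```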

  • That's exercise 3.9.4 in Matrix Analysis (http://matrixanalysis.com). An elementary permutation matrix (Type I) has the form $P = I - uu^{T}$ (with $u = e_i - e_j$), so you can show that $P = P^{-1} = P^{T}$, and then apply the logic from this answer.

Using a little knowledge about orthogonal matrices, the following proof is pretty simple:

Since $v^tw=\sum_{i=1}^n v_iw_i$ for $v=(v_1,\ldots,v_n)$ and $w=(w_1,\ldots,w_n)$, we have $v^tv=1$ whenever $v$ is a column of $P$. On the other hand, $v^tw=0$ if $v$ and $w$ are two distinct columns of $P$. Therefore we can conclude that $(P^tP)_{i,j}=\delta_{i,j}$, and so $P^t=P^{-1}$.
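The orthonormality of the columns claimed here is easy to check numerically; a minimal sketch, with the column order chosen arbitrarily:

```python
import numpy as np

n = 4
P = np.eye(n)[:, [2, 0, 3, 1]]      # columns of I in an arbitrary order

# v^t v = 1 for each column v, and v^t w = 0 for distinct columns v, w.
cols = [P[:, j] for j in range(n)]
for a in range(n):
    for b in range(n):
        assert cols[a] @ cols[b] == (1.0 if a == b else 0.0)

# Equivalently, (P^t P)_{ij} = delta_{ij}.
assert np.array_equal(P.T @ P, np.eye(n))
```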


A less sophisticated approach: you could just crunch it out.

First, a lemma:

The inverse of a matrix, if it exists, is unique.

Proof: If both $B$ and $C$ are inverse to $A$, then we have $B = BI = B(AC) = (BA)C = IC = C$, so $B = C$. (Here, $I$ denotes the identity matrix.)

Using this, it follows in our specific case that in order to show $A^T = A^{-1}$, we need only show $A^TA = AA^T = I$.

Assume $i\neq j$. Then $(AA^T)_{ij} = \sum_k A_{ik}A^T_{kj} = \sum_k A_{ik}A_{jk}$. But for each $k$, $A_{ik}A_{jk} = 0$ since there is only one nonzero entry in the $k$th column and $i\neq j$ (so $A_{ik}$ and $A_{jk}$ can't both be the nonzero entry). So, $(AA^T)_{ij} = 0$ when $i\neq j$.

The argument that $(A^TA)_{ij} = 0$ when $i\neq j$ is almost identical, but uses the fact that the rows of $A$ contain only one nonzero entry.

Can you see what happens when, instead, $i = j$?

  • Looks good to me too. A survey of style is probably good for this kind of question.

Let $\pi$ be a permutation on $n$ objects, written

$\begin{equation} \pi=\left(\begin{matrix} 1 & 2 &\ldots& n \\ \pi(1) & \pi(2) &\ldots& \pi(n) \end{matrix} \right) \end{equation}$

Let $P_\pi$ be the corresponding permutation matrix. We need to prove that $P_\pi^T P_\pi=I$.

Note that $\pi$ sends the $i$th row of the identity matrix to the $\pi(i)$th row, i.e.,

$\begin{eqnarray*} P_\pi=[P_{ij}]=\left\{ \begin{array}{ll} 1; & i=\pi(j)\\ 0; & i \ne \pi(j). \end{array} \right. \end{eqnarray*}$

The $ij$th component of $P_\pi^TP_\pi$ is

$\begin{eqnarray} (P_\pi^TP_\pi)_{ij}&=&\sum_{k=1}^n P^T_{ik}P_{kj}\\ &=&\sum_{k=1}^n P_{ki}P_{kj}\\ &=& P_{\pi(j)i}P_{\pi(j)j}\\ &=& P_{\pi(j)i}=\left\{ \begin{array}{ll} 1; & i=j\\ 0; & i \ne j. \end{array} \right. \end{eqnarray}$

Therefore, $P^T_\pi P_\pi=I$.
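This construction translates directly into NumPy; the sketch below builds $P_\pi$ from the rule $P_{ij}=1$ iff $i=\pi(j)$ for an arbitrary example permutation (0-indexed):

```python
import numpy as np

pi = [1, 2, 0]                 # arbitrary example: pi(0)=1, pi(1)=2, pi(2)=0
n = len(pi)

# P_{ij} = 1 if i = pi(j), else 0 -- exactly the definition above.
P = np.zeros((n, n), dtype=int)
for j in range(n):
    P[pi[j], j] = 1

# Each column j has its single 1 in row pi(j), so P^T P = I.
assert np.array_equal(P.T @ P, np.eye(n, dtype=int))
```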
