4

Consider the set of invertible $N\times N$ matrices that satisfy the symmetry $\mathcal{H} = \{H\,|\, H_{ij}=H_{N+1-i,N+1-j},\ \det H \neq 0\}$, or in matrix form $\begin{pmatrix}a_{1} & a_{2} & \cdots & a_{N-1} & a_{N}\\ b_{1} & b_{2} & \cdots & b_{N-1} & b_{N}\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ b_{N} & b_{N-1} & \cdots & b_{2} & b_{1}\\ a_{N} & a_{N-1} & \cdots & a_{2} & a_{1} \end{pmatrix}$

Do these matrices have a name? Do they form a group?

It can be easily shown that the identity is in the group, i.e. $I\in \mathcal{H}$. Also the set is closed under multiplication, i.e. if $A\in \mathcal{H}$ and $B\in \mathcal{H}$ then $AB\in\mathcal{H}$. To see why, consider

$\begin{align*} (AB)_{mn}&=\sum_i A_{mi}B_{in}=\sum_i A_{N+1-m,N+1-i}B_{N+1-i, N+1-n}\\ &=\sum_{i^\prime} A_{N+1-m,i^\prime}B_{i^\prime, N+1-n}=(AB)_{N+1-m,N+1-n} \end{align*}$

The last property that needs to be shown for the set to be a group is that the inverse is also in the set, i.e. if $A\in\mathcal{H}$, is it true that $A^{-1}\in\mathcal{H}$?
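As a quick numerical sanity check of the closure and inverse properties (a sketch using NumPy; the helper names `is_centro` and `random_centro` are my own), one can test random matrices of this form:

```python
import numpy as np

def is_centro(H, tol=1e-9):
    # H_{ij} = H_{N+1-i,N+1-j} means H equals itself rotated by 180 degrees
    return np.allclose(H, np.flipud(np.fliplr(H)), atol=tol)

def random_centro(N, rng):
    # Symmetrize a random matrix under the 180-degree rotation;
    # such a matrix is invertible with probability 1
    M = rng.standard_normal((N, N))
    return (M + np.flipud(np.fliplr(M))) / 2

rng = np.random.default_rng(0)
A = random_centro(5, rng)
B = random_centro(5, rng)

print(is_centro(A @ B))             # closure under multiplication
print(is_centro(np.linalg.inv(A)))  # the inverse stays in the set
```

Both checks come back true on random examples, which is consistent with the answers below.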

EDIT: Added the condition that matrices should also be invertible.

  • Thanks. I edited the title. (2011-07-27)

3 Answers

7

These matrices are called centrosymmetric matrices (thanks to Michael Banaszek for pointing this out). A quick description of their structure:

The matrices you describe in your original question form what is called the centralizer of the involution $J=\begin{bmatrix} 0 & 0 & \cdots & 0 & 1 \\ 0 & 0 & \cdots & 1 & 0 \\ \vdots & \vdots & & \vdots & \vdots \\ 0 & 1 & \cdots & 0 & 0 \\ 1 & 0 & \cdots & 0 & 0 \end{bmatrix} \qquad J_{ij} = \begin{cases} 1 & \text{if } i=N+1-j \\ 0 & \text{otherwise} \end{cases} \qquad \mathcal{H} = \left\{ A : AJ = JA \right\}$

The centralizer is always an algebra: it is closed under scalar multiplication, addition, and matrix multiplication. An invertible matrix is contained in such an algebra if and only if its inverse is, since by the Cayley–Hamilton theorem $A^{-1}$ is a polynomial in $A$. In particular, the group of units of this algebra is exactly the group you are asking about.
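A short numerical sketch of the characterization (NumPy; the setup is my own): centrosymmetric means commuting with the exchange matrix $J$, and the inverse then commutes with $J$ as well.

```python
import numpy as np

N = 5
J = np.fliplr(np.eye(N))  # the exchange matrix: 1's on the anti-diagonal

# Build a random centrosymmetric matrix by averaging M with J M J
rng = np.random.default_rng(1)
M = rng.standard_normal((N, N))
A = (M + J @ M @ J) / 2

print(np.allclose(A @ J, J @ A))  # A commutes with J
print(np.allclose(np.linalg.inv(A) @ J, J @ np.linalg.inv(A)))  # so does A^{-1}
```

Since $J^2 = I$, the condition $JAJ = A$ used in the construction is equivalent to $AJ = JA$.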

In finite group theory, centralizers of involutions are quite important, and this particular involution is the permutation matrix of “the longest element of the Weyl group”, so it is reasonably important.

  • Addendum: the matrix $J$ in Jack's answer is often called the "exchange matrix". (2011-07-28)
4

Edit. This addresses the question as originally asked, without the invertibility condition.

$\left(\begin{array}{ccc} 0 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{array}\right)$ is of the desired form, and is not invertible.


For the invertible case: note that your set is closed under transposition and under multiplication by nonzero scalars. Since the inverse of $A$ is a scalar multiple of the adjugate, which is the transpose of the matrix of cofactors, your question comes down to determining whether the $(i,j)$-cofactor of a matrix of this form equals the $(N+1-i,N+1-j)$-cofactor.

Let $B_{ij}$ be the matrix obtained from $A$ by removing the $i$th row and the $j$th column. Since $(-1)^{(N+1-i)+(N+1-j)}=(-1)^{2N+2-i-j}=(-1)^{i+j}$, the cofactor signs agree, so we are really comparing $\det(B_{ij})$ with $\det(B_{N+1-i,N+1-j})$.
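The equality of these two minors can be checked numerically before proving it (a NumPy sketch; the helper `minor` and the test indices are my own choices):

```python
import numpy as np

def minor(A, i, j):
    # B_{ij}: delete row i and column j (1-based, as in the text)
    return np.delete(np.delete(A, i - 1, axis=0), j - 1, axis=1)

N = 5
rng = np.random.default_rng(2)
M = rng.standard_normal((N, N))
A = (M + np.flipud(np.fliplr(M))) / 2  # random centrosymmetric matrix

i, j = 2, 4
d1 = np.linalg.det(minor(A, i, j))
d2 = np.linalg.det(minor(A, N + 1 - i, N + 1 - j))
print(np.isclose(d1, d2))  # the two minors agree
```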

I claim that the $(r,s)$ entry of $B_{ij}$ equals the $(N-r,N-s)$ entry of $B_{N+1-i,N+1-j}$.

What is the $(r,s)$ entry of $B_{ij}$?

  • If $r\lt i$ and $s\lt j$, then it's $a_{r,s}$;
  • If $r\lt i$ and $s\geq j$, then it's $a_{r,s+1}$;
  • If $r\geq i$ and $s\lt j$, then it's $a_{r+1,s}$;
  • If $r\geq i$ and $s\geq j$, then it's $a_{r+1,s+1}$.

If $r\lt i$ and $s\lt j$, then the $(r,s)$ entry of $B_{ij}$ is $a_{rs}= a_{N+1-r,N+1-s}$. Since $(r,s)$ is above and to the left of $(i,j)$, the position $(N+1-r,N+1-s)$ is below and to the right of $(N+1-i,N+1-j)$, and hence the $(N-r,N-s)$ entry of $B_{N+1-i,N+1-j}$ is $a_{N+1-r,N+1-s}=a_{rs}$, as desired.

If $r\lt i$ and $s\geq j$, then the $(r,s)$ entry of $B_{ij}$ is $a_{r,s+1} = a_{N+1-r,N-s}$. Since $(r,s+1)$ is above and to the right of $(i,j)$, the position $(N+1-r,N-s)$ is below and to the left of $(N+1-i,N+1-j)$, so $a_{N+1-r,N-s}$ becomes the $(N-r,N-s)$ entry of $B_{N+1-i,N+1-j}$, as desired. A symmetric argument holds if $r\geq i$ and $s\lt j$.

If $r\geq i$ and $s\geq j$, then the $(r,s)$ entry of $B_{ij}$ is $a_{r+1,s+1} = a_{N-r,N-s}$. Since $(r+1,s+1)$ is below and to the right of $(i,j)$, the position $(N-r,N-s)$ is above and to the left of $(N+1-i,N+1-j)$, so the $(N-r,N-s)$ entry of $B_{N+1-i,N+1-j}$ is $a_{N-r,N-s}$, as desired.

So the question comes down to whether the determinant of an $N\times N$ matrix is invariant under the transformation that maps the $(i,j)$ entry to the $(N+1-i,N+1-j)$ entry. This transformation can be achieved through a series of row and column exchanges: exchange the first row with the last, the second row with the penultimate, the third row with the antepenultimate, and so on; then do the same with the columns. In the end we have performed $2\lfloor N/2\rfloor$ exchanges, an even number, and each exchange multiplies the determinant by $-1$. So the two matrices have the same determinant.
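This determinant invariance holds for an arbitrary matrix, not only a centrosymmetric one, and is easy to spot-check (a NumPy sketch with my own setup; reversing all rows and all columns is exactly the 180-degree rotation described above):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((6, 6))  # arbitrary; need not be centrosymmetric

# Reversing all rows and then all columns is an even number of exchanges,
# so the determinant is unchanged.
rotated = np.flipud(np.fliplr(M))
same_det = np.isclose(np.linalg.det(M), np.linalg.det(rotated))
print(same_det)
```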

Therefore, if $A\in\mathcal{H}$, then the $(i,j)$-cofactor of $A$ equals the $(N+1-i,N+1-j)$-cofactor of $A$; the cofactor matrix of $A$ lies in $\mathcal{H}$; hence the adjugate matrix of $A$ lies in $\mathcal{H}$; hence the inverse of $A$ lies in $\mathcal{H}$. Thus, $\mathcal{H}$ forms a group.

3

Edit: This answers the question as asked before the condition $\det A \neq 0$ was added.

Though matrices of this form satisfy the other two group axioms, they cannot form a group: consider any matrix of this form with determinant zero, the simplest example being the all-zero matrix. Such a matrix has no inverse at all, so its inverse certainly cannot be in $\mathcal{H}$.

  • Thanks for the lead. There is actually a paper dedicated to the inverses of such matrices: http://www.jstor.org/stable/1267339 (2011-07-27)