
I have a matrix which is of odd order and has exactly two ones in each row and column. The rest of the entries in each row/column are all zero. What will be the determinant of this matrix?

I believe the answer is $\pm 2$. My question is how this is derived.

Any help will be appreciated. Thanks.

4 Answers


As described by examples in J.M.'s answer and Paul's comment to Martin's answer, it is possible to get zero and $\pm2$. But that is not all of it. Consider matrices of the form (thanks to Martin for pointing out that I need to use an odd size) $ A=\pmatrix{P&0&0\cr0&P&0\cr0&0&P\cr}, $ where $P$ is the matrix from Paul's comment $ P=\pmatrix{1&1&0\cr1&0&1\cr0&1&1\cr}. $ Obviously $\det A=-8$, and it is clear that we can get any odd power of two in this way. I believe this covers all the possibilities: $0,\pm2^k$ for some odd natural number $k>0$.
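For anyone who wants to verify the block construction numerically, here is a small Python/NumPy sketch (my own illustration, not part of the argument above); it stacks $t$ copies of $P$ on the diagonal with np.kron and confirms that the determinant is $(-2)^t$:

```python
import numpy as np

# Paul's 3x3 block P, with det(P) = -2.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]])

for t in (1, 3, 5):                              # an odd number of blocks keeps the size odd
    A = np.kron(np.eye(t, dtype=int), P)         # block-diagonal matrix with t copies of P
    print(3 * t, int(round(np.linalg.det(A))))   # prints (3, -2), (9, -8), (15, -32)
```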

Edit: Here's a proof that $0$ and $\pm 2^{2t+1}$, $t\in\mathbf{N}$, are all the possibilities. Do the following: pick one of the 1s on row number $i_1$ and look at the column of the other 1 on that row. Let the other 1 in that column be on row $i_2$. Keep repeating this horizontal-vertical motion to define further rows $i_3,\ldots$. Sooner or later you get back to row number $i_1$. We have thus identified a subset of rows $i_1,\ldots,i_k$ for some integer $k>1$. By reordering the rows, we can bring these to the top. By reordering the columns we get a block looking exactly like the matrices in J.M.'s answer in the top-left corner. That block has determinant $2$ or $0$, according to the parity of $k$, as described by J.M.

We may or may not have exhausted all the rows of the matrix. If we have not covered all of it, we build another block in a similar fashion. If some block has an even number of rows, the determinant is $0$; if every block has an odd number of rows, then, since the total number of rows is odd, the number of blocks is odd, and the determinant is $\pm2$ raised to that odd number. The parity of the necessary row/column permutations gives us the sign.
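The walk described above is easy to carry out by machine. The sketch below (an illustration only; the function name and structure are mine) traces the horizontal-vertical motion, records the block sizes, and predicts $|\det|$ as $0$ if some block has even size and $2^{\text{number of blocks}}$ otherwise:

```python
import numpy as np

def predicted_abs_det(A):
    """Trace the horizontal-vertical walk on a 0-1 matrix that has exactly
    two 1s in every row and column, and return the predicted |det|."""
    n = A.shape[0]
    unvisited = set(range(n))
    block_sizes = []
    while unvisited:
        start = unvisited.pop()
        row = start
        col = np.flatnonzero(A[start])[0]          # leave the row through one of its 1s
        size = 1
        while True:
            r0, r1 = np.flatnonzero(A[:, col])     # the two rows meeting this column
            nxt = r1 if r0 == row else r0          # the *other* row with a 1 in the column
            if nxt == start:                       # the walk has closed up
                break
            unvisited.remove(nxt)
            size += 1
            c0, c1 = np.flatnonzero(A[nxt])        # leave the new row through its other 1
            col = c1 if c0 == col else c0
            row = nxt
        block_sizes.append(size)
    if any(k % 2 == 0 for k in block_sizes):       # an even block contributes determinant 0
        return 0
    return 2 ** len(block_sizes)                   # otherwise each odd block contributes +-2

# Example: the 5x5 "cycle" matrix from the other answers has a single block of size 5.
A = np.array([[1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1]])
print(predicted_abs_det(A), abs(int(round(np.linalg.det(A)))))   # 2 2
```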

  • David, J.M., thanks for fixing my typos and punctuation, guys! (2011-12-30)

NOTE: I originally thought that the possible values are $0$ or $\pm2$ and I've tried to correct my approach after seeing Jyrki's answer and comments. I do not claim that my answer is substantially different from his, but I decided to keep it, since it still might be useful for some users.

The determinant can be $0$ or $\pm2^k$. We will show this for all dimensions (not only odd ones).


We can show this by induction on $n$ for any $n\times n$ matrix.

For $n=1,2,3$: By inspection.

Inductive step. Suppose that the claim is true for smaller matrices and that we work with an $(n+1)\times(n+1)$ matrix of this form.

The first row $\vec a_1$ contains two ones. There are two possibilities.

a) There is a row $\vec a_i=\vec a_1$ for $i\ne 1$. Then the determinant is zero. (Since two rows of the matrix are identical.)

b) Otherwise, find the rows $\vec a_j$ and $\vec a_k$ that contain ones in the same columns where $\vec a_1$ has them. First suppose that the second $1$'s in the rows $\vec a_j$ and $\vec a_k$ are in different columns.

This means that $\vec a_j+\vec a_k-\vec a_1$ is a vector containing precisely two $1$'s.

We replace one of the three rows with the vector $\vec a_j+\vec a_k-\vec a_1$. This might change the sign of the determinant (depending on which row we choose).

Now we get (after reordering rows and columns, which only changes the sign) a matrix with the block structure shown below. One of the blocks is an $(n-1)\times(n-1)$ matrix which fulfills all the assumptions. The whole matrix looks like this
$\begin{vmatrix} 1 & 1 & 0 & \dots & 0\\ 1 & 0 & 1 & \dots & 0\\ 0 & 0 & * & * & * \\ 0 & 0 & * & * & * \\ 0 & 0 & * & * & * \end{vmatrix}$ where the stars denote the submatrix fulfilling the inductive hypothesis.

If we use Laplace expansion with respect to the first row, one of the two minors has a zero column, hence it is zero; expanding the other minor along its first column shows that, up to sign, the whole determinant equals the determinant of the $(n-1)\times(n-1)$ submatrix marked by stars, so the inductive hypothesis applies.

c) Now suppose that the rows $\vec a_j$ and $\vec a_k$ have their second $1$'s in the same column. This means that, after reordering rows and columns, these three rows and the corresponding columns give the submatrix $\begin{pmatrix} 1&1&0\\ 1&0&1 \\ 0&1&1 \end{pmatrix}$, so we have a matrix of the form $\begin{pmatrix} 1&1&0&0&\ldots&0\\ 1&0&1&0&\ldots&0\\ 0&1&1&0&\ldots&0\\ 0&0&0&*&\ldots&*\\ 0&0&0&*&\ldots&*\\ 0&0&0&*&\ldots&* \end{pmatrix}$ where the submatrix marked by stars has the form from the inductive hypothesis. So the determinant is the determinant of that submatrix multiplied by $\begin{vmatrix} 1&1&0\\ 1&0&1 \\ 0&1&1 \end{vmatrix}=-2$.


Example:

$\begin{vmatrix} 1 & 1 & 0 & 0 & 0\\ 1 & 0 & 1 & 0 & 0\\ 0 & 1 & 0 & 1 & 0\\ 0 & 0 & 1 & 0 & 1\\ 0 & 0 & 0 & 1 & 1 \end{vmatrix}= \begin{vmatrix} 1 & 1 & 0 & 0 & 0\\ 1 & 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 1 & 0\\ 0 & 0 & 1 & 0 & 1\\ 0 & 0 & 0 & 1 & 1 \end{vmatrix} =1 \begin{vmatrix} 0 & 1 & 0 & 0\\ 0 & 1 & 1 & 0\\ 0 & 1 & 0 & 1\\ 0 & 0 & 1 & 1 \end{vmatrix} -1 \begin{vmatrix} 1 & 1 & 0 & 0\\ 0 & 1 & 1 & 0\\ 0 & 1 & 0 & 1\\ 0 & 0 & 1 & 1 \end{vmatrix} $

(In the first step we subtracted the first row from the third one and added the second row to the third one; the second step is Laplace expansion along the first row.)
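As a quick numerical sanity check (not part of the original argument), the row operation used in the first step leaves the determinant unchanged, and both determinants above equal $2$:

```python
import numpy as np

M = np.array([[1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1]], dtype=float)

N = M.copy()
N[2] = M[2] - M[0] + M[1]       # replace row 3 by (row 3) - (row 1) + (row 2) = (0,0,1,1,0)
print(int(round(np.linalg.det(M))), int(round(np.linalg.det(N))))   # 2 2
```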


Note: Another way of saying this is that by interchanging rows and columns you will obtain one of two situations:

$\begin{vmatrix} 1 & 1 & 0 & 0 & 0\\ 1 & 1 & 0 & 0 & 0\\ 0 & 0 & 1 & \ldots &\ldots \\ 0 & 0 & \vdots & & \\ 0 & 0 & \vdots & & \end{vmatrix}$ or $ \begin{vmatrix} 1 & 1 & 0 & 0 & 0\\ 1 & 0 & 1 & 0 & 0\\ 0 & 1 & 0 & 1 & 0\\ 0 & 0 & 1 & 0 & 1\\ 0 & 0 & 0 & 1 & 1 \end{vmatrix} $

(NOTE: As was pointed out by Jyrki, the matrix can continue after the block of this form. It is explained nicely in his answer.)

That is, either a submatrix in the upper left corner makes the determinant zero (and we do not have to care about the remaining entries), or the matrix has a special form, like the one given in the example above (and the determinants of such matrices can be computed by various methods).

This is essentially the same thing as I've done in the inductive proof above, but perhaps a different viewpoint might be useful.

  • @Paul I must have made a mistake somewhere. I'll try to work this out. (2011-12-27)

I assume for the purposes of this answer that your matrices take the form

$\begin{pmatrix}1&&&&1\\1&1&&&\\&1&1&&\\&&\ddots&\ddots&\\&&&1&1\end{pmatrix}=\mathbf H+\mathbf I$

$\mathbf H$ is a Frobenius companion matrix, with characteristic polynomial $(-1)^n(x^n-1)$. The characteristic polynomial of $\mathbf H+\mathbf I$ is $p(x)=(-1)^n((x-1)^n-1)$. Thus, $\det(\mathbf H+\mathbf I)=p(0)=(-1)^n((-1)^n-1)=1-(-1)^n$.

The matrix itself is a special case of what is sometimes referred to as a "Forsythe matrix". It can be generated in MATLAB with the command gallery('forsythe',n,1,1).
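Here is a quick numerical check (my own sketch, building $\mathbf H$ as the cyclic shift matrix with np.roll rather than with the MATLAB command above) that $\det(\mathbf H+\mathbf I)=1-(-1)^n$, i.e. $2$ for odd $n$ and $0$ for even $n$:

```python
import numpy as np

for n in range(3, 9):
    H = np.roll(np.eye(n, dtype=int), 1, axis=0)   # 1s on the subdiagonal plus a 1 in the corner
    det = int(round(np.linalg.det(H + np.eye(n, dtype=int))))
    print(n, det, 1 - (-1) ** n)                   # the last two columns agree
```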


Consider also the matrices

$\mathbf B=\begin{pmatrix}1&1&&&\\1&0&1&&\\&1&\ddots&\ddots&\\&&\ddots&0&1\\&&&1&1\end{pmatrix}$

It can be shown (through the relationship between tridiagonal determinants and three-term difference equations) that the characteristic polynomials of these matrices are given by

$\det(\mathbf B-\lambda\mathbf I)=(-1)^n (\lambda-2)\frac{\sin\left(n\arccos\frac{\lambda}{2}\right)}{\sin\left(\arccos\frac{\lambda}{2}\right)}=(-1)^n (\lambda-2)U_{n-1}\left(\frac{\lambda}{2}\right)$

where $U_k(x)$ is the Chebyshev polynomial of the second kind. $\det\mathbf B$ is thus given by $2(-1)^{n+1}\sin\dfrac{n\pi}{2}$.
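A corresponding sketch for $\mathbf B$ (the tridiagonal construction below is my reading of the displayed form) compares the numerical determinant with the closed form $2(-1)^{n+1}\sin\dfrac{n\pi}{2}$ for small $n$:

```python
import numpy as np

for n in range(2, 9):
    B = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)   # 1s on the super- and subdiagonals
    B[0, 0] = B[-1, -1] = 1                                        # the two corner diagonal entries
    closed_form = 2 * (-1) ** (n + 1) * np.sin(n * np.pi / 2)
    print(n, int(round(np.linalg.det(B))), int(round(closed_form)))   # the two values agree
```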

  • Thanks @Jyrki! I think I've expended my bestiary of "known" matrices that are relevant to this question, though. :) (2011-12-29)

Only an $n\times n$ matrix has a determinant. I can't understand you very well, but if your matrix is $\begin{pmatrix} a&b \\ c&d \end{pmatrix}$, then its determinant is $ad-bc$.

P.S. I don't know how to make it look like a matrix here.

  • The matrix has order $n \times n$ of course; by odd order I meant that $n$ is odd. For example, the matrix may be $3 \times 3$ or $5 \times 5$, etc. In addition, each row contains $1$ in exactly two places and $0$ in the other $n-2$ places, and similarly for each column. (2011-12-27)
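To make the description in this comment concrete, here is a small sketch (my own, and only one convenient way to generate such matrices): adding a fixed-point-free permutation matrix to the identity gives a 0-1 matrix with exactly two 1s in every row and column, and for odd $n$ its determinant always comes out as $0$ or $\pm2^k$:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_two_ones_matrix(n):
    """Identity plus the permutation matrix of a random derangement."""
    while True:
        sigma = rng.permutation(n)
        if not np.any(sigma == np.arange(n)):      # insist on no fixed points
            A = np.eye(n, dtype=int)
            A[np.arange(n), sigma] += 1            # add the permutation matrix of sigma
            return A

for n in (3, 5, 7, 9):
    A = random_two_ones_matrix(n)
    print(n, int(round(np.linalg.det(A))))         # always 0 or an odd power of +-2
```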