
Denote the system over $GF(2)$ as $Ax=b$, where \begin{align} A=&(A_{ij})_{m\times m}\\ A_{ij}=& \begin{cases} (1)_{n\times n}&\text{if }i=j\quad\text{(the $n\times n$ all-ones matrix)}\\ I_n&\text{if }i\ne j\quad\text{(the $n\times n$ identity matrix)} \end{cases} \end{align} That is, $A$ is a square matrix of order $mn$, made up of an $m\times m$ array of $n\times n$ blocks, and $b$ is a 0-1 vector of length $mn$. What is the solution of this system, if any, for a general pair $(m,n)$?

Example: For $m=2,n=3$ and $b=(0, 1, 0, 0, 1, 0)^T$, we have $$ A= \begin{pmatrix} 1 & 1 & 1 & 1 & 0 & 0 \\ 1 & 1 & 1 & 0 & 1 & 0 \\ 1 & 1 & 1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 1 & 1 & 1 \\ 0 & 1 & 0 & 1 & 1 & 1 \\ 0 & 0 & 1 & 1 & 1 & 1 \end{pmatrix} $$ and one solution is $x=(1, 0, 1, 0, 1, 0)^T$.

I know Gaussian elimination, but I am finding it hard to carry out in the general case.
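For experimenting with small cases, here is a rough Python sketch I put together (the helper names `build_A` and `solve_gf2` are my own); it builds $A$, reproduces the example above, and solves $Ax=b$ by Gaussian elimination mod 2:

```python
import numpy as np

def build_A(m, n):
    """A as an m-by-m array of n-by-n blocks: all-ones on the diagonal, I_n elsewhere."""
    J, I = np.ones((n, n), dtype=int), np.eye(n, dtype=int)
    return np.block([[J if i == j else I for j in range(m)] for i in range(m)])

def solve_gf2(A, b):
    """Gaussian elimination mod 2; returns one solution, or None if the system is inconsistent."""
    M = np.concatenate([A % 2, (b % 2).reshape(-1, 1)], axis=1)
    rows, cols = M.shape[0], M.shape[1] - 1
    pivots, r = [], 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i, c]), None)
        if piv is None:
            continue                      # free column
        M[[r, piv]] = M[[piv, r]]         # swap the pivot row into place
        for i in range(rows):
            if i != r and M[i, c]:
                M[i] ^= M[r]              # eliminate (addition = XOR over GF(2))
        pivots.append(c)
        r += 1
    if M[r:, -1].any():                   # a zero row with nonzero right-hand side
        return None
    x = np.zeros(cols, dtype=int)
    x[pivots] = M[:r, -1]                 # free variables are set to 0
    return x

A = build_A(2, 3)
b = np.array([0, 1, 0, 0, 1, 0])
print(A @ np.array([1, 0, 1, 0, 1, 0]) % 2)   # reproduces b
print(solve_gf2(A, b))                        # one particular solution
```

For $m=2,n=3$ the solver may return a different particular solution than the $x$ above, since the solution is not unique in that case.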

  • In your example case $m=2,n=3$ you get a constraint that for a solution to exist the vector $b$ must have an even number of ones. If that is so, there will be two solutions, which are bitwise complements of each other. This reflects the fact that the all-1s vector spans the kernel of $x\mapsto Ax$. Something tells me that Henning has made some headway in the general case... :-) (2011-09-25)

3 Answers


Edit: I posted a wrong answer earlier. Hope I can get things fixed this time.

For convenience, write $x^T = (x_1^T, \ldots, x_m^T)$ where each $x_i$ is a vector of length $n$, and similarly $b^T = (b_1^T, \ldots, b_m^T)$. Let $J$ and $u$ be, respectively, the $n\times n$ matrix and the $n$-vector with all entries equal to $1$, and define $\beta_i$ by $Jb_i=\beta_iu$, so that $\beta_i\in GF(2)$ is the parity of $b_i$. Your system of equations $Ax=b$ is equivalent to $$(\dagger):\quad (J-I)x_i+\sum_{j=1}^m x_j= b_i\quad \forall i.$$ Let $Jx_i=\alpha_iu$, where $\alpha_i\in GF(2)$ is the parity of $x_i$. Then $(\dagger)$ gives $$(*):\quad x_i = \sum x_j + b_i + \alpha_iu\quad \forall i.$$
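(Aside: the reduction to $(\dagger)$ is easy to sanity-check numerically; the short Python snippet below is only an illustration on a random instance, not part of the argument.)

```python
import numpy as np

# Random instance; compare block i of Ax = b with (J - I) x_i + sum_j x_j = b_i.
rng = np.random.default_rng(0)
m, n = 3, 4
J, I = np.ones((n, n), dtype=int), np.eye(n, dtype=int)
A = np.block([[J if i == j else I for j in range(m)] for i in range(m)])

x = rng.integers(0, 2, m * n)
b = A @ x % 2
xs, bs = x.reshape(m, n), b.reshape(m, n)   # x_i = xs[i], b_i = bs[i]
S = xs.sum(axis=0) % 2                      # sum_j x_j
for i in range(m):
    assert np.array_equal(((J - I) @ xs[i] + S) % 2, bs[i])
print("block form (dagger) agrees with Ax = b")
```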

Case 1: $m$ is even. Summing $(*)$ over $i=1,2,\ldots,m$, we get $\sum x_j = \sum b_j + \sum\alpha_ju.$ Substituting this back into $(*)$, we see that the general solution to $(\ast)$ is of the form $$(**):\quad x_i = \sum b_j + \sum\alpha_ju + b_i + \alpha_iu.$$ Such $\{x_i\}$ form a solution of $(\dagger)$ if and only if $Jx_i=\alpha_iu$ for all $i$, which means $$\sum \beta_j + n\sum\alpha_j + \beta_i + n\alpha_i = \alpha_i,$$ or equivalently, $$n\sum\alpha_j + (n-1)\alpha_i = \sum \beta_j + \beta_i.$$ When $n$ is even, this system has the unique solution $\alpha_i = \sum \beta_j + \beta_i$.

When $n$ is odd, the system reduces to $\sum\alpha_j = \sum \beta_j + \beta_i$ for all $i$. Hence a solution exists if and only if $\beta_1=\ldots=\beta_m=\beta$, and the solutions are given by $(**)$ with $\sum\alpha_j = \beta$.
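For anyone who wants to double-check the bookkeeping in Case 1, here is a throwaway Python sketch (the helper names are mine, not part of the answer) that builds $x$ from $(**)$ with the $\alpha_i$ chosen as above and verifies $Ax=b$ on solvable right-hand sides:

```python
import numpy as np

def build_A(m, n):
    J, I = np.ones((n, n), dtype=int), np.eye(n, dtype=int)
    return np.block([[J if i == j else I for j in range(m)] for i in range(m)])

def solve_m_even(m, n, b):
    """Formula (**) for even m; returns None when n is odd and the beta_i are not all equal."""
    bs = b.reshape(m, n) % 2
    beta = bs.sum(axis=1) % 2                  # beta_i = parity of b_i
    if n % 2 == 0:
        alpha = (beta.sum() + beta) % 2        # alpha_i = sum_j beta_j + beta_i
    else:
        if len(set(beta)) > 1:
            return None                        # no solution in this case
        alpha = np.zeros(m, dtype=int)
        alpha[0] = beta[0]                     # any alpha with sum_j alpha_j = beta
    u = np.ones(n, dtype=int)
    xs = (bs.sum(axis=0) + alpha.sum() * u + bs + alpha[:, None] * u) % 2   # formula (**)
    return xs.reshape(-1)

rng = np.random.default_rng(1)
for m, n in [(2, 3), (2, 4), (4, 3), (4, 4)]:
    A = build_A(m, n)
    b = A @ rng.integers(0, 2, m * n) % 2      # a right-hand side that is certainly solvable
    x = solve_m_even(m, n, b)
    assert x is not None and np.array_equal(A @ x % 2, b)
print("formula (**) checks out for even m")
```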

Case 2: $m$ is odd. Summing $(*)$ over $i=1,2,\ldots,m$, we get $\sum b_j = \sum\alpha_ju.$ Thus a necessary condition for a solution to exist is that $\sum b_j$ is a multiple of $u$ (say, $\sum b_j=\lambda u$). If this is the case, then the general solution to $(\ast)$ is given by $x_i = y + b_i + \alpha_iu$, where $\sum \alpha_j=\lambda$ and $y$ is any $n$-vector. Such $\{x_i\}$ is a feasible solution to $(\dagger)$ if and only if $Jx_i=\alpha_iu$ for all $i$, that is, iff $Jy + \beta_iu + n\alpha_iu=\alpha_iu$. Therefore, we need $(n-1)\alpha_i+\beta_i$ to be constant and equal to the parity of $y$.

So, when $n$ is odd, a solution exists only if $\beta_1=\ldots=\beta_m$. Since $$\sum b_j=\lambda u\ \Rightarrow\ \sum Jb_j=\lambda Ju\ \Rightarrow\ \sum\beta_j=\lambda$$ (using $Ju=nu=u$ for odd $n$), the previous requirement that $\sum \alpha_j=\lambda$ can be rewritten as $\sum \alpha_j=\sum \beta_j$.

When $n$ is even, the condition that $(n-1)\alpha_i+\beta_i$ is constant means $\alpha_i+\beta_i=c$ for some constant $c$. Recall that we need $\sum \alpha_ju=\lambda u=\sum b_j$. Multiplying both sides by $J$, we get $0=\sum \beta_ju$, i.e. $\sum \beta_j=0$. This is another necessary condition for the existence of a solution. Suppose it is also satisfied. To make $\sum \alpha_ju=\lambda u$, we may take $\alpha_i=\beta_i$ for all $i$ if $\lambda=0$, or $\alpha_i=1-\beta_i$ for all $i$ if $\lambda=1$.
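And a similar throwaway check of the Case 2 construction, choosing $y$ and the $\alpha_i$ as described above (again, the helper names are my own):

```python
import numpy as np

def build_A(m, n):
    J, I = np.ones((n, n), dtype=int), np.eye(n, dtype=int)
    return np.block([[J if i == j else I for j in range(m)] for i in range(m)])

def solve_m_odd(m, n, b):
    """Case 2 construction x_i = y + b_i + alpha_i*u; None if a necessary condition fails."""
    bs = b.reshape(m, n) % 2
    beta = bs.sum(axis=1) % 2                 # beta_i = parity of b_i
    sb = bs.sum(axis=0) % 2                   # sum_j b_j
    if len(set(sb)) > 1:                      # must be a multiple of u (all 0 or all 1)
        return None
    lam = int(sb[0])
    u = np.ones(n, dtype=int)
    if n % 2 == 1:
        if len(set(beta)) > 1:                # need beta_1 = ... = beta_m
            return None
        alpha = np.zeros(m, dtype=int)
        alpha[0] = lam                        # sum_j alpha_j = lambda
        y = np.zeros(n, dtype=int)
        y[0] = beta[0]                        # parity of y = common beta
    else:
        if beta.sum() % 2 == 1:               # need sum_j beta_j = 0
            return None
        alpha = (beta + lam) % 2              # alpha_i = beta_i, flipped when lambda = 1
        y = np.zeros(n, dtype=int)
        y[0] = (alpha[0] + beta[0]) % 2       # parity of y = the constant c
    xs = (y + bs + alpha[:, None] * u) % 2
    return xs.reshape(-1)

rng = np.random.default_rng(2)
for m, n in [(3, 3), (3, 4), (5, 2), (5, 5)]:
    A = build_A(m, n)
    b = A @ rng.integers(0, 2, m * n) % 2     # a solvable right-hand side
    x = solve_m_odd(m, n, b)
    assert x is not None and np.array_equal(A @ x % 2, b)
print("case 2 construction checks out for odd m")
```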

  • Thank you, @user1551. Actually the question is from a variant of the [lights out puzzle](http://en.wikipedia.org/wiki/Lights_Out_(game)), where the light pressed affects the whole row and column. (2011-10-17)

Some observations, too long for a comment:

If $n$ is odd, your matrix is not invertible, and so there is no solution for arbitrary $b$ (and a solution will not be unique if it exists). First, do some row operations (within each block row, add the first row to each of the other rows) to rewrite the constituent blocks as $$\pmatrix{1&1&1&1\\0&0&0&0\\0&0&0&0\\0&0&0&0}, \qquad \pmatrix{1&0&0&0\\1&1&0&0\\1&0&1&0\\1&0&0&1}.$$ Then do some column operations (within each block column, add each of the remaining columns to the first) to rewrite the blocks as $$\pmatrix{n\bmod 2&1&1&1\\0&0&0&0\\0&0&0&0\\0&0&0&0}, \qquad \pmatrix{1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1}.$$ But if $n$ is odd, then columns $1$, $n+1$, $2n+1$, $\ldots$, $(m-1)n+1$ are now all identical.

On the other hand, if $m$ is odd, then permuting the indices (swapping the roles of the two coordinates) turns the $m\times n$ problem into an $n\times m$ problem from the same family, and by the argument above that one is not invertible either.
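For what it's worth, a quick rank computation over $GF(2)$ (a throwaway Python sketch of mine, not part of the argument) agrees with this for small $m,n$: the matrix is invertible exactly when $m$ and $n$ are both even.

```python
import numpy as np

def build_A(m, n):
    J, I = np.ones((n, n), dtype=int), np.eye(n, dtype=int)
    return np.block([[J if i == j else I for j in range(m)] for i in range(m)])

def rank_gf2(M):
    """Row-reduce a copy of M over GF(2) and count the pivots."""
    M = M % 2
    r = 0
    for c in range(M.shape[1]):
        piv = next((i for i in range(r, M.shape[0]) if M[i, c]), None)
        if piv is None:
            continue
        M[[r, piv]] = M[[piv, r]]
        for i in range(M.shape[0]):
            if i != r and M[i, c]:
                M[i] ^= M[r]              # addition over GF(2) is XOR
        r += 1
    return r

for m in range(2, 6):
    for n in range(2, 6):
        full_rank = rank_gf2(build_A(m, n)) == m * n
        assert full_rank == (m % 2 == 0 and n % 2 == 0)
print("A is invertible over GF(2) exactly when m and n are both even (2 <= m, n <= 5)")
```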

Some further progress in the even case (before I noticed Robert's elegant solution): we have the blocks $$\pmatrix{0&1&1&1\\0&0&0&0\\0&0&0&0\\0&0&0&0}, \qquad \pmatrix{1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1}.$$ Further column operations give $$\pmatrix{0&1&0&0\\0&0&0&0\\0&0&0&0\\0&0&0&0}, \qquad \pmatrix{1&0&0&0\\0&1&1&1\\0&0&1&0\\0&0&0&1},$$ and by row operations we can blank out the new 1's to the right: $$\pmatrix{0&1&0&0\\0&0&0&0\\0&0&0&0\\0&0&0&0}, \qquad \pmatrix{1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1}.$$ Now everything decouples into one $2m\times 2m$ problem (containing the first two rows and columns of each block) and $n-2$ separate $m\times m$ problems.

  • No, that was a typo on my part. I meant that the so-numbered _columns_ are identical. (2011-10-07)

An empirical formula: it looks to me like $\det(A) = (-1)^{(m-1)(n-1)} (n+m-1)(n-1)^{m-1}(m-1)^{n-1}$. In particular, $\det(A) \equiv 0 \mod 2$ when $m$ or $n$ is odd (except in the case $m=n=1$) so, as Henning found, $A$ will not be invertible over GF(2) in those cases. However, it will be invertible when $m$ and $n$ are both even.
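(Not a proof, but the formula is easy to test numerically for small sizes; the short Python check below, with a helper of my own to build $A$, does exactly that.)

```python
import numpy as np

def build_A(m, n):
    J, I = np.ones((n, n), dtype=int), np.eye(n, dtype=int)
    return np.block([[J if i == j else I for j in range(m)] for i in range(m)])

for m in range(1, 6):
    for n in range(1, 6):
        det = round(np.linalg.det(build_A(m, n)))   # small sizes, so rounding the float det is safe
        formula = (-1) ** ((m - 1) * (n - 1)) * (n + m - 1) * (n - 1) ** (m - 1) * (m - 1) ** (n - 1)
        assert det == formula, (m, n, det, formula)
print("determinant formula matches for 1 <= m, n <= 5")
```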

EDIT: the situation over $GF(2)$ for even $m$ and $n$ is simpler than I thought. Consider $A^2$. Let $U$ be the $n \times n$ matrix of all 1's. Note that $U^2 = n U$. The $(i,i)$ block of $A^2$ is $U^2 + (m-1) I = n U + (m-1) I$. The $(i,j)$ block for $i \ne j$ is $2 U + (m-2) I$. If $m$ and $n$ are even, the $(i,i)$ block mod 2 is $I$ and the $(i,j)$ block mod 2 is $0$ (i.e. the inverse of $A$ over $GF(2)$ is $A$).
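Here is a quick numerical confirmation (my own throwaway check, nothing more) that $A^2=I$ over $GF(2)$ for even $m,n$, so that $x=Ab$ solves the system:

```python
import numpy as np

def build_A(m, n):
    J, I = np.ones((n, n), dtype=int), np.eye(n, dtype=int)
    return np.block([[J if i == j else I for j in range(m)] for i in range(m)])

for m, n in [(2, 2), (2, 4), (4, 2), (4, 6), (6, 4)]:
    A = build_A(m, n)
    assert np.array_equal(A @ A % 2, np.eye(m * n, dtype=int))   # A is its own inverse mod 2
    b = np.random.default_rng(3).integers(0, 2, m * n)
    assert np.array_equal(A @ (A @ b % 2) % 2, b)                # hence x = A b solves A x = b
print("A^2 = I over GF(2) for even m, n, so x = A b")
```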

  • An insightful answer; it is brilliant to consider $A^2$. To solve the puzzle I have to handle all the cases, so I give the "√" to @user1551, but this is impressive. (2011-10-17)