
D. E. Rutherford shows that if a Boolean matrix $B$ has an inverse, then $B^{-1} = B^T$, i.e. $BB^T = B^TB = I$.

I have two related questions:

  1. The only invertible Boolean matrices I can find are permutation matrices. Are there others?

  2. Is there an $O(n^2)$ test to determine if an $n \times n$ Boolean matrix $B$ has an inverse?

Note: The $O(n^2)$ Matlab function I gave here is wrong.

UPDATE:

I have posted a new $O(n^2)$ Matlab invertibility test here.
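
For illustration only, here is a minimal sketch of what such a test can look like in Matlab/Octave. This is *not* the function linked above, and the name `isPermB` is made up; it relies on the fact, discussed in the answers below, that a Boolean matrix is invertible exactly when it is a permutation matrix.

```matlab
% Sketch: is the 0/1 matrix B invertible as a Boolean matrix,
% i.e. is it a permutation matrix?  Runs in O(n^2) time.
function ok = isPermB(B)
  [n, m] = size(B);
  ok = (n == m) && ...
       all(all(B == 0 | B == 1)) && ...   % entries are Boolean (0 or 1)
       all(sum(B, 1) == 1) && ...         % exactly one 1 in each column
       all(sum(B, 2) == 1);               % exactly one 1 in each row
end
```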

  • It is probably still an open problem. cstheory.stackexchange.com might give you the latest best algorithm known. You might also want to read: http://rjlipton.wordpress.com/2010/03/27/fast-matrix-products-and-other-amazing-results/ (2012-02-20)
  • This question confused me at first: I suppose you are working over the semiring where addition is logical OR, i.e. $1+1=1$, not over the field of order 2 where $1+1=0$. (2012-02-21)
  • Sorry Nate, I should have made it clear that $B$ is a Boolean or logical matrix whose elements are True or False, with the operations AND, OR, and NOT. (2012-02-21)
  • Related question on MathOverflow: http://mathoverflow.net/questions/62125/invertible-matrices-of-natural-numbers-are-permutations-why (2012-02-21)
  • @Aryabhata and Yuval: Thank you for the links. (2012-02-21)

3 Answers


At http://www.mathnet.or.kr/mathnet/thesis_file/15_B07-0905.pdf there is a paper by Song, Kang, and Shin, Linear operators that preserve perimeters of Boolean matrices, Bull. Korean Math. Soc. 45 (2008) 355-363. At the top of page 356, it says, "It is well known that the permutation matrices are the only invertible Boolean matrices (see [1])." The reference is to Beasley and Pullman, Boolean-rank-preserving operators and Boolean rank-1 spaces, Linear Algebra Appl. 59 (1984) 55-77. I haven't attempted to track down the Beasley-Pullman paper.

  • (+1) There is also S. Z. Song, et al., [Linear operators that preserve Boolean ranks](http://icms.kaist.ac.kr/mathnet/kms_tex/73868.pdf), *Bull. Korean Math. Soc.*, **36** (1999), no. 1, 131-138, which claims to extend the results of Beasley and Pullman. (2012-02-21)
  • The review of Beasley-Pullman in Math Reviews doesn't mention the invertible-Boolean-matrices result, and I found some other papers that make use of invertible Boolean matrices without noting that they must be permutations, so I'm not vouching for the Song-Kang-Shin citation, just noting it. (2012-02-21)
  • The Beasley and Pullman paper actually quotes *two* other papers for this result (in addition to proving it in their own Corollary 2.5.1(d) on page 61). The papers are: D. de Caen and D. A. Gregory, [Primes in the semigroup of Boolean matrices](http://www.sciencedirect.com/science/article/pii/0024379581901725), *Linear Algebra Appl.* **37**:119-134 (1981), and D. J. Richman and H. Schneider, [Primes in the semigroup of nonnegative matrices](http://www.math.wisc.edu/~hans/paper_archive/scanned_papers/hs045.pdf), *Linear and Multilinear Algebra* **2**:135-140 (1974). (2012-02-21)
  • It's rather obvious. Suppose $AB = BA = I$ where $A$ and $B$ are Boolean matrices. $A$ and $B$ must have at least one $1$ in each row and column. If $A_{ij} = 1$, then we must have $B_{jk} = 0$ for all $k \ne i$ (otherwise $(AB)_{ik} = 1$ for some $k \ne i$) and $B_{ki} = 0$ for all $k \ne j$ (otherwise $(BA)_{kj} = 1$ for some $k \ne j$). So $B$ has only one $1$ in each row and column, and is a permutation matrix. (2012-02-21)
  • Thank you all. That seems to answer my first question. Robert's proof is very neat. (2012-02-21)
  • @Derek: It answers both questions, since a verification algorithm requires inspecting each entry of the matrix at most once. (2012-02-21)
  • If it's that easy, why didn't Rutherford see it? (2012-02-21)
  • I've looked at the Richman-Schneider paper. Lemma 2.2 seems to be the relevant result. It is stated without proof. It seems to be about matrices with real, non-negative entries, not Booleans, but Robert Israel's proof works in both contexts. (2012-02-21)
  • I have found an earlier citation for the result. It appears in E. Burlacu, On the inverses of Boolean matrices, Bul. Sti. Tehn. Inst. Politehn. Timisoara (N. S.) 14 (28) (1969), fasc. 1, 15-20, MR0256953 (41 #1608), where the reviewer, S. Rudeanu, remarks, "The equivalence between permutation matrices and invertible square matrices with elements 0, 1 has been discovered independently by many people." (2012-02-21)
  • @Robert Israel - I guess that shows that if $A$ is Boolean, $A^{-1}$ exists, and $A^{-1}$ is Boolean, then $A$ is a permutation matrix. How about the statement that if $A$ is Boolean and $A^{-1}$ exists, then $A$ is a permutation matrix? Is that also simple? (2012-02-21)
  • @robinson, what does $A^{-1}$ mean if it isn't Boolean? We're talking about the inverse under Boolean multiplication, and if the entries of $A^{-1}$ aren't 0, 1, then how do you do the Boolean multiplication $AA^{-1}$? On the other hand, if all you mean is that the entries are 0, 1 with real multiplication, then $\begin{pmatrix}1&1\\0&1\end{pmatrix}$ is invertible but not a permutation. (2012-02-21)
  • @cardinal - Using Robert Israel's result and your advice, I have written an $O(n^2)$ Matlab function to test whether the Boolean matrix $B$ is a permutation matrix. See the link in my Update above. (2012-02-23)
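
The characterization argued in the comments above (a Boolean matrix has a Boolean inverse iff it is a permutation matrix) is also easy to sanity-check by brute force for very small $n$. A rough Matlab/Octave sketch, using the OR-AND product (illustrative only):

```matlab
% Brute-force sanity check for n = 2: every 0/1 matrix that has a
% Boolean (OR-AND) inverse is a permutation matrix.  Sketch only.
n = 2;
I = eye(n);
mats = {};
for k = 0:2^(n*n) - 1
  mats{end+1} = reshape(bitget(k, 1:n*n), n, n);   % k-th 0/1 matrix
end
boolmul = @(A, B) double((A * B) > 0);             % Boolean (OR-AND) product
for a = 1:numel(mats)
  for b = 1:numel(mats)
    A = mats{a}; B = mats{b};
    if isequal(boolmul(A, B), I) && isequal(boolmul(B, A), I)
      % A has a Boolean inverse, so it must be a permutation matrix
      assert(all(sum(A, 1) == 1) && all(sum(A, 2) == 1));
    end
  end
end
disp('check passed for n = 2')
```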
---

Edit: Wrong answer. See comments below.

For question 2, I am not aware of a result, but my intuition is that the answer is no. This is too long for a comment, so I will post it as an answer. Checking that $BB^T = I$ can be done in randomized $O(n^2)$ time (using random sampling as in Lipton's blog post). But deterministically, over any field, it needs $O(n^\omega)$ time using matrix-matrix products [see these notes by A. Gupta]. If you present me with an algorithm for checking $B B^T = I$ in $O(n^2)$ time, then I can solve the problem in the course notes by A. Gupta, as follows. If $$ B = \begin{pmatrix} A & -D \\ C & I \end{pmatrix}$$ then $$ B B^T = \begin{pmatrix} \ldots & AC-D \\ \ldots & \ldots \end{pmatrix}.$$ I run your algorithm to check whether $BB^T = I$. This is equivalent to deciding $AC - D = 0$, i.e. $AC = D$, which we do not know how to decide in less than $O(n^\omega)$ time.
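
For what it's worth, the random-sampling check mentioned above (a Freivalds-style verification over the reals, not over the Boolean semiring) might look like this in Matlab/Octave; the function name `freivaldsCheck` and the trial count `t` are just placeholders:

```matlab
% Freivalds-style randomized check that B*B' equals the identity over the
% reals: t independent trials, each costing two matrix-vector products,
% so O(t*n^2) total.  (Sketch of the random-sampling idea only; the
% Boolean-semiring case needs a separate analysis.)
function ok = freivaldsCheck(B, t)
  n = size(B, 1);
  ok = true;
  for i = 1:t
    x = randi([0 1], n, 1);        % random 0/1 vector
    if any(B * (B' * x) ~= x)      % compare B*(B'*x) with I*x = x
      ok = false;                  % a mismatch certifies B*B' ~= I
      return;
    end
  end
end
```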

  • Hmm. But, if Gerry's answer is correct, there is a trivial $O(n^2)$ algorithm. Right? :) (2012-02-21)
  • @cardinal No, I can't see how. Gerry's answer concerns question 1. So if permutation matrices are the only invertible Boolean matrices, then we need to check whether the input $B$ is a permutation matrix. I can't see how you solve this in $O(n^2)$. (2012-02-21)
  • Do a single-pass scan of each column, noting the location of the entry that is one in each column, and look for duplicates along the way? (2012-02-21)
  • If the only invertible Boolean matrices are permutation matrices, then my second question becomes "Is there an $O(n^2)$ test for a permutation matrix?". I think the answer is yes: just do as cardinal says, or convert the permutation matrix into a permutation vector in $O(n^2)$ time and test the permutation vector in $O(n)$ time. (2012-02-21)
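
A minimal Matlab/Octave sketch of the column-scan / permutation-vector test described in the last two comments (the name `isPermViaScan` is made up):

```matlab
% Sketch of the test described in the comments above: scan each column
% once, record where its single 1 sits, and reject duplicates along the
% way.  O(n^2) overall.
function ok = isPermViaScan(B)
  [n, m] = size(B);
  ok = false;
  if n ~= m, return; end
  seen = false(1, n);                  % rows that already carry a 1
  for j = 1:n
    idx = find(B(:, j) ~= 0);          % positions of nonzeros in column j
    if numel(idx) ~= 1 || seen(idx)
      return;                          % not a unit column, or duplicate row
    end
    seen(idx) = true;
  end
  ok = true;                           % one 1 per column, all in distinct rows
end
```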
---

If addition is defined modulo two, it becomes XOR, while multiplication becomes the AND operation. Using these with Gauss-Jordan elimination yields that an inverse of $\begin{bmatrix}0 & 1 & 0\\0 & 1 & 1\\1 & 0 & 1\end{bmatrix}$ is $\begin{bmatrix}1 & 1 & 1\\1 & 0 & 0\\1 & 1 & 0\end{bmatrix}$, and multiplying out confirms that you do get the identity matrix, so that is at least one counterexample.
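
This can be verified directly; a quick Matlab/Octave check of the mod-2 products (both come out to the $3 \times 3$ identity):

```matlab
% Verify the mod-2 (XOR) example above: both products give the identity.
A    = [0 1 0; 0 1 1; 1 0 1];
Ainv = [1 1 1; 1 0 0; 1 1 0];
disp(mod(A * Ainv, 2))     % 3x3 identity
disp(mod(Ainv * A, 2))     % 3x3 identity
```

Note that with OR as addition instead of XOR, the same product is no longer the identity (for example, the $(2,1)$ entry becomes $1$), so this inverse is specific to the mod-2 setting described above.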