46

Here's a cute problem that was frequently given by the late Herbert Wilf during his talks.

Problem: Let $A$ be an $n \times n$ matrix with entries from $\{0,1\}$ having all positive eigenvalues. Prove that all of the eigenvalues of $A$ are $1$.

Hint:

Use the AM-GM inequality to relate the trace and determinant.

Is there any other proof?

  • 3
    Huh. Anybody got a different proof? ($\mathbb{F}_2$-representation? Adjacency matrix of a graph?)2012-12-20
  • 0
    @AlexanderGruber I would love to hear one!2012-12-20
  • 2
    Hmmm...What makes a problem "cute"? I don't mean that sarcastically, I've just heard a handful of references lately to "cute" problems, and am curious about what makes a problem "cute"?2012-12-20
  • 1
    @amWhy Short, elegant, and usually elementary.2012-12-20
  • 0
    Are you looking for a proof that uses the hint? Or a different proof?2012-12-20
  • 1
    @MikeSpivey Well, I know how to do it with the hint, so ideally an alternate proof.2012-12-20
  • 0
    can we assume $A$ to be symmetric? because all eigenvalues are real and positive.2012-12-20
  • 1
    @dineshdileep: The only symmetric 0-1 matrix with all eigenvalues real and positive is the identity, so that assumption makes the problem easy. :)2012-12-20
  • 0
    that's what I thought!!, if he agreed, I would have jumped in with that :):)2012-12-20
  • 2
    According to [this paper](https://cs.uwaterloo.ca/journals/JIS/VOL7/Sloane/sloane15.pdf), every $n\times n$ binary matrix with positive eigenvalues is permutation similar to an upper triangular binary matrix with ones on the diagonal.2012-12-20
  • 0
    Does it mean, it is an irreducible matrix?2012-12-20
  • 0
    @user1551 The question the OP asked is stated as section 2, corollary (i) in that paper. But I believe they used the same method the OP wanted to avoid.2012-12-20
  • 0
    I know. I just want to say that the symmetric case is way too special. BTW, every triangular matrix *is* reducible by definition.2012-12-20
  • 0
    oh,,,I don't know much about it...2012-12-20
  • 0
    Eigenvalues are, in general, complex. "having all positive eigenvalues" means "real and positive" or "positive real part"?2013-02-08
  • 1
    @leonbloy The adjectives 'positive' and 'negative' can refer only to real numbers. I mean real and positive.2013-02-08

3 Answers

8

If one wants to use the AM-GM inequality, one can proceed as follows. Since $A$ has only $1$'s and $0$'s on the diagonal, $\operatorname{tr}(A)\leq n$. Expanding the determinant along any row or column shows that $\det(A)$ is an integer, since it is a sum of products of matrix entries (up to sign). Since all eigenvalues are positive, this integer must be positive. The AM-GM inequality gives $$\det(A)^{\frac{1}{n}}=\left(\prod_{i}\lambda_{i}\right)^{\frac{1}{n}}\leq \frac{1}{n}\sum_{i=1}^{n}\lambda_{i}=\frac{\operatorname{tr}(A)}{n}\leq 1.$$ Since $\det(A)$ is a positive integer and $m^{\frac{1}{n}}>1$ for $m>1$, the above inequality forces $\det(A)=1$. But then both inequalities above must be equalities, and equality in AM-GM holds precisely when $\lambda_{i}=\lambda_{j}$ for all $i,j$. Equal positive eigenvalues with product $1$ must each equal $1$.
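This is of course no substitute for the proof, but since the entries are $0$–$1$, the claim can be brute-force checked for small $n$. A minimal NumPy sketch (the function name `all_ones_check` and the tolerance-based test for "real and positive" are my own conveniences, not anything from the thread):

```python
# Finite sanity check (not a proof): enumerate every n x n 0-1 matrix,
# keep those whose eigenvalues are all real and positive (up to a
# floating-point tolerance), and confirm every surviving eigenvalue is 1.
import itertools
import numpy as np

def all_ones_check(n, tol=1e-6):
    for bits in itertools.product([0, 1], repeat=n * n):
        A = np.array(bits, dtype=float).reshape(n, n)
        eig = np.linalg.eigvals(A)
        # "positive eigenvalues" = real and positive (see the comments above)
        if np.all(np.abs(eig.imag) < tol) and np.all(eig.real > tol):
            if not np.allclose(eig, 1.0, atol=1e-4):
                return False
    return True

print(all_ones_check(2))
print(all_ones_check(3))
```

Both calls print `True`; $n=3$ already covers all $2^9=512$ binary matrices. (Defective matrices such as Jordan-type blocks pick up eigenvalue perturbations of order $\varepsilon^{1/n}$ under floating-point `eigvals`, which is why the tolerances are deliberately loose.)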

  • 0
    I guess you said you already knew how to do it this way.2013-12-16
  • 0
    How do we know that the only values of $\lambda_i$ which satisfy $\sum_i \lambda_i = n$, $\prod_i \lambda_i = 1$, and $\lambda_i > 0$ are $\lambda_i = 1$ for all $i$?2015-07-22
  • 1
    @David: It was also proved that $\lambda_i=\lambda_j$ for all $i,j$. That yields $\sum_i\lambda_i=n\lambda_1=n$.2015-07-22
0

Suppose that $A$ has a column with only zero entries; then $0$ is an eigenvalue (e.g., expand $\det(A-rI)$ along that column). So any matrix satisfying the OP's requirements must have a $1$ in every column, and by the same argument, in every row. Now suppose the rows are linearly dependent. Then $A$ is singular, so $0$ is again an eigenvalue, which we have already seen is not allowed; hence the rows must be linearly independent.

I am trying to force the form of the permissible matrices to be restricted enough to give the result. Linear independence of the rows gives us the invertibility of such matrices, but alas not their diagonalizability. The minimal polynomial $(1-r)^n$ results from upper or lower triangular matrices with ones along the diagonal, and I suspect that we may be able to complete this proof by looking at what happens to this polynomial when there are deviations from the triangular shape. The result linked by user1551 may be the key. Trying to gain some intuition about the possibilities leads one to match the binomial theorem with Newton's identities: http://en.wikipedia.org/wiki/Newton%27s_identities#Expressing_elementary_symmetric_polynomials_in_terms_of_power_sums

and the fact that the trace must be $n$ (ones on the diagonal) and the determinant $1$. I would like to show that any deviation from this minimal polynomial must lead to a non-positive eigenvalue. There are two aspects to the analysis: a combinatorial argument to show what types of modifications (from triangular) are permissible while maintaining row/column independence, and a look at the geometric effects of the non-dominant terms in the resulting characteristic polynomials. Maybe some type of induction argument will surface here.
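The structural claim from the paper user1551 linked (every qualifying binary matrix is permutation similar to a unit upper triangular one) can at least be confirmed by brute force for small $n$. A sketch, assuming NumPy; the helper names `is_perm_similar_to_unit_triangular` and `check` are mine, and I use "characteristic polynomial equals $(x-1)^n$" as the filter, which the theorem says is equivalent to all eigenvalues being positive:

```python
# For small n, verify: every 0-1 matrix whose characteristic polynomial is
# (x-1)^n is permutation similar to a unit upper triangular 0-1 matrix.
import itertools
import numpy as np

def is_perm_similar_to_unit_triangular(A):
    n = A.shape[0]
    for perm in itertools.permutations(range(n)):
        P = np.eye(n)[list(perm)]        # permutation matrix for this relabeling
        B = P @ A @ P.T                  # permute rows and columns together
        if np.all(np.diag(B) == 1) and np.all(np.tril(B, -1) == 0):
            return True
    return False

def check(n):
    target = np.poly(np.ones(n))         # coefficients of (x - 1)^n
    for bits in itertools.product([0, 1], repeat=n * n):
        A = np.array(bits).reshape(n, n)
        # the char poly of an integer matrix has integer coefficients,
        # so rounding the floating-point result recovers them exactly
        coeffs = np.round(np.poly(A.astype(float)))
        if np.array_equal(coeffs, target):
            if not is_perm_similar_to_unit_triangular(A):
                return False
    return True

print(check(3))
```

The call prints `True`. Morally this is just the observation that trace $n$ forces a diagonal of ones, and then $A - I$ is a nilpotent 0-1 matrix, i.e. the adjacency matrix of an acyclic digraph, whose topological order supplies the permutation.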

  • 0
    @Potato Fixed. Ty2013-03-31
  • 0
    There are non-triangular matrices that have this minimal polynomial, e.g. the matrix with rows $(1100),(0100),(1111),(1101)$. Will need a different approach.2013-03-31
  • 0
    Newton's theorem on logarithmically convex polynomials surfaced in a number theory context today. Does anyone know of similar results for polynomials that have alternating coefficients? Still working on these characteristic polynomials.2013-04-04
  • 0
    These are some interesting thoughts, but I don't think this answers the question in its current form.2013-11-21
-2

Since $\det(A) \neq 0$, $A$ is nonsingular and all diagonal entries of $A$ must equal one. Next, we know the relationship between the determinant and trace is \begin{align*} \det(A) &= \exp(\operatorname{tr}(\ln(A))) \\ &= \exp(0) \\ &= \exp\left(\sum_i \ln(\lambda_i)\right). \end{align*} Step 2 holds because all diagonal entries of $A$ are one, so their logs are zero, and step 3 because the natural log is analytic. This suggests $$\sum_i \ln(\lambda_i) = 0,$$ or $$\prod_i \lambda_i = 1.$$

Then, because all diagonal entries of $A$ are one, $\operatorname{tr}(A) = \sum_i \lambda_i = n$ must be true. Hence, the AM-GM inequality states $$\left( \prod_{i} \lambda_i \right)^{\frac{1}{n}} \leq \frac{1}{n} \sum_{i} \lambda_{i},$$ with equality holding only when $\lambda_i = \lambda_j$ for all $i,j$. Here $$1^{\frac{1}{n}} = \frac{1}{n} \cdot n,$$ so all eigenvalues of $A$ must be equal. Since $\prod_i \lambda_i = 1$ and the eigenvalues are positive, all of them must be one.
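The identity $\det(A) = \exp(\operatorname{tr}(\ln A))$ is easy to check numerically on an instance where the matrix logarithm certainly exists, e.g. a unit upper triangular 0-1 matrix, where $A = I + N$ with $N$ nilpotent and the log series $\ln(I+N) = N - N^2/2 + \cdots$ terminates. A small sketch (the matrix is my own example, not from the thread):

```python
# Check det(A) = exp(tr(ln A)) on a unit upper triangular 0-1 matrix.
import numpy as np

A = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])

N = A - np.eye(3)          # nilpotent part: N^3 = 0
L = N - N @ N / 2          # ln(I + N); the series terminates for nilpotent N

print(np.trace(L))         # ~0, since L is strictly upper triangular
print(np.linalg.det(A))    # ~1 = exp(0)
```

Note that $\operatorname{tr}(\ln A) = 0$ here because $\ln A$ is strictly upper triangular, not merely because the diagonal of $A$ is one; that distinction is exactly what the comments below take issue with.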

  • 1
    Hmm ... (1) See the OP's comment to his/her question. The OP already knows the AM$\ge$GM trick, and he/she is looking for a different proof. (2) In your first line, why must all diagonal entries of $A$ equal to one? (3) If you want to play the AM$\ge$GM trick, you may do so without most of the sidetracks in your proof. See, e.g., derivation (1) on p.2 of [the paper](https://cs.uwaterloo.ca/journals/JIS/VOL7/Sloane/sloane15.pdf) I linked to in my previous comment to the OP's question.2012-12-26
  • 0
    RE: (1), I went back and read the comments, and that's a good question - what are some of the other ways to arrive at such a result? RE: (2), matrix elements can be either 0 or 1. If a diagonal element is zero, then the matrix is singular and the determinant is therefore also zero, thus all eigenvalues cannot be positive. RE: (3), that paper definitely spells out a more direct route, thanks for sharing it.2012-12-26
  • 8
    Re: (2), $B=\begin{pmatrix}0&1\\1&0\end{pmatrix}$ is a 0-1 matrix that has a zero diagonal, but it is nonsingular and one of its eigenvalues is equal to $1$. Certainly, the $B$ in this counterexample is indefinite, and it turns out that any $A$ that satisfies all assumptions of the OP's question must have a diagonal of ones, but the point is, you seem to have jumped to this conclusion on the first line of your proof without substantiating it.2012-12-27