
For each positive integer $k$, find the smallest number $n_k$ for which there exist real $n_k \times n_k$ matrices $A_1, A_2, \ldots, A_k$ such that all of the following conditions hold:

1. $A_1^2 = A_2^2 = \cdots = A_k^2 = 0$

2. $A_jA_i = A_iA_j$ for all $1 \leq i, j \leq k$

3. $A_1A_2 \cdots A_k \neq 0$
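
For intuition, the case $k=1$ is easy: a nonzero real matrix with zero square must be at least $2\times2$ (a $1\times1$ matrix squaring to zero is itself zero), and the standard nilpotent matrix $E=\begin{pmatrix}0&1\\0&0\end{pmatrix}$ works, so $n_1=2$. A quick numerical sanity check (Python/NumPy, purely illustrative):

```python
import numpy as np

# E is the standard 2x2 nilpotent matrix: E^2 = 0 but E != 0,
# witnessing n_1 = 2 in the case k = 1.
E = np.array([[0, 1],
              [0, 0]])

assert not (E @ E).any()  # condition 1: E^2 = 0
assert E.any()            # condition 3: E != 0 (condition 2 is vacuous for k = 1)
```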

  • This seems to be Problem 5 from the 2007 International Mathematics Competition for University Students (IMC2007); see [ref.](http://www.imc-math.org.uk/imc2007/day2_solutions.pdf) [*spoiler alert*].

1 Answer


(I find this question very interesting. I couldn't come up with a full solution, but below are some ideas that someone may be able to use or improve.)

Fix $k$. By condition 2, the matrices $A_1,\ldots,A_k$ commute pairwise. Since each $A_i$ is nilpotent by condition 1, all of their eigenvalues are $0$ (in particular real), so they can be triangularized simultaneously over $\mathbb{R}$. This means that we can assume right off the bat that $A_1,\ldots,A_k$ are upper triangular (in fact strictly upper triangular, being nilpotent).

Claim 1: the matrices $A_1^{s_1}A_2^{s_2}\cdots A_k^{s_k}$, where $(s_1,\ldots,s_k)\in\{0,1\}^k$ is not identically zero, are linearly independent.

Claim 2: the set of matrices in Claim 1 has cardinality $2^k-1$.

These two claims together imply that we have $2^k-1$ mutually commuting, linearly independent, upper triangular matrices with null square. The dimension of a space of such $n_k\times n_k$ matrices should be at most $n_k-1$ (I don't have a proof of this, so it is a missing step too). We would then have $n_k-1\geq 2^k-1$, i.e. $n_k\geq 2^k$.

The challenge would be, assuming the estimate above is sharp, to construct $A_1,\ldots,A_k$ of size $2^k\times 2^k$ satisfying the required conditions.
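
One natural candidate construction, sketched below under my own assumptions (it is not part of this answer, and I have not checked it against the linked IMC solutions), uses Kronecker products: let $E=\begin{pmatrix}0&1\\0&0\end{pmatrix}$ and set $A_i=I_2^{\otimes(i-1)}\otimes E\otimes I_2^{\otimes(k-i)}$, so that $A_i$ acts as $E$ on the $i$-th of $k$ tensor factors. Then $A_i^2=0$ because $E^2=0$, the $A_i$ commute because they act on different factors, and $A_1\cdots A_k=E^{\otimes k}\neq0$. A numerical check for small $k$:

```python
import numpy as np
from functools import reduce

def build_matrices(k):
    """A_i = I x ... x E x ... x I (E in the i-th slot), each 2^k by 2^k."""
    E = np.array([[0.0, 1.0], [0.0, 0.0]])
    I = np.eye(2)
    return [reduce(np.kron, [E if j == i else I for j in range(k)])
            for i in range(k)]

k = 3
A = build_matrices(k)

assert all(not (M @ M).any() for M in A)             # condition 1: A_i^2 = 0
assert all(np.array_equal(A[i] @ A[j], A[j] @ A[i])  # condition 2: commuting
           for i in range(k) for j in range(k))
assert reduce(np.matmul, A).any()                    # condition 3: product != 0
print(f"k = {k}: all conditions hold with n = {2**k} = 2^k")
```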

Proof of Claim 2: we can write our products as $A_1^{s_1}\cdots A_k^{s_k}$, where $s_1,\ldots,s_k\in\{0,1\}$. There are $2^k-1$ choices for $(s_1,\ldots,s_k)$ (as we don't allow them to be all zero), and by Claim 1 the corresponding matrices are pairwise distinct. Note that, since $A_1\cdots A_k\ne0$, no partial product can be zero either.
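
For concreteness, a quick enumeration of the exponent vectors for $k=3$ (purely illustrative, same Python setting as above):

```python
from itertools import product

k = 3
exponents = [s for s in product((0, 1), repeat=k) if any(s)]
print(len(exponents))  # 7 = 2**3 - 1
print(exponents)       # (0,0,1), (0,1,0), (0,1,1), ..., (1,1,1)
```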

Proof of Claim 1: suppose $\sum_{s\in\{0,1\}^k\setminus\{0\}}\lambda_s A_1^{s_1}\cdots A_k^{s_k}=0$. If we multiply by $A_2\cdots A_k$, every term containing any of $A_2,\ldots,A_k$ vanishes (using commutativity and $A_j^2=0$), and the only surviving term is $\lambda_{(1,0,\ldots,0)}A_1A_2\cdots A_k=0$; since $A_1\cdots A_k\ne0$, the coefficient of $A_1$ is zero. Multiplying instead by $\prod_{j\ne i}A_j$ shows in the same way that all coefficients of degree-$1$ terms vanish. Now we multiply by $A_3\cdots A_k$: after the degree-$1$ terms are gone, the only term remaining is $\lambda_{(1,1,0,\ldots,0)}A_1A_2A_3\cdots A_k=0$, so this coefficient is zero as well, and similarly all coefficients of degree-$2$ terms are zero. Continuing degree by degree, we deduce that all $\lambda_s$ are zero, and we get linear independence.
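
As a sanity check on Claim 1, one can use the candidate Kronecker construction above (again my own assumption, not part of this answer): flatten each of the $2^k-1$ products to a vector and verify that the stack has full rank.

```python
import numpy as np
from functools import reduce
from itertools import product

# Candidate construction (an assumption): A_i acts as E on the i-th
# of k tensor factors, so each A_i is a 2^k by 2^k real matrix.
E = np.array([[0.0, 1.0], [0.0, 0.0]])
I = np.eye(2)
k = 3
A = [reduce(np.kron, [E if j == i else I for j in range(k)]) for i in range(k)]

# Form the 2^k - 1 products A_1^{s_1} ... A_k^{s_k} with s != 0,
# flatten each to a vector, and check that they are linearly independent.
rows = []
for s in product((0, 1), repeat=k):
    if any(s):
        P = reduce(np.matmul, [A[i] for i in range(k) if s[i]])
        rows.append(P.ravel())
M = np.stack(rows)
print(np.linalg.matrix_rank(M), len(rows))  # both 7: linearly independent
```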