(I find this question very interesting. I couldn't come up with a solution, but below are some ideas that maybe someone can use/improve)
Fix $k$. From condition 2, we know that the matrices $A_1,\ldots,A_k$ are mutually commuting, so they can be simultaneously triangularized. This means that we can assume right off the bat that $A_1,\ldots,A_k$ are upper triangular.
Claim 1: the matrices $A_{n_1}A_{n_2}\cdots A_{n_m}$, where $1\leq n_1<n_2<\cdots<n_m\leq k$, are linearly independent.
Claim 2: the set of matrices in Claim 1 has cardinality $2^k-1$.
These two claims together imply that we have $2^k-1$ mutually commuting, linearly independent, upper triangular matrices with null square. Any such family of $n_k\times n_k$ matrices should span a subspace of dimension at most $n_k-1$ (I don't have a proof of this, so it is a missing step as well). Granting that, we would have $n_k-1\geq 2^k-1$, i.e. $n_k\geq 2^k$.
The challenge would be, assuming the estimate above is sharp, to construct $A_1,\ldots,A_k$ of size $2^k\times 2^k$ satisfying the required conditions.
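For what it's worth, here is a small numerical sketch (my own addition, not part of the argument above) of one natural candidate: take $N=\begin{pmatrix}0&1\\0&0\end{pmatrix}$ and let $A_i$ act as $N$ on the $i$-th factor of $(\mathbb{C}^2)^{\otimes k}$ and as the identity on the others, so each $A_i$ is a $2^k\times 2^k$ matrix. The helper `candidate_matrices` below is hypothetical; the script only checks the three required conditions for a small $k$.

```python
import numpy as np
from functools import reduce

# Hypothetical candidate construction (my suggestion, not claimed in the answer):
# A_i = I ⊗ ... ⊗ N ⊗ ... ⊗ I, with N = [[0,1],[0,0]] sitting in the i-th factor,
# acting on (C^2)^{⊗k}, so each A_i is a 2^k x 2^k matrix.
def candidate_matrices(k):
    N = np.array([[0.0, 1.0], [0.0, 0.0]])
    I = np.eye(2)
    return [reduce(np.kron, [N if j == i else I for j in range(k)]) for i in range(k)]

k = 4
A = candidate_matrices(k)

# 1) each A_i squares to zero
assert all(np.allclose(Ai @ Ai, 0) for Ai in A)
# 2) the A_i commute pairwise
assert all(np.allclose(A[i] @ A[j], A[j] @ A[i]) for i in range(k) for j in range(k))
# 3) the full product A_1 ... A_k is nonzero
assert not np.allclose(reduce(np.matmul, A), 0)
print("all checks passed for k =", k, "with matrices of size", 2**k)
```

Note that, since $N$ and $I_2$ are upper triangular, each of these $A_i$ is already upper triangular in the standard tensor basis, which matches the reduction made at the beginning.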
Proof of Claim 2: since the $A_j$ commute, we can write each of these products as $A_1^{s_1}\cdots A_k^{s_k}$, where $s_1,\ldots,s_k\in\{0,1\}$. We have $2^k-1$ choices for $s_1,\ldots,s_k$ (as we don't allow them to be all zero). Note also that none of these partial products can be zero: multiplying $A_1^{s_1}\cdots A_k^{s_k}$ by the missing factors recovers $A_1\cdots A_k$, which is nonzero.
Proof of Claim 1: suppose $ \sum_{s\in\{0,1\}^k\setminus\{0\}}\lambda_s\,A_1^{s_1}\cdots A_k^{s_k}=0. $ If we multiply by $A_2\cdots A_k$, every term containing any of $A_2,\ldots,A_k$ vanishes (because $A_j^2=0$), and we are left with $\lambda_{(1,0,\ldots,0)}A_1A_2\cdots A_k=0$; since $A_1\cdots A_k\ne0$, the coefficient of $A_1$ is zero. Similarly, all coefficients of terms of degree $1$ vanish. Now we multiply the original relation by $A_3\cdots A_k$: every term containing any of $A_3,\ldots,A_k$ vanishes, and since the degree-one coefficients are already known to be zero, the only remaining term is $\lambda_{(1,1,0,\ldots,0)}A_1A_2A_3\cdots A_k=0$; so this coefficient is zero too, and in the same way all coefficients of degree-two terms vanish. Continuing this process we deduce that all $\lambda_s$ are zero, which gives linear independence.
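As a numerical sanity check of Claims 1 and 2 (again only for the hypothetical Kronecker-product candidate above, so this merely illustrates the claims rather than proving them in general), one can vectorize the $2^k-1$ products $A_1^{s_1}\cdots A_k^{s_k}$ and compute the rank of the resulting matrix:

```python
import numpy as np
from functools import reduce
from itertools import product

# Same hypothetical candidate as above: A_i acts as N = [[0,1],[0,0]] on the
# i-th tensor factor of (C^2)^{⊗k} and as the identity elsewhere.
def candidate_matrices(k):
    N = np.array([[0.0, 1.0], [0.0, 0.0]])
    I = np.eye(2)
    return [reduce(np.kron, [N if j == i else I for j in range(k)]) for i in range(k)]

k = 4
A = candidate_matrices(k)
n = 2 ** k

# Form A_1^{s_1} ... A_k^{s_k} for every s in {0,1}^k except s = 0.
prods = []
for s in product([0, 1], repeat=k):
    if any(s):
        P = np.eye(n)
        for Ai, si in zip(A, s):
            if si:
                P = P @ Ai
        prods.append(P)

# Claim 2: there are 2^k - 1 such products and none of them is zero.
assert len(prods) == 2 ** k - 1
assert all(not np.allclose(P, 0) for P in prods)

# Claim 1: they are linearly independent, i.e. the matrix whose rows are the
# vectorized products has full row rank.
M = np.array([P.ravel() for P in prods])
assert np.linalg.matrix_rank(M) == 2 ** k - 1
print("2^k - 1 =", 2 ** k - 1, "products, all nonzero and linearly independent")
```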