By the Cayley–Hamilton theorem, we can express the inverse of an $n \times n$ invertible matrix $A$ as \begin{align} A^{-1} = c_{n-1} A^{n-1} + c_{n-2} A^{n-2} + \cdots + c_{1} A + c_{0} I \end{align} Given that $(A^{-1})_{ij} = 0$ for some $i \neq j$, and noting that the $c_0 I$ term drops out because $I_{ij} = 0$ off the diagonal, \begin{align} c_{n-1} (A^{n-1})_{ij} + c_{n-2} (A^{n-2})_{ij} + \cdots + c_{1} A_{ij} = 0 \tag{1} \end{align} I wonder when equation (1) would imply \begin{align} A_{ij} = (A^{2})_{ij} = \cdots = (A^{n-1})_{ij} = 0 \qquad \text{for} \; i \neq j \tag{2} \end{align}
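For concreteness, here is a quick numerical sketch (using NumPy, with an arbitrary sample matrix of my own choosing) of the inverse formula above: if the characteristic polynomial is $\lambda^n + b_{n-1}\lambda^{n-1} + \cdots + b_0$, then Cayley–Hamilton gives $A^{-1} = -\tfrac{1}{b_0}\left(A^{n-1} + b_{n-1}A^{n-2} + \cdots + b_1 I\right)$, i.e. $c_k = -b_{k+1}/b_0$ in the notation of the question.

```python
import numpy as np

# Sample invertible matrix (det = 25, so A^{-1} exists)
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])
n = A.shape[0]

# np.poly(A) returns the characteristic polynomial coefficients
# [1, b_{n-1}, ..., b_0] of det(lambda*I - A), leading coefficient first.
b = np.poly(A)

# Cayley-Hamilton: A^n + b_{n-1} A^{n-1} + ... + b_0 I = 0, hence
# A^{-1} = -(1/b_0) * (A^{n-1} + b_{n-1} A^{n-2} + ... + b_1 I).
inv = np.zeros_like(A)
Ak = np.eye(n)                 # running power A^k
for k in range(n):
    inv += b[n - 1 - k] * Ak   # coefficient multiplying A^k
    Ak = Ak @ A
inv *= -1.0 / b[n]             # b[n] = b_0 = (-1)^n det(A)

assert np.allclose(inv, np.linalg.inv(A))
```

This only verifies the expansion of $A^{-1}$ in powers of $A$; it does not by itself settle the uniqueness question posed below.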
Equation (2) clearly satisfies equation (1), but I wonder whether it is the unique solution to equation (1).
To prove that equation (2) is the unique solution, I thought it might be related to linear independence. However, the coefficients $c_{k}$ also depend on the eigenvalues of $A$, so I am not sure how to tackle this proof. Thanks.