I have been working on a two-part question; I have figured out the first part, but the second part eludes me.
Let $A$ be an $n \times n$ matrix over $\mathbb{C}$ with the property that $A^{2}=A$. I can show that the only eigenvalues $A$ can have are $0$ and $1$. Furthermore, the eigenspaces $E_{\lambda} = \{ \bar{x} \in \mathbb{C}^n \,:\, A \bar{x}=\lambda \bar{x}\}$ for $\lambda \in \{0,1\}$ decompose $\mathbb{C}^n$ as a direct sum, $\mathbb{C}^n=E_0 \oplus E_1$. I need to show that there is an invertible matrix $\Lambda$ such that $\Lambda^{-1}A\Lambda=\left[ \begin{array}{cc} I_r & 0 \\ 0 & 0 \\ \end{array}\right]$ for some $0 \leq r \leq n$.
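For reference, the eigenvalue restriction from the first part can be sketched in one line (the standard argument, written out only so the notation is fixed):

```latex
A\bar{x} = \lambda\bar{x},\ \bar{x} \neq \bar{0}
\quad\Longrightarrow\quad
\lambda\bar{x} = A\bar{x} = A^{2}\bar{x} = \lambda^{2}\bar{x}
\quad\Longrightarrow\quad
\lambda^{2} = \lambda
\quad\Longrightarrow\quad
\lambda \in \{0, 1\}.
```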
I understand that $\dim(E_1 + E_0)=n$, so in particular $E_1$ has a basis of $r$ vectors and $E_0$ has a basis of $n-r$ vectors, and taking the columns of $\Lambda$ to be these basis vectors (the $E_1$ basis first) should give the block form above. But why is the union of these two bases a linearly independent set? And do I get the matrix $\left[ \begin{array}{cc} I_{r} & 0 \\ 0 & 0 \\ \end{array}\right]$ because $1$ is an eigenvalue of multiplicity $r$ and $0$ has multiplicity $n-r$?
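A quick numerical sanity check of the claimed similarity (a sketch in NumPy; the construction of the idempotent matrix and all variable names here are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 4, 2

# Build an idempotent A by conjugating D = diag(1, 1, 0, 0)
# by a (generically invertible) random matrix S.
S = rng.standard_normal((n, n))
D = np.diag([1.0] * r + [0.0] * (n - r))
A = S @ D @ np.linalg.inv(S)
assert np.allclose(A @ A, A)  # A^2 = A

# Columns of Lambda: a basis of E_1 followed by a basis of E_0,
# read off from the eigenvectors of A.
eigvals, eigvecs = np.linalg.eig(A)
order = np.argsort(-eigvals.real)  # eigenvalue 1 first, then 0
Lam = eigvecs[:, order].real

# Lambda is invertible precisely because the union of the two
# bases is linearly independent (E_1 and E_0 intersect only in 0).
B = np.linalg.inv(Lam) @ A @ Lam
assert np.allclose(B, D, atol=1e-7)  # block form [[I_r, 0], [0, 0]]
```

The invertibility of `Lam` in this check is exactly the linear-independence question above: a vector in both $E_1$ and $E_0$ would satisfy $\bar{x} = A\bar{x} = \bar{0}$, so the two eigenspaces meet only in the zero vector.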