Suppose $A$ is a real symmetric matrix (so its eigenvectors can be chosen to form an orthonormal basis) whose dominant eigenvalue has multiplicity two, say $\lambda_1 = \lambda_2 > \lambda_3 \ge \cdots \ge \lambda_n \ge 0$. I want to generalize the power method to find two orthonormal eigenvectors spanning the eigenspace of $\lambda_1$.
If I pick an initial $x$, then I can write $x = \sum_{i=1}^n \langle x, x_i \rangle x_i$ since the eigenvectors $(x_i)_{i=1}^n$ form an orthonormal basis. It follows that \begin{align*} A^k x &=\sum_{i=1}^n \langle x,x_i \rangle \lambda_i^k x_i \\ &= \lambda_1^k \left[ \langle x,x_1 \rangle x_1 + \langle x,x_2 \rangle x_2 + \sum_{j=3}^n \frac{ \lambda_j^k}{\lambda_1^k} \langle x, x_j \rangle x_j \right] \\ &\approx \lambda_1^k \left[ \langle x,x_1 \rangle x_1 + \langle x, x_2 \rangle x_2 \right] \quad (\text{large } k) \end{align*} So unlike the case of a single dominant eigenvalue, the normalized iterate does not converge to a single eigenvector: it aligns with the orthogonal projection of the initial vector onto the eigenspace of $\lambda_1$, and a single run therefore recovers only one vector in that two-dimensional space. How can I modify the iteration to recover an orthonormal pair such as $x_1$ and $x_2$?
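One standard modification is to iterate on a block of $p = 2$ vectors instead of a single vector, re-orthonormalizing with a QR factorization at each step (often called orthogonal iteration, subspace iteration, or the block power method): the QR step keeps the two columns from both collapsing onto the same direction, so they converge to an orthonormal basis of the dominant eigenspace. A minimal NumPy sketch, assuming a symmetric $A$; the matrix in the example and the function name are my own illustration, not from the question:

```python
import numpy as np

def orthogonal_iteration(A, p=2, iters=500, seed=0):
    """Block power method: repeatedly apply A to a p-dimensional
    subspace, re-orthonormalizing the basis with QR each step."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthonormal starting block (n x p)
    Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
    for _ in range(iters):
        Q, _ = np.linalg.qr(A @ Q)  # apply A, then re-orthonormalize
    # Rayleigh quotients give the eigenvalue estimates
    evals = np.diag(Q.T @ A @ Q)
    return evals, Q

# Example: symmetric A with dominant eigenvalue 3 of multiplicity two
D = np.diag([3.0, 3.0, 1.0, 0.5])
V, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((4, 4)))
A = V @ D @ V.T
evals, Q = orthogonal_iteration(A)
```

Note that the limit $Q$ is only determined up to an orthogonal change of basis within the eigenspace (any orthonormal pair spanning it is equally valid), which matches the ambiguity in the question: $x_1$ and $x_2$ themselves are not unique when $\lambda_1 = \lambda_2$.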