When I solve a problem of the type $x' = Ax, x(0) = x_0$, where $A$ is a matrix and $x$ is a vector, I typically follow this method:
I compute $\det(A-\lambda I)$, find the roots and their multiplicities, and then, using these eigenvalues $\lambda_i$, I find eigenvectors $v_1, v_2, v_3$ that constitute a basis for $\mathbb{R}^3$. Usually there is then some relationship between these vectors and the initial condition, e.g. $x_0 = -v_1 + v_2$, or something similar, depending of course on the initial condition and the vectors in question. Staying with the example $x_0 = -v_1 + v_2$, I put $x(t) = -e^{tA}v_1 + e^{tA}v_2$, where $e^{tA}v_i = e^{\lambda_i t}v_i$ for the corresponding eigenvalues.
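To make the method concrete, here is a quick numerical sketch of the procedure above, assuming numpy/scipy and a made-up $3\times 3$ matrix with distinct eigenvalues (the matrix and initial condition are my own illustrative choices, not from any particular problem):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example matrix with distinct eigenvalues 2, 3, 1
A = np.array([[2., 1., 0.],
              [0., 3., 0.],
              [0., 0., 1.]])
t = 0.5

lams, V = np.linalg.eig(A)   # columns of V are eigenvectors v1, v2, v3
x0 = np.array([1., 2., 3.])
c = np.linalg.solve(V, x0)   # coefficients in x0 = c1 v1 + c2 v2 + c3 v3

# x(t) = sum_i c_i e^{lam_i t} v_i, using e^{tA} v_i = e^{lam_i t} v_i
x_t = V @ (c * np.exp(lams * t))

# agrees with the matrix exponential applied directly to x0
assert np.allclose(x_t, expm(t * A) @ x0)
```

The assertion checks that expanding $x_0$ in the eigenbasis and scaling each component by $e^{\lambda_i t}$ really does reproduce $e^{tA}x_0$.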
But I recently came across a case where things were a little different. Here $x_0 = v_1 - 2v_2$, so that $x(t) = e^{tA}v_1 - 2e^{tA}v_2$, but then I was supposed to put $e^{tA}v_2 = e^{\lambda t}\bigl(I + t(A-\lambda I)\bigr)v_2$. I have not been able to figure out the reason for this, and I'm hoping someone here has an idea of what is going on.
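For what it's worth, I checked the mysterious formula numerically on a small defective matrix (my own toy example, not the one from the exercise), where $v_2$ satisfies $(A-\lambda I)^2 v_2 = 0$ but is not an eigenvector, and it does hold:

```python
import numpy as np
from scipy.linalg import expm

lam = 2.0
A = np.array([[lam, 1.],
              [0., lam]])      # defective: only one eigenvector for lam
v2 = np.array([0., 1.])        # not an eigenvector, but (A - lam I)^2 v2 = 0
t = 0.7

I = np.eye(2)
N = A - lam * I                # nilpotent on v2: N @ (N @ v2) == 0
lhs = expm(t * A) @ v2
rhs = np.exp(lam * t) * ((I + t * N) @ v2)
assert np.allclose(lhs, rhs)   # e^{tA} v2 = e^{lam t}(I + t(A - lam I)) v2
```

So the identity seems tied to $(A-\lambda I)^2 v_2 = 0$; the series for $e^{t(A-\lambda I)}$ applied to such a $v_2$ truncates after the linear term, which is presumably why the extra factor $\bigl(I + t(A-\lambda I)\bigr)$ appears.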