The solution to the initial value problem $x'(t)=Ax(t)+g(t)\quad\text{with}\quad x(0)=x_0$ is $$x(t)=\exp(tA)x_0+\int_0^t \exp((t-s)A)g(s)\,\mathrm{d}s.$$
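To convince myself that this variation-of-constants formula is correct, I checked it numerically in Python (the matrix $A$, forcing $g$, and initial value $x_0$ below are arbitrary examples I made up; $A$ has eigenvalues $-1$ and $-2$, so $\mathrm{Re}(\alpha_j)<0$):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp, quad_vec

# Example data: A is Hurwitz (eigenvalues -1 and -2), g(t) is a bounded forcing term
A = np.array([[-1.0, 1.0],
              [0.0, -2.0]])
g = lambda t: np.array([np.sin(t), 1.0])
x0 = np.array([1.0, -1.0])
t = 3.0

# Variation-of-constants formula: x(t) = e^{tA} x0 + int_0^t e^{(t-s)A} g(s) ds
integral, _ = quad_vec(lambda s: expm((t - s) * A) @ g(s), 0.0, t)
x_formula = expm(t * A) @ x0 + integral

# Direct numerical solution of x'(t) = A x(t) + g(t), x(0) = x0
sol = solve_ivp(lambda s, x: A @ x + g(s), (0.0, t), x0, rtol=1e-10, atol=1e-12)
x_numeric = sol.y[:, -1]

print(np.allclose(x_formula, x_numeric, atol=1e-6))  # the two values agree
```

So the formula itself matches a direct numerical integration of the ODE; my question is only about the limit.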
Suppose that all eigenvalues $\alpha_j$ of $A$ satisfy $\mathrm{Re}(\alpha_j)<0$. I have to find $\lim_{t\to\infty} x(t)$ when $\lim_{t\to\infty} |g(t)|=g_0$.
I reasoned that the solution behaves like $x(t)=\exp(tA)x_0+\exp(tA)g_0$, so I get the same answer as in the case $\lim_{t\to\infty} |g(t)|=0$: the limit is zero. Is this right? If the first term goes to zero, why doesn't the second as well?
This question is not answered in the duplicate. Thank you for your help.
Klara