
Let $A$ be an $n\times n$ matrix and $\beta$ a constant. Consider the special inhomogeneous equation $\dot x = Ax + p(t)e^{\beta t},$ where $p(t)$ is a vector all of whose entries are polynomials. Set $\deg(p(t))=\max_{1\leq j\leq n}\deg(p_j(t))$. I have to show that this equation has a particular solution of the form $q(t)e^{\beta t}$, where $q(t)$ is a polynomial vector with $\deg(q(t))=\deg(p(t))$ if $\beta$ is not an eigenvalue of $A$, and $\deg(q(t))=\deg(p(t))+a$ if $\beta$ is an eigenvalue of algebraic multiplicity $a$.

I am given a hint: investigate $x(t)=\exp(tA)x_0+\int_0^t \exp((t-s)A)g(s)\,\mathrm{d}s$ using the following fact: $\int p(t)e^{\beta t}\,\mathrm{d}t=q(t)e^{\beta t}$, where $q(t)$ is a polynomial of degree $\deg(q)=\deg(p)$ if $\beta \neq 0$ and $\deg(q)=\deg(p)+1$ if $\beta=0$. Whew..
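As a quick sanity check of the hint's integration fact (my own illustration, not part of the question, with an arbitrarily chosen sample polynomial), sympy confirms both degree claims:

```python
# Sanity check (illustration only): the antiderivative of p(t) e^{beta t}
# is q(t) e^{beta t} with deg q = deg p when beta != 0, and deg q = deg p + 1
# when beta = 0.
import sympy as sp

t = sp.symbols('t')
p = t**2 + 3*t + 1          # sample polynomial, deg p = 2

# Case beta != 0: divide the antiderivative by e^{beta t} and read off the degree.
beta = 2
q = sp.simplify(sp.integrate(p * sp.exp(beta * t), t) / sp.exp(beta * t))
deg_nonzero = sp.degree(sp.expand(q), t)    # expect deg p = 2

# Case beta = 0: a plain antiderivative raises the degree by one.
deg_zero = sp.degree(sp.integrate(p, t), t) # expect deg p + 1 = 3
```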

2 Answers


If $x = y e^{\beta t}$, the differential equation becomes $\dot{y} = (A - \beta I) y + p(t)$. If $\beta$ is not an eigenvalue of $A$, i.e. $A - \beta I$ is invertible, then the linear map $L: y \mapsto \dot{y} - (A - \beta I) y$ is one-to-one (the top-degree coefficient of $Ly$ is $-(A - \beta I)$ applied to the top-degree coefficient of $y$, which is nonzero), and therefore invertible, on the finite-dimensional vector space $(P_d)^n$, where $P_d$ is the space of polynomials of degree $\le d$.
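To make the invertibility claim concrete, here is a small numerical sketch (my own, with an arbitrarily chosen $A$ and $\beta$): represent $L$ as a matrix acting on the stacked coefficient vectors of polynomial maps of degree $\le d$, and check that it is nonsingular when $\beta$ is not an eigenvalue.

```python
# Sketch (my illustration): L(y) = y' - (A - beta I) y as a matrix on (P_d)^n,
# using the basis blocks for the coefficients of t^0, t^1, ..., t^d.
import numpy as np

def L_matrix(A, beta, d):
    """Matrix of L on (P_d)^n; coefficient vector is (y_0, y_1, ..., y_d) stacked."""
    n = A.shape[0]
    B = A - beta * np.eye(n)
    N = n * (d + 1)
    M = np.zeros((N, N))
    for j in range(d + 1):
        rows = slice(j * n, (j + 1) * n)
        M[rows, rows] = -B                      # -(A - beta I) y on the t^j block
        if j < d:                               # derivative: (j+1) y_{j+1} lands in t^j
            M[rows, slice((j + 1) * n, (j + 2) * n)] = (j + 1) * np.eye(n)
    return M

A = np.array([[0.0, 1.0], [-2.0, -3.0]])    # eigenvalues -1 and -2
beta = 5.0                                   # not an eigenvalue
M = L_matrix(A, beta, d=3)
invertible = abs(np.linalg.det(M)) > 1e-9
```

The matrix is block upper triangular with every diagonal block equal to $-(A-\beta I)$, so its determinant is $\det(\beta I - A)^{d+1}$, nonzero exactly when $\beta$ is not an eigenvalue.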

If $\beta$ is an eigenvalue of both algebraic and geometric multiplicity $a$, there is an $a$-dimensional space $W$ of constant solutions to $L(y) = 0$, and these are all the polynomial solutions of that equation. Decompose $K^n$ (where $K$ is $\mathbb R$ or $\mathbb C$) as $U \oplus W$, where $U$ is invariant under $A$. Corresponding to this, we can decompose $(P_d)^n = U_d \oplus W_d$, where $U_d$ is the vector space spanned by $u t^j$ for $u \in U$ and $j = 0,1,\ldots, d$ and similarly for $W_d$. $L$ is one-to-one and thus invertible on $U_d$. On $W_d$, $L$ is just the derivative operator.
If $p$ has degree $d$, $Lq = p$ has a solution of degree either $d$ (if $p \in U_d$) or $d+1$ (if $p \notin U_d$).

It's a bit more complicated if the algebraic and geometric multiplicities differ. Consider a single Jordan block of size $a > 1$, so the algebraic multiplicity is $a$ but the geometric multiplicity is $1$. Then there are linearly independent vectors $v_1, \ldots, v_a$ with $(A - \beta I) v_1 = 0$ and $(A - \beta I) v_j = -v_{j-1}$ for $j > 1$. Now $L (t^k v_j) = k t^{k-1} v_j + t^k v_{j-1}$ (where we set $v_0 = 0$). Thus $L$ maps the linear span $M_m$ of the $t^k v_j$ with $k + j = m$ onto $M_{m-1}$: in fact
$$ L\left( \frac{t^{j+1}}{j+1}\, v_k - \frac{t^{j+2}}{(j+1)(j+2)}\, v_{k-1} + \frac{t^{j+3}}{(j+1)(j+2)(j+3)}\, v_{k-2} - \cdots + \frac{(-1)^{k-1}\, t^{j+k}}{(j+1)(j+2)\cdots(j+k)}\, v_1 \right) = t^j v_k. $$
If $p(t)$ has degree $d$, it is in the span of $M_1, \ldots, M_{d+a}$, so $p(t) = L q(t)$ where $q(t)$ is in the span of $M_1, \ldots, M_{d+a+1}$ and thus has degree at most $d+a$.
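The displayed telescoping identity can be checked symbolically. The following sketch (my own) encodes a vector polynomial $\sum_m q_m(t) v_m$ as its list of scalar coefficients $q_m$, and applies $L$ componentwise via $L\big(\sum_m q_m v_m\big) = \sum_m (q_m' + q_{m+1}) v_m$, which follows from $L(t^k v_j) = k t^{k-1} v_j + t^k v_{j-1}$:

```python
# Symbolic check (my sketch) that the alternating sum maps to t^j v_k under L.
import math
import sympy as sp

t = sp.symbols('t')
j, k = 2, 4                          # arbitrary sample indices

# coeff[m] = polynomial coefficient of v_m in the candidate preimage
coeff = [sp.Integer(0)] * (k + 2)    # index 0 unused; extra slot so coeff[k+1] = 0
for i in range(1, k + 1):
    denom = math.prod(range(j + 1, j + i + 1))          # (j+1)(j+2)...(j+i)
    coeff[k - i + 1] = sp.Rational((-1)**(i - 1), denom) * t**(j + i)

# coefficient of v_m in L(q) is q_m' + q_{m+1}
Lq = [sp.expand(sp.diff(coeff[m], t) + coeff[m + 1]) for m in range(k + 1)]
# expect Lq[k] = t^j and Lq[m] = 0 for 1 <= m < k
```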

The final result should be that if the largest Jordan block for eigenvalue $\beta$ has size $b$, for any $p$ there is a solution $q$ of $p(t) = L q(t)$ where $\deg(q) \le \deg(p) + b$.
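This final claim can be probed numerically (my own sketch, with an arbitrary block size, eigenvalue, and right-hand side): build the matrix of $L$ on polynomials of degree $\le \deg p + b$ for a single Jordan block of size $b$ with eigenvalue $\beta$, and check that $Lq = p$ is solvable even though $L$ is singular there.

```python
# Numerical probe (my sketch): for a Jordan block of size b with eigenvalue
# beta, any p of degree d should satisfy L q = p for some q of degree <= d + b.
import numpy as np

def L_matrix(A, beta, D):
    """Matrix of y -> y' - (A - beta I) y on stacked coefficients of t^0..t^D."""
    n = A.shape[0]
    B = A - beta * np.eye(n)
    N = n * (D + 1)
    M = np.zeros((N, N))
    for j in range(D + 1):
        rows = slice(j * n, (j + 1) * n)
        M[rows, rows] = -B
        if j < D:
            M[rows, slice((j + 1) * n, (j + 2) * n)] = (j + 1) * np.eye(n)
    return M

b, beta, d = 3, 2.0, 1
J = beta * np.eye(b) + np.diag(np.ones(b - 1), 1)   # single Jordan block, size b
D = d + b
M = L_matrix(J, beta, D)

rng = np.random.default_rng(0)
rhs = np.zeros(b * (D + 1))
rhs[: b * (d + 1)] = rng.standard_normal(b * (d + 1))  # random p of degree <= d

q, *_ = np.linalg.lstsq(M, rhs, rcond=None)
residual = np.linalg.norm(M @ q - rhs)   # ~0 iff p is in the range of L
```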


Something is wrong. If $\beta = 0$ and $A$ is the matrix with all entries $0,$ then the algebraic multiplicity of the eigenvalue $0$ is $n,$ where $A$ is $n$ by $n.$ What you have is then $\dot{q} = p,$ so that $\deg q = 1 + \deg p.$ Same conclusion for $\beta \neq 0$ and $A = \beta I.$
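The counterexample is easy to verify symbolically (my illustration, with an arbitrary sample polynomial):

```python
# Checking the counterexample: with A = 0 and beta = 0, the equation for the
# ansatz reduces to q' = p, so deg q = deg p + 1 in each component.
import sympy as sp

t = sp.symbols('t')
p = 5*t**3 - t + 2              # any degree-3 polynomial entry
q = sp.integrate(p, t)          # a particular solution of q' = p
deg_p = sp.degree(p, t)         # expect 3
deg_q = sp.degree(q, t)         # expect 4, not 3 + n
```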

Where did you get this, exactly? It is a bit elaborate for typical homework. What book are you using?

EDIT: Klara, this is a very recent book. If you look on the page before this exercise, he says "However, if the inhomogeneous term is of a special form, an ansatz might be faster than evaluating the integral in (3.48). See Problem (3.13)."

So, I would say the author changed his mind a bit. The hint you are given does not seem to me to help. Now, I think everything is true when $\beta$ is not an eigenvalue, and everything except the exact degree is true when $\beta$ is an eigenvalue. What I am going to do is rewrite both $p$ and $q$ in a certain way. Both are vectors of polynomials, that is, $ p(t) = \sum_{i=0}^{\deg p} \; t^i P_i, $ where each $P_i$ is a column vector of scalars in either $\mathbb R$ or $\mathbb C$; the field is not going to matter. Same with $ q(t) = \sum_{j=0}^{\deg q} \; t^j Q_j. $ The point is that multiplication by the matrix $A$ commutes with multiplication by the indeterminate $t$ or any $t^k.$ So we may ask ourselves what $A$ does to each $Q_j.$
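To illustrate where this coefficient-matching leads (my own sketch, not the author's intended continuation): substituting the expansions into $\dot q = (A-\beta I)q + p$ and comparing the coefficients of $t^j$ gives $(j+1)Q_{j+1} = (A-\beta I)Q_j + P_j$, and when $A - \beta I$ is invertible this recursion can be solved from the top degree down, with an arbitrarily chosen example:

```python
# Sketch (assuming beta is not an eigenvalue): solve the coefficient recursion
# (j+1) Q_{j+1} = B Q_j + P_j with B = A - beta I, from the top down:
# Q_d = -B^{-1} P_d, then Q_j = B^{-1}((j+1) Q_{j+1} - P_j).
import numpy as np

def particular_coeffs(A, beta, P):
    """P = [P_0, ..., P_d], coefficient vectors of p(t); returns [Q_0, ..., Q_d]."""
    n = A.shape[0]
    B = A - beta * np.eye(n)
    d = len(P) - 1
    Q = [np.zeros(n) for _ in range(d + 1)]
    Q[d] = -np.linalg.solve(B, P[d])            # top degree: 0 = B Q_d + P_d
    for j in range(d - 1, -1, -1):
        Q[j] = np.linalg.solve(B, (j + 1) * Q[j + 1] - P[j])
    return Q

A = np.array([[0.0, 1.0], [-2.0, -3.0]])            # eigenvalues -1, -2
beta = 1.0                                          # not an eigenvalue
P = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]    # p(t) = P_0 + P_1 t
Q = particular_coeffs(A, beta, P)

# verify q' = (A - beta I) q + p coefficientwise
B = A - beta * np.eye(2)
err0 = np.linalg.norm(Q[1] - (B @ Q[0] + P[0]))     # t^0 coefficients
err1 = np.linalg.norm(B @ Q[1] + P[1])              # t^1 coefficients
```

Note that $\deg q = \deg p$ here, matching the claimed result for the non-eigenvalue case.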

Anyway, I will finish this up and then probably write to Gerald Teschl. It strikes me as unlikely that he wants solutions to his questions on the web, but he might be grateful to hear of errata. If it turns out his question is entirely correct, which I do not currently expect, I will think about what to do.

  • @RobertIsrael, thanks. I'll send that along to Gerald. (2012-11-28)