
I'm trying to show:

Let $C(t)=C_rt^r+C_{r-1}t^{r-1}+\cdots+C_1t+C_0\in \mathcal{M}_n(\mathbb{F}[t])$ be a polynomial with coefficients $C_i$ in $\mathcal{M}_n(\mathbb{F})$. Show that there exists a matrix polynomial $Q(t)$ such that $$C(t)=Q(t)(tI_n-A)+C(A).$$ That is, the remainder of the division 'on the right' of $C(t)$ by $(tI_n-A)$ is the matrix $C(A)=C_rA^r+C_{r-1}A^{r-1}+\cdots+C_1A+C_0$.

Actually, I'm a little confused with this exercise.

It says nothing about $(tI_n-A)$.

Thanks for your help.

  • Hint: Compute $C(t)-C(A)$ using identities like $t^kI-A^k=(tI-A)\ldots$ (comment, 2012-04-08)
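(For reference, the identity the hint gestures at is the usual telescoping factorization; since only powers of $A$ and the scalar indeterminate $t$ occur, both orders of the factors agree:

$$t^kI-A^k=\bigl(t^{k-1}I+t^{k-2}A+\cdots+A^{k-1}\bigr)(tI-A)=(tI-A)\bigl(t^{k-1}I+t^{k-2}A+\cdots+A^{k-1}\bigr).)$$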

1 Answer


You can do this by thinking of matrices with polynomial entries as polynomials with matrices as coefficients (your expression for $C(t)$ already suggests this point of view): take the coefficient of $t^i$ to be the matrix of the coefficients of $t^i$ of your polynomial entries. Be careful: these are polynomials with non-commutative coefficients, and not everything that works for commutative polynomials works for them. Notably, one cannot just evaluate them at a matrix, so your expression $C(A)$ does not make sense a priori. For now I'll call the remainder $R$ instead and just assume it is some constant matrix; I'll show later that it is given by your expression. Euclidean right-division by a monic polynomial such as $I_nt-A$ turns out to work as usual (here monic means the leading coefficient is the identity matrix $I_n$). (Left-division works too, but the two answers may differ.)
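To see concretely why naive evaluation fails, here is a tiny numerical check (my own illustrative example, not from the answer): substituting $A$ on the right of each coefficient generally disagrees with substituting it on the left.

```python
import numpy as np

# Illustrative example (mine, not from the post): C(t) = C1*t + C0.
A  = np.array([[0., 1.], [0., 0.]])
C1 = np.array([[1., 0.], [1., 0.]])
C0 = np.zeros((2, 2))

right_eval = C1 @ A + C0  # put A to the right of each coefficient
left_eval  = A @ C1 + C0  # put A to the left instead

# The two "values at A" disagree, so "C(A)" needs a convention.
assert not np.allclose(right_eval, left_eval)
```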

Here's how it works in detail. Since $R$ is a constant (no $t$ is present), the leading term $C_rt^r$ (I'm assuming $r>0$ here) can only come from the product $Q(t)(I_nt-A)$; this forces $Q(t)$ to have leading term $C_rt^{r-1}$. Denoting by $Q_1(t)$ the remaining terms of $Q(t)$, it is easy to see they have to satisfy $ C(t)-C_rt^r+C_rAt^{r-1}=Q_1(t)(I_nt-A)+R $ (note that I rewrote $C_rt^{r-1}A$ as $C_rAt^{r-1}$; this commutation of powers of $t$ with constant matrices is obvious when interpreting these as matrices with polynomial entries).
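Written out, this first step just subtracts the contribution of the leading term of $Q(t)$ from both sides:

$$C(t)-C_rt^{r-1}(I_nt-A)=C(t)-C_rt^r+C_rAt^{r-1}.$$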

The left hand side is a polynomial of degree at most $r-1$, so we may assume by induction on $r$ that we know how to handle the Euclidean division for it. The base case of the induction, which I skipped over, is easy: if $C(t)$ is constant, then obviously $Q(t)=0$ and $R=C(t)$ (which we may write as $C(A)$ if we like, since $C(t)$ does not contain $t$ anyway). So if we put $C_1(t)=C(t)-C_rt^r+C_rAt^{r-1}$ and assume by induction that there are unique $Q_1(t)$ and constant $R$ such that $C_1(t)=Q_1(t)(I_nt-A)+R$, then we have shown that there also exists a unique such pair for $C(t)$, namely $Q(t)=C_rt^{r-1}+Q_1(t)$ and the same $R$ as for $C_1(t)$.
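Unwinding the induction gives an explicit recursion for the quotient coefficients: $Q_{r-1}=C_r$, then $Q_{j-1}=C_j+Q_jA$, and finally $R=C_0+Q_0A$. Here is a numerical sketch checking this on random data (the names and the random example are my own, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 3, 4
A = rng.integers(-3, 4, size=(n, n)).astype(float)
C = [rng.integers(-3, 4, size=(n, n)).astype(float) for _ in range(r + 1)]  # C[i] = C_i

# Recursion from unwinding the induction: Q[k] is the coefficient of t^k in Q(t).
Q = [None] * r
Q[r - 1] = C[r]
for j in range(r - 1, 0, -1):
    Q[j - 1] = C[j] + Q[j] @ A
R = C[0] + Q[0] @ A

# Check C(t) = Q(t)(I_n t - A) + R coefficient by coefficient.
P = [np.zeros((n, n)) for _ in range(r + 1)]  # coefficients of Q(t)(I_n t - A)
for k in range(r):
    P[k + 1] += Q[k]   # Q_k t^k times I_n t
    P[k] -= Q[k] @ A   # Q_k t^k times (-A)
P[0] += R
assert all(np.allclose(C[i], P[i]) for i in range(r + 1))
```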

All that remains is showing that $R=C(A)$ as given by the expression in the question (which is called the right-evaluation of $C(t)$ at $A$). We've seen that this is trivially true if $C(t)$ is constant, so we may assume it holds for $C_1(t)$, in other words $C_1(t)=Q_1(t)(I_nt-A)+C_1(A)$. Now $C_1(t)$ was obtained from $C(t)$ by removing the leading term $C_rt^r$ and adding the term $C_rAt^{r-1}$ in its place; the contribution of those two terms to the right-evaluation is the same, since $C_rA^r=(C_rA)A^{r-1}$, and therefore $C(A)=C_1(A)$. Thus one finds $R=C(A)$ as promised.
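This last argument can also be checked numerically: repeating the leading-term replacement $C_kt^k\mapsto(C_kA)t^{k-1}$ until only a constant remains reproduces the right-evaluation $\sum_i C_iA^i$. A self-contained sketch on random data (example mine, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 3, 4
A = rng.integers(-3, 4, size=(n, n)).astype(float)
C = [rng.integers(-3, 4, size=(n, n)).astype(float) for _ in range(r + 1)]  # C[i] = C_i

# Peel off leading terms as in the induction: C_k t^k becomes (C_k A) t^{k-1}.
coeffs = [M.copy() for M in C]
for k in range(r, 0, -1):
    coeffs[k - 1] += coeffs[k] @ A
R = coeffs[0]  # the constant left at the end is the remainder

# Right-evaluation C(A) = sum_i C_i A^i (coefficients to the left of the powers).
CA = sum(C[i] @ np.linalg.matrix_power(A, i) for i in range(r + 1))
assert np.allclose(R, CA)
```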