
In state-space (SS) theory, the concept of algebraic equivalence means that, for a matrix $A\in\mathbb{R}^{n\times n}$, a column vector $B\in \mathbb{R}^n$ and a row vector $C\in \mathbb{R}^{1\times n}$, there is an invertible matrix $T$ such that \begin{align} \bar{A}&=TAT^{-1}\\ \bar{B}&=TB \tag{1}\label{eqtrue}\\ \bar{C}&=CT^{-1} \end{align} A method for finding such a $T$ is provided as follows:

Solve $\begin{bmatrix}\bar{B} & \bar{A}\bar{B} & \dots & \bar{A}^{n-1}\bar{B}\end{bmatrix} = T \begin{bmatrix}B & AB & \dots & A^{n-1}B\end{bmatrix}$ for $T$ (under the assumption that $\begin{bmatrix}B & AB & \dots & A^{n-1}B\end{bmatrix}$ and $\begin{bmatrix}\bar{B} & \bar{A}\bar{B} & \dots & \bar{A}^{n-1}\bar{B}\end{bmatrix}$ are both full rank). Verify the above relations using the $T$ obtained; that is, check that one system can be transformed into the other using $T$.
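To make the method above concrete, here is a minimal NumPy sketch. The helper name `ctrb` and the randomly generated test system are my own; the idea is just to build both controllability matrices and solve for $T$, then verify \eqref{eqtrue}.

```python
import numpy as np

# Hypothetical example: generate a random system (A, B, C) and a random
# invertible T_true, transform the system, then recover T from the
# controllability matrices alone.
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

T_true = rng.standard_normal((n, n))  # almost surely invertible
A_bar = T_true @ A @ np.linalg.inv(T_true)
B_bar = T_true @ B
C_bar = C @ np.linalg.inv(T_true)

def ctrb(A, B):
    """Controllability matrix [B, AB, ..., A^{n-1} B]."""
    n = A.shape[0]
    return np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])

# Solve [B_bar, A_bar B_bar, ...] = T [B, AB, ...] for T
# (both controllability matrices assumed full rank).
T = ctrb(A_bar, B_bar) @ np.linalg.inv(ctrb(A, B))

# Verify the recovered T reproduces the transformed system.
assert np.allclose(T @ A @ np.linalg.inv(T), A_bar)
assert np.allclose(T @ B, B_bar)
assert np.allclose(C @ np.linalg.inv(T), C_bar)
```

The recovery works because $\bar{A}^k\bar{B} = TA^kT^{-1}TB = TA^kB$, so the two controllability matrices differ exactly by a left factor of $T$.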


However, seeing as the next chapter discussed similarity transforms to a Jordan form, I had an itching feeling that maybe this could also be done using eigenvalues. I conjectured that maybe the following also implies algebraic equivalence:

\begin{equation} \mathrm{eig}\left(\left[\begin{array}{c|c} A & B\\\hline C & 0 \end{array} \right]\right) = \mathrm{eig}\left(\left[\begin{array}{c|c} \bar{A} & \bar{B}\\\hline \bar{C} & 0 \end{array} \right]\right)\tag{2}\label{eq1} \end{equation}

I.e., the set of eigenvalues of the square matrix formed by arranging $A,B,C$ in blocks (let's call it $SS$) equals the set of eigenvalues of the matrix composed of $\bar{A},\bar{B},\bar{C}$ (let's call that one $\bar{SS}$). For the examples I could find, this held up, but I fear that this may just be due to how these examples are 'cooked up'; and indeed, if this method worked, I would expect that we would be taught it. I would love to find out how I could (dis)prove this, but I'm not sure how. I'm thinking along the lines of the following:
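As a sanity check of the easy direction only (algebraic equivalence implies matching eigenvalues of the partitioned matrices, not the conjectured converse), here is a small NumPy sketch; the helper name `ss_matrix` and the random system are my own.

```python
import numpy as np

# Numerical check (not a proof): for algebraically equivalent systems,
# the partitioned matrices [[A, B], [C, 0]] are similar via
# T* = diag(T, 1), so their eigenvalues coincide.
rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

T = rng.standard_normal((n, n))  # almost surely invertible
A_bar = T @ A @ np.linalg.inv(T)
B_bar = T @ B
C_bar = C @ np.linalg.inv(T)

def ss_matrix(A, B, C):
    """Partitioned matrix [[A, B], [C, 0]]."""
    return np.block([[A, B], [C, np.zeros((1, 1))]])

eig = np.sort_complex(np.linalg.eigvals(ss_matrix(A, B, C)))
eig_bar = np.sort_complex(np.linalg.eigvals(ss_matrix(A_bar, B_bar, C_bar)))
assert np.allclose(eig, eig_bar)
```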

Construct a matrix $T^*$ as follows: \begin{align}T^*&=\left[\begin{array}{c|c} T & 0\\\hline 0 & 1 \end{array} \right]& \to (T^*)^{-1}&=\left[\begin{array}{c|c} T^{-1} & 0\\\hline 0 & 1 \end{array} \right]\end{align}

Then, we get \begin{equation} T^*\left[\begin{array}{c|c} A & B\\\hline C & 0 \end{array} \right](T^*)^{-1} = \left[\begin{array}{c|c} TAT^{-1} & TB\\\hline CT^{-1} & 0 \end{array} \right] \overset{?}= \left[\begin{array}{c|c} \bar{A} & \bar{B}\\\hline \bar{C} & 0 \end{array} \right] \end{equation}

Now, we need to see if there is a $T^*$ such that the above holds. From here on I'm speculating: I figured that there would be such a $T^*$ iff both matrices $SS$ and $\bar{SS}$ could be brought into the same diagonal form, so in other words, if \eqref{eq1} holds.

Furthermore, the transformation matrix should have the same structure as $T^*$. So, let $M$ be the matrix bringing $SS$ into diagonal form, and $\bar{M}$ the matrix bringing $\bar{SS}$ into diagonal form; then $M\bar{M}^{-1}$ should have zeros in the last row and column, except for a $1$ in the bottom-right corner. But here I'm stuck; how could I prove such a thing? Or do I even need to prove the same structure, or is condition \eqref{eq1} already sufficient, so that the desired structure follows naturally?

How can I (dis)prove that the above relation \eqref{eq1} implies algebraic equivalence as in \eqref{eqtrue}?

1 Answer


To address your main point: no, it is not enough for the eigenvalues to match. In particular, the system with $$ A = \pmatrix{0&1\\0&0}, \quad B^T = C = \pmatrix{0&0} $$ produces a partitioned matrix whose eigenvalues are all $0$. However, we would get exactly the same eigenvalues with $A = 0$, even though $\pmatrix{0&1\\0&0}$ and the zero matrix are not similar, so the two systems cannot be algebraically equivalent.
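The counterexample above can be checked numerically; this sketch just confirms that both partitioned matrices have all-zero eigenvalues while the two $A$ matrices cannot be similar (different ranks, hence different Jordan forms).

```python
import numpy as np

# The counterexample: A1 = [[0,1],[0,0]] vs. A2 = 0, with B = C = 0.
A1 = np.array([[0., 1.], [0., 0.]])
A2 = np.zeros((2, 2))
B = np.zeros((2, 1))
C = np.zeros((1, 2))

M1 = np.block([[A1, B], [C, np.zeros((1, 1))]])
M2 = np.block([[A2, B], [C, np.zeros((1, 1))]])

# Both partitioned matrices have all eigenvalues equal to 0...
assert np.allclose(np.linalg.eigvals(M1), 0)
assert np.allclose(np.linalg.eigvals(M2), 0)

# ...yet A1 and A2 are not similar: similarity preserves rank,
# and rank(A1) = 1 while rank(A2) = 0.
assert np.linalg.matrix_rank(A1) != np.linalg.matrix_rank(A2)
```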


However, I have a feeling that it is indeed enough to check whether the two partitioned matrices $M = \left[ \begin{smallmatrix} A&B\\C&0 \end{smallmatrix}\right]$ and $\bar M = \left[ \begin{smallmatrix} \bar A&\bar B\\\bar C&0 \end{smallmatrix}\right]$ are similar. Also, it's interesting to note that the lower-right entry could be anything (as long as we're consistent), such as $1$ instead of $0$.


Observation: if we take $T_0 = [B \quad AB \quad \cdots \quad A^{n-1}B]$, then we find that $$ \pmatrix{T_0\\&1}^{-1} \pmatrix{A&B\\C&0} \pmatrix{T_0\\&1} = \pmatrix{\bar A & \bar B\\ \bar C & 0} $$ where $\bar A$ is the transpose of that from the associated controllable canonical form, and $\bar B = (1,0,\dots,0)^T$.
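The observation can be verified numerically; in this sketch (random test system and variable names my own), conjugating the partitioned matrix by $\operatorname{diag}(T_0, 1)$ indeed sends $B$ to $(1,0,\dots,0)^T$ and $A$ to a companion-type matrix with ones on the subdiagonal.

```python
import numpy as np

# Verify: with T0 = [B, AB, ..., A^{n-1} B], conjugating [[A,B],[C,0]]
# by diag(T0, 1) yields B_bar = e1 and A_bar with ones on the
# subdiagonal (characteristic-polynomial coefficients in the last column).
rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

T0 = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
S = np.block([[T0, np.zeros((n, 1))],
              [np.zeros((1, n)), np.ones((1, 1))]])
M = np.block([[A, B], [C, np.zeros((1, 1))]])

M_bar = np.linalg.inv(S) @ M @ S
A_bar, B_bar = M_bar[:n, :n], M_bar[:n, n:]

assert np.allclose(B_bar, np.eye(n, 1))          # B_bar = (1, 0, ..., 0)^T
assert np.allclose(A_bar[1:, :-1], np.eye(n - 1))  # ones on the subdiagonal
assert np.allclose(A_bar[0, :-1], 0)             # zeros elsewhere (bar last column)
```

This works because $T_0^{-1}A^kB = e_{k+1}$ for $k < n$: each column of $T_0$ is mapped by $A$ to the next one, and $B$ itself is the first column of $T_0$.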

  • 1
    I realize now I forgot to mention that, for such a $T$ to be defined uniquely, $\begin{bmatrix}B & AB & \dots & A^{n-1}B\end{bmatrix}$ should be full rank. Under that assumption, do you agree with my hunch that similarity of the partitioned matrices should be enough? – 2017-01-24
  • 0
    However, thank you for your insights so far. Indeed, $D$ (the lower-right entry) should be allowed to take any value (as long as it equals $\bar{D}$), so it's very nice to see that that part holds up as well. – 2017-01-24
  • 0
    I'll take a look at this again this evening, when I get the chance. That's a very interesting assumption to make, and it should make it so that eigenvalues are sufficient after all, assuming the similarity of the partitioned matrices turns out to work as we think. – 2017-01-24
  • 0
    The existence of such a vector $B$ tells us that the Jordan blocks in the Jordan form of $A$ each have the maximum possible size (i.e. one block per eigenvalue). – 2017-01-24
  • 1
    Note to self: we can take $\bar A$ to be a companion matrix as an intermediate step. – 2017-01-24
  • 0
    Re. note to self: I've never heard of such a matrix, but it looks eerily similar to the controllability and observability canonical forms taught in this course, which should indeed exist given the condition in my first comment (in addition, $\begin{bmatrix}C & CA & \dots & CA^{n-1}\end{bmatrix}$ is full rank, if it helps). – 2017-01-24
  • 0
    See my latest edit. – 2017-01-24