Can someone explain how I can find the matrix $A$ from the matrix equation $$A^+BA=C,$$ where $B$ and $C$ are known square matrices, $A^+$ denotes the Hermitian conjugate, and we are given the constraint $\det(A)=a$, where $a$ is a known constant?
Matrix equation with a constraint
-
0I think we need some information on $B$ and $C$. The only thing I can say is that each eigenvalue $c_k$ of $C$ should match one eigenvalue $b_m$ of $B$ times $a^2$. – 2012-02-16
-
0Using $(XY)^+ = Y^+ X^+$ and $(X^+)^+ = X$, we have $BA = (A^+)^{-1} C$, or $(A^+)^{-1} = BAC^{-1}$. Also $A^+ B = CA^{-1}$, and, conjugating the original equation, $B^+ A = (A^+)^{-1} C^+$. Substituting the first relation into the last gives $B^+ A = BAC^{-1}C^+$, hence $$A = B^{-1} B^{+} A (C^{+})^{-1} C.$$ Let $E = B^{-1} B^{+}$ and $D = (C^{+})^{-1} C$ (you can compute both $E,D$); we are left with solving $$A = E A D$$ where both $E,D$ are known. Not sure how to solve it, though. – 2012-02-16
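As a quick numerical sanity check of this identity (my own addition, not from the comment): a minimal sketch using NumPy, with random complex matrices, which are almost surely invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a consistent instance: pick (almost surely invertible) A, B
# and define C := A^+ B A, so the original equation holds by construction.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = A.conj().T @ B @ A

# Check the derived identity A = B^{-1} B^+ A (C^+)^{-1} C.
rhs = np.linalg.inv(B) @ B.conj().T @ A @ np.linalg.inv(C.conj().T) @ C
print(np.allclose(A, rhs))  # True, up to round-off
```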
-
0@J.D.: this doesn't take the constraint into account. – 2012-02-16
-
0@J.D. you are also assuming that $A,B,C$ are nonsingular, which is not an assumption in the original question. – 2012-02-16
-
0At the level of generality at which the question is asked, I think it might be impossible to give an answer. For instance, if $B=C=0$, then any matrix $A$ with $\det(A)=a$ is a solution. Other choices of $B,C$ will yield many different choices for $A$ (including none if $\det(C)\ne\det(B)\,|\det(A)|^2$). – 2012-02-16
2 Answers
Using J.D.'s result, I think it's possible to solve it via the superoperator formalism: $$ A=EAD \;\Rightarrow\; {\rm vec}\,A = (D^T\otimes E)\,{\rm vec}\,A, $$ where ${\rm vec}\,A$ is the vector obtained by stacking the columns of $A$. So ${\rm vec}\,A$ is an eigenvector of $D^T\otimes E$ with eigenvalue $+1$.
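For anyone who wants to verify the vec identity numerically, here is a small sketch (my own addition, assuming NumPy and the column-stacking convention for ${\rm vec}$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
E = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
D = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

vec = lambda X: X.reshape(-1, order="F")  # column-stacking "vec"

# vec(E A D) = (D^T ⊗ E) vec(A); note D^T is a plain transpose, not D^+.
print(np.allclose(vec(E @ A @ D), np.kron(D.T, E) @ vec(A)))  # True
```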
-
0Do you have a reference for this "superoperator" method? I'd like to read more about it. – 2012-02-16
-
0@J.D. I hate to say that I once had a very nice paper on this, which is buried somewhere among my sheets. If I ever find it, I'll leave a comment. It is used in relation to quantum dynamics, in my case of spin systems (some time ago). – 2012-02-16
-
2This is just standard usage of the [Kronecker product](http://en.wikipedia.org/wiki/Kronecker_product) (scroll down to the section named _Matrix equations_.) – 2012-02-16
-
1@Calle Thanks; Horn & Johnson should be the place to look for more. – 2012-02-16
First, J.D.'s result $$A = B^{-1} B^{+} A (C^{+})^{-1} C$$ is valid for both singular and non-singular matrices if you use the Moore-Penrose inverse in place of the regular inverse.
Then, as suggested by draks, the Kronecker/vec formalism can be employed to find an eigenvector $\vec{a} = {\rm vec}\,A$ of $D^T\otimes E$ associated with the eigenvalue $+1$ (if such an eigenvalue exists).
Finally, un-stacking the columns of $\vec{a}$ yields $A$, which now only needs to be multiplied by an appropriate scalar to satisfy the constraint on $\det(A)$.
As noted by Martin, the value of this constraint must itself satisfy a consistency condition: $|\det(A)| = \sqrt{\det(C)/\det(B)}$.
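Below is a hedged end-to-end sketch of this recipe in NumPy (the function name `solve_for_A`, the tolerances, and the norm-ratio scale estimate are my own choices, not part of the answers). It builds $E$ and $D$ with Moore-Penrose pseudo-inverses, extracts a $+1$ eigenvector of $D^T\otimes E$, un-stacks it, and then uses the remaining complex scale freedom of the eigenvector to aim for $\det(A)=a$; since $A=EAD$ is only a necessary condition, the original equation is re-checked at the end.

```python
import numpy as np

def solve_for_A(B, C, a, tol=1e-8):
    """Sketch: try to recover A with A^+ B A = C and det(A) = a."""
    n = B.shape[0]

    # E = pinv(B) B^+ and D = pinv(C^+) C, using Moore-Penrose inverses.
    E = np.linalg.pinv(B) @ B.conj().T
    D = np.linalg.pinv(C.conj().T) @ C

    # vec(E A D) = (D^T ⊗ E) vec(A): look for an eigenvalue close to +1.
    w, V = np.linalg.eig(np.kron(D.T, E))
    idx = np.argmin(np.abs(w - 1.0))
    if abs(w[idx] - 1.0) > tol:
        return None                               # necessary condition fails
    A = V[:, idx].reshape((n, n), order="F")      # un-stack the columns

    # The eigenvector is only determined up to a complex scale s.
    # Its modulus is fixed by A^+ B A = C (both sides scale like |s|^2);
    # here |s|^2 is estimated from a ratio of Frobenius norms.
    lhs_norm = np.linalg.norm(A.conj().T @ B @ A)
    if lhs_norm > 0:
        A = np.sqrt(np.linalg.norm(C) / lhs_norm) * A

    # A unit-modulus factor leaves A^+ B A unchanged but rotates det(A),
    # so use it to aim at det(A) = a when that is consistent in modulus.
    d = np.linalg.det(A)
    if d != 0:
        phase = (a / d) ** (1.0 / n)              # principal n-th root
        if np.isclose(abs(phase), 1.0, atol=1e-6):
            A = phase * A

    # A = E A D is only necessary, so verify the original equation before use.
    return A if np.allclose(A.conj().T @ B @ A, C, atol=1e-6) else None
```

For a quick test one can generate a random invertible $A_0$, set $C = A_0^+ B A_0$ and $a = \det(A_0)$, and compare the output with $A_0$, keeping in mind that solutions are generally not unique. Also note that when the $+1$ eigenspace of $D^T\otimes E$ has dimension greater than one, a single eigenvector may not satisfy the original equation, so this sketch can miss solutions that a search over the whole eigenspace would find.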