
I have a spectroscopy problem that boils down to a matrix equation $XA = C$. I take $N$ observations, each consisting of 3 detector readings, and my detectors suffer from some amount of cross-talk (some percentage of the signal from detector 1 spills over into detector 2, etc.). In my specific case, $C$ is an $N \times 3$ matrix of detector readings, $X$ is an $N \times 2$ matrix of my unknown true signals, and $A$ is a $2 \times 3$ matrix of constant coefficients that represents how much of each signal source gets into each detector. So I have:

$\begin{bmatrix} X_{11} & X_{12} \\ \vdots & \vdots \\ X_{N1} & X_{N2} \end{bmatrix} \begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23}\end{bmatrix} = \begin{bmatrix} C_{11} & C_{12} & C_{13} \\ \vdots & \vdots & \vdots \\ C_{N1} & C_{N2} & C_{N3}\end{bmatrix}$

This is a system of $3N$ linear equations in $2N + 6$ unknowns. When $N = 6$, the system should be exactly determined, and in real-world experiments with noise, taking $N > 6$ should let me begin to compensate for the noise. In practice I will take hundreds of observations, so $N$ will usually be 200 to 1000. Without noise, this would of course be an overconstrained but consistent problem: one column of $C$ would have to be an exact linear combination of the other two. With every element of $C$ carrying a zero-mean Gaussian noise contribution, however, the system is no longer exactly consistent, and one can only find best estimates.
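For concreteness, here is a minimal NumPy sketch of the measurement model as I have described it (the signal and cross-talk values below are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500                                         # number of observations
X_true = rng.uniform(0.0, 10.0, size=(N, 2))    # unknown true signals (made-up values)
A_true = np.array([[0.90, 0.08, 0.02],          # hypothetical cross-talk coefficients:
                   [0.05, 0.85, 0.10]])         # row i = how signal i spreads over the 3 detectors

noise = rng.normal(0.0, 0.05, size=(N, 3))      # zero-mean Gaussian detector noise
C = X_true @ A_true + noise                     # N x 3 matrix of detector readings

# Goal: recover estimates of X_true (N x 2) and A_true (2 x 3) from C alone.
```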

Intuitively, this should be solvable for the 6 elements of $A$ and the $2N$ elements of $X$, but I cannot find a treatment for this formulation of the problem. I have been searching for linear-algebra solution approaches that handle two matrices of unknowns as described, but I haven't found anything appropriate yet. I can rearrange the matrix equation to $X = C A^{-1}$ (where $A^{-1}$ would have to be some kind of one-sided or pseudo-inverse, since $A$ is not square), but I haven't seen a treatment for that construction either.

Any suggestions or insights into solving this? Thanks in advance.

  • A non-square matrix can sometimes have a one-sided inverse based on e.g. $AA^T(AA^T)^{-1} = I$; a small illustration follows below. (2012-04-23)
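A small NumPy illustration of such a one-sided (right) inverse, using a made-up $2 \times 3$ matrix:

```python
import numpy as np

A = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.85, 0.10]])              # hypothetical 2x3 matrix with independent rows

A_right_inv = A.T @ np.linalg.inv(A @ A.T)      # 3x2 right inverse: A @ A_right_inv = I_2
print(np.allclose(A @ A_right_inv, np.eye(2)))  # True

# If A were known, X could then be estimated from the readings as X = C @ A_right_inv.
```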

1 Answer


If the $N \times 3$ matrix $C$ can be represented as $XA$ where $X$ is $N \times 2$ and $A$ is $2 \times 3$, $C$ must have rank at most $2$. Thus one column of $C$ must be a linear combination of the other two. Do row-reduction on $C^T$ to find what linear combination it is. If, say, $C_{i3} = a C_{i1} + b C_{i2}$, that means that $C = X A$ where $X$ consists of the first two columns of $C$ and $A = \pmatrix{1 & 0 & a\cr 0 & 1 & b\cr}$.
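A rough NumPy sketch of this recipe, using a least-squares fit of the third column against the first two in place of explicit row reduction (the two are equivalent when $C$ has exact rank $2$):

```python
import numpy as np

def factor_from_columns(C):
    """Factor C = X A by taking X to be the first two columns of C,
    assuming column 3 is (approximately) a*column 1 + b*column 2."""
    ab, *_ = np.linalg.lstsq(C[:, :2], C[:, 2], rcond=None)  # fit C[:,2] ~ a*C[:,0] + b*C[:,1]
    a, b = ab
    X = C[:, :2]
    A = np.array([[1.0, 0.0, a],
                  [0.0, 1.0, b]])
    return X, A
```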

The solution is never unique: given one solution, you can always replace $X$ by $XU$ and $A$ by $U^{-1} A$ where $U$ is any invertible $2 \times 2$ matrix.
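A quick numerical check of that non-uniqueness, with an arbitrary invertible $U$:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 2))
A = rng.normal(size=(2, 3))

U = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # any invertible 2x2 matrix
X2, A2 = X @ U, np.linalg.inv(U) @ A

print(np.allclose(X @ A, X2 @ A2))    # True: both factorizations give the same product
```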

  • You can use a Singular Value Decomposition to get a good rank-$2$ approximation to $C$; a rough sketch follows below. (2012-04-23)
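A minimal NumPy sketch of that SVD approach; splitting the rank-$2$ approximation into these two particular factors is only one of many possible choices, per the non-uniqueness noted in the answer:

```python
import numpy as np

def rank2_factorization(C):
    """Best rank-2 approximation of C (in the least-squares sense) via the SVD,
    split into an N x 2 factor X_hat and a 2 x 3 factor A_hat."""
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    X_hat = U[:, :2] * s[:2]    # N x 2: left singular vectors scaled by the top two singular values
    A_hat = Vt[:2, :]           # 2 x 3
    return X_hat, A_hat         # X_hat @ A_hat is the closest rank-2 matrix to C
```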