
Suppose that I have two real-valued matrices $\bf{A}$ and $\bf{B}$ of exactly the same size. I multiply them pointwise (elementwise), as in the Matlab operation A .* B.
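(For concreteness, here is the operation in question written in NumPy, where `*` on arrays is the elementwise equivalent of Matlab's `.*`:)

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# Elementwise (Hadamard) product -- the NumPy equivalent of Matlab's A .* B
C = A * B  # [[5., 12.], [21., 32.]]
```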

Under what conditions can I approximately separate $\bf{A}$ and $\bf{B}$ using Principal Component Analysis (PCA)? Would it be possible to remove some components of the product A .* B to get an approximation of $\bf{A}$ or $\bf{B}$?

What algorithm might be best suited for this operation?

I am not looking for an exact separation of the matrices, but a separation using some sort of (statistical or numerical?) constraints. How would I set this problem up, and is there a good example of how to do this?

2 Answers


It seems to me that you can't separate the matrices from their pointwise multiplication (Hadamard/Schur product) without additional constraints.

Consider some matrix C. Every entry of C can be written as a product of two real numbers in infinitely many ways, which gives you infinitely many "perfect" decompositions of C.

For example, you can always decompose C into 1 (a matrix of ones) and C itself. In fact, for any choice of A with nonzero entries you can find a B such that their Schur product equals any specified C (with some care needed wherever zeros appear)...
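A minimal sketch of this non-uniqueness in NumPy: pick any entrywise-nonzero A, set B = C / A, and both decompositions reproduce C exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
C = rng.standard_normal((4, 4))  # any target matrix

# Decomposition 1: the trivial one, A = ones, B = C.
A1, B1 = np.ones_like(C), C

# Decomposition 2: any entrywise-nonzero A works; just set B = C / A.
A2 = rng.uniform(0.5, 2.0, size=C.shape)  # arbitrary, nonzero entries
B2 = C / A2

# Both are "perfect" decompositions of the same C.
assert np.allclose(A1 * B1, C)
assert np.allclose(A2 * B2, C)
```

So without extra constraints (non-negativity, smoothness, low rank, ...) there is nothing to single out the "true" A and B.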

  • Sure, that sounds good - I will look to see which constraints I can use and open another question. (2012-11-11)

If the entries are non-negative then you could use NMF (non-negative matrix factorization). Or let $\textbf{C} = \textbf{A} \circ \textbf{B}$ be the pointwise product; then you could apply singular value decomposition to $\textbf{C}$.
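A minimal sketch of the SVD route, assuming NumPy: truncate the singular values of C to get its best rank-$k$ approximation (Eckart-Young), which is the usual PCA-style "remove some components" step the question asks about.

```python
import numpy as np

rng = np.random.default_rng(1)
C = rng.standard_normal((6, 5))  # stands in for the pointwise product A .* B

# Full SVD: C = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(C, full_matrices=False)

# Keep only the top-k singular values/vectors -> best rank-k approximation
k = 2
C_k = U[:, :k] * s[:k] @ Vt[:k, :]

# Frobenius error of the truncation equals the norm of the dropped
# singular values (Eckart-Young theorem).
err = np.linalg.norm(C - C_k)
```

Note that SVD (like NMF) gives a low-rank factorization $\textbf{C} \approx \textbf{W}\textbf{H}$ under the ordinary matrix product, not a Schur-product decomposition, so this recovers structure in C rather than the original A and B.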

  • It is, but the document addresses a different problem that factorizes a combination of the Schur product with normal multiplication. Also, see my answer. (2012-11-10)