Singular Value Decomposition for Continuous Variables

Say I have an $n\times n$ matrix $w_{ij}$. I can perform a singular value decomposition such that $w_{ij}=\sum_k \sum_l u_{ik}\lambda_{kl}v_{jl}$ with $\lambda_{kl}$ diagonal. Is there a generalization of this to functions of two variables? That is, given $w(\theta_1,\theta_2)$, can one write $$ w(\theta_1 ,\theta_2)=\int dy \int dx \,u(\theta_1 ,x) \, \lambda(x,y) \, v(\theta_2 ,y), $$ where $\lambda$ plays a role similar to the one it plays in the SVD? For instance, given $$ \exp [\alpha \cos(\theta-\phi)], $$ is it possible to find a decomposition $$ \exp [\alpha \cos(\theta-\phi)]=\int\int dx \, dy \, u(\theta,x) \, \lambda(\alpha,x,y) \, v(\phi,y)? $$ Thanks.
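As a numerical sanity check, here is a minimal sketch (assuming a Python/NumPy/SciPy environment; the grid size and the value of $\alpha$ are arbitrary choices). It discretizes the kernel on a uniform grid, takes an ordinary matrix SVD, and compares the grid-scaled singular values against $2\pi I_n(\alpha)$, which is what the standard identity $e^{\alpha\cos u}=\sum_{n=-\infty}^{\infty} I_n(\alpha)e^{inu}$ predicts for this particular kernel (each value with multiplicity $2$ for $n\ge 1$):

```python
import numpy as np
from scipy.special import iv  # modified Bessel function I_n of the first kind

alpha, n = 1.5, 400
theta = 2 * np.pi * np.arange(n) / n          # uniform grid on [0, 2*pi)
h = 2 * np.pi / n                             # grid spacing = quadrature weight
K = np.exp(alpha * np.cos(theta[:, None] - theta[None, :]))

s = np.linalg.svd(K, compute_uv=False)        # descending singular values
# Scaled discrete singular values vs. the predicted 2*pi*I_n(alpha),
# listed with multiplicity 2 for n >= 1; the two rows should agree closely.
print(h * s[:5])
print(2 * np.pi * iv([0, 1, 1, 2, 2], alpha))
```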
-
Probably the more natural approach to the generalization would be to try to identify the spectral decomposition of $w^* w$ and of $w w^*$, where $w^*$ is the adjoint of $w$. Then the "eigenvector matrix" for the second one is your first factor, the "eigenvector matrix" for the first one is your third factor, and the "singular values matrix" should be straightforward to construct from the diagonal operator. – 2015-02-02
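In matrix terms, that recipe looks like the following sketch (Python/NumPy; the random matrix is just a hypothetical stand-in for a sampled kernel, and the pairing step $v_j = w^* u_j/\sigma_j$ avoids the sign ambiguity between the two eigenvector families):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((6, 6))          # stand-in for a sampled kernel w

evals, U = np.linalg.eigh(W @ W.T)       # spectral decomposition of w w^*
order = np.argsort(evals)[::-1]          # sort descending, SVD-style
evals, U = evals[order], U[:, order]

sigma = np.sqrt(np.clip(evals, 0, None)) # singular values
V = (W.T @ U) / sigma                    # pair each v_j with its u_j via w itself

# Reassemble the three factors:
assert np.allclose(W, U @ np.diag(sigma) @ V.T)
```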
-
To do this, one would have to start with a suitable functional-analytic framework, e.g. assume that $w$ is square integrable over something like $[a,b] \times [c,d]$. As suggested in the previous comment, set $w_1(x,z) = \int_c^d w(x,s)w(z,s) \, ds$ and $w_2(t,y) = \int_a^b w(s,y)w(s,t) \, ds$. These are symmetric, positive semidefinite integral kernels and can therefore be written as $w_1(x,z) = \sum_{j=1}^\infty \alpha_j \phi_j(x) \phi_j(z)$ and $w_2(y,t) = \sum_j \alpha_j \psi_j(y) \psi_j(t)$. Then it should be possible to show that $w(x,y) = \sum_j \sqrt{\alpha_j} \phi_j(x) \psi_j(y)$. – 2015-02-08
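Here is a quadrature sketch of that construction (assumptions: Python/NumPy, midpoint-rule Riemann sums, and the smooth test kernel $w(x,y)=e^{xy}$ on $[0,1]^2$, all chosen purely for illustration; $\psi_j$ is obtained from $\phi_j$ through $w$ itself rather than from an independent eigendecomposition of $w_2$, which sidesteps the sign ambiguity in pairing the two families):

```python
import numpy as np

n = 200
x = (np.arange(n) + 0.5) / n                  # midpoint nodes on [0, 1]
h = 1.0 / n                                   # quadrature weight
W = np.exp(np.outer(x, x))                    # samples of w(x, y) = exp(x*y)

# Matrix of the operator f -> int_0^1 w_1(., z) f(z) dz, where
# w_1(x, z) = int_0^1 w(x, s) w(z, s) ds is itself a Riemann sum:
A = h * (h * W @ W.T)
lam, u = np.linalg.eigh(A)
order = np.argsort(lam)[::-1]
alpha, u = lam[order], u[:, order]            # the alpha_j from the comment

k = 8                                         # keep only alpha_j well above rounding error
phi = u[:, :k] / np.sqrt(h)                   # L2([0,1])-normalized eigenfunctions
psi = h * (W.T @ phi) / np.sqrt(alpha[:k])    # psi_j paired to phi_j

# Truncated expansion sum_j sqrt(alpha_j) phi_j(x) psi_j(y):
Wk = (phi * np.sqrt(alpha[:k])) @ psi.T
print(np.max(np.abs(W - Wk)))                 # small: the alpha_j decay fast for smooth w
```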
-
Thank you both for your comments! I will look into this idea, as it's a natural extension of the matrix definition. – 2015-02-10
1 Answer
Just view a matrix as a sampling of a continuous variable, i.e. the matrix entries are local integrals in some sense. Assume you can sample the function on arbitrarily small intervals (perform local integrals / mean-value integrals); that is practically what the Riemann integral is, no?
At every given resolution you can then take an SVD of the matrix containing those local integrals. Now consider the Haar wavelet, with which you can build orthogonal refinements. (Think of it as splitting every matrix element into a block of $2 \times 2$ elements whose mean value is the old element.) The Haar detail additions (which measure discrete differences) will then be either orthogonal or parallel, both within the vectors of the matrices themselves and as matrix blocks. So you can make a successive refinement of the SVD up to any resolution you like, and because of the block properties of matrix multiplication it is "nice" too, in the sense that each step is just a further refinement of the last.
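A small numerical sketch of that refinement picture (assuming Python/NumPy; the kernel from the question with an arbitrary $\alpha$, and midpoint samples standing in for the local mean values):

```python
import numpy as np

def sample(n, alpha=1.5):
    """n x n midpoint samples of exp(alpha*cos(t - s)), standing in for cell means."""
    t = 2 * np.pi * (np.arange(n) + 0.5) / n
    return np.exp(alpha * np.cos(t[:, None] - t[None, :]))

fine = sample(256)
# Coarsening: each coarse entry is the mean of a 2x2 block of fine entries,
# which is exactly the Haar "mean value" relation described above.
coarse = fine.reshape(128, 2, 128, 2).mean(axis=(1, 3))

s_fine = np.linalg.svd(fine, compute_uv=False)
s_coarse = np.linalg.svd(coarse, compute_uv=False)
# After scaling by the cell width, the two resolutions agree on the leading
# singular values, so refining the grid refines the SVD:
print((2 * np.pi / 256) * s_fine[:4])
print((2 * np.pi / 128) * s_coarse[:4])
```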
If this turns out to be fruitful, it might be used to compute SVDs of large matrices quickly, since we can pretend to measure the matrix itself as if it were a continuous variable.
-
Thanks for the answer, it's interesting. I have a professor in my department who works with wavelets, and I should ask him about this construction. – 2015-02-10
-
Yes, you can try, but I'm not sure you need an expert in wavelets for this. It's just linear algebra; the Haar wavelet is probably the easiest wavelet. – 2015-02-10