Given the product of a vector $x$ with a matrix $A$, where $x$ is an eigenvector of $A$ for some eigenvalue, is there an efficient algorithm for reconstructing $A$?
How difficult is it to find a matrix $A$ if I give you its product with an eigenvector? Assuming my matrix is big enough.
You're calling $A$ a matrix in the title and calling $A$ a vector in the question. I suppose it should be a matrix both times? – 2012-09-27
3 Answers
A linear map $V\to W$ is determined by the values it takes on any chosen basis of $V$, and on the other hand such values can be taken independently to be any sequence of values in $W$. The matrix representation of linear maps is based on this: the columns of the matrix give the values on a fixed basis of $V$, expressed on another fixed basis of $W$; clearly all columns can be specified independently and will give a matrix (hence a linear map) in all cases.
Now for some linear map $f:V\to V$ (corresponding to $A$ on some fixed basis) you know that some specific nonzero vector $x\in V$ satisfies $f(x)=\lambda x$ for some specific scalar $\lambda$. What does this tell you about $f$? It specifies the image of just one vector (which image happens to be a multiple of $x$, but that is of little importance to your question). Does this allow you to know $f$? Unless $\dim V=1$ (unlikely that you intended that) the answer is obviously "no". You can complete $x$ to a basis of $V$ in some fixed way, and then you are completely free to choose the images under $f$ of these extra basis vectors in any way, and a corresponding $f$ (and matrix $A$) will exist.
So you might as well be asking whether, given one component of a vector in $\mathbf R^n$, there is an efficient algorithm for determining the other components.
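A minimal NumPy sketch of this non-uniqueness argument (not part of the answer; the vector, eigenvalue and the two choices of images are made up for illustration): complete $x$ to a basis, then assign the images of the extra basis vectors in two different ways. Both resulting matrices send $x$ to $\lambda x$, yet they differ.

```python
import numpy as np

# x and lambda are fixed; complete x to a basis (possible here because x[0] != 0),
# then choose the images of the remaining basis vectors in two different ways.
x = np.array([1.0, 2.0, 0.0])
lam = 3.0
n = len(x)

# Basis whose first column is x, completed with standard basis vectors e_2, e_3.
basis = np.column_stack([x, np.eye(n)[:, 1], np.eye(n)[:, 2]])

# Two different (arbitrary) choices for the images of the extra basis vectors.
images1 = np.column_stack([lam * x, [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
images2 = np.column_stack([lam * x, [0.0, 5.0, 7.0], [1.0, 1.0, 1.0]])

# A is determined by "A @ basis = images", i.e. A = images @ basis^{-1}.
A1 = images1 @ np.linalg.inv(basis)
A2 = images2 @ np.linalg.inv(basis)

print(np.allclose(A1 @ x, lam * x), np.allclose(A2 @ x, lam * x))  # True True
print(np.allclose(A1, A2))                                          # False
```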
Not sure if I get your question correctly. The matrix $A$ isn't defined completely just because you know one eigenvector, so I do not see how you could "reconstruct" it. In fact you could choose $A=\lambda\cdot I$ and every vector will be an appropriate eigenvector.
Basically you just want to construct some matrix $A$ such that $x$ is an eigenvector of $A$ with eigenvalue $\lambda$? This is easiest to do if you set up a basis containing $x$, so let's define $B = (x, e_2, \ldots, e_n)$, where $e_2,\ldots,e_n$ are chosen so that the columns of $B$ are linearly independent. Then you can use
$ A = B\begin{pmatrix} \lambda & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & \ddots \end{pmatrix}B^{-1}$
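A short NumPy sketch of this construction (the concrete $x$ and $\lambda$ are made up; the remaining columns of $B$ are taken to be standard basis vectors, which works here because the first entry of $x$ is nonzero):

```python
import numpy as np

# Pick any x with a nonzero first entry and any eigenvalue lam (both made up here).
x = np.array([2.0, 3.0, 1.0])
lam = 5.0
n = len(x)

# B has x as its first column and the standard basis vectors e_2, ..., e_n after it;
# B is invertible because x[0] != 0.
B = np.column_stack([x] + [np.eye(n)[:, i] for i in range(1, n)])

# Diagonal matrix diag(lam, 1, ..., 1), as in the formula above.
D = np.diag([lam] + [1.0] * (n - 1))

A = B @ D @ np.linalg.inv(B)

print(np.allclose(A @ x, lam * x))  # True: x is an eigenvector of A with eigenvalue lam
```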
Let's look at an example. Suppose $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$ is an unknown $2\times2$ matrix, $x=(2,3)$, and all you know about $A$ is that $Ax=2x=(4,6)$. That tells you that
$2a+3b=4,\qquad 2c+3d=6.$
That's only 2 equations in 4 unknowns, and there will be a 2-parameter family of solutions
$A=\begin{pmatrix}2-3s&2s\\3-3t&2t\end{pmatrix}.$
"Assuming my matrix is big enough" isn't going to help: the bigger your matrix is, the bigger the number of free parameters.
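A small NumPy check of this example (the particular values of $s$ and $t$ below are arbitrary): every member of the 2-parameter family satisfies $Ax = 2x$, so the data $Ax=(4,6)$ cannot single out one matrix.

```python
import numpy as np

x = np.array([2.0, 3.0])

def A_family(s, t):
    # The 2-parameter family from the answer: A = [[2-3s, 2s], [3-3t, 2t]].
    return np.array([[2 - 3 * s, 2 * s],
                     [3 - 3 * t, 2 * t]])

# A few arbitrary parameter values; every choice satisfies A x = 2 x = (4, 6).
for s, t in [(0.0, 0.0), (1.0, -2.0), (0.5, 10.0)]:
    A = A_family(s, t)
    print((s, t), np.allclose(A @ x, 2 * x))  # always True
```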