4

When doing matrix multiplication can I carry a vector to the other side?

For example if I have:

$Ab = c$

where $A$ is an $m \times m$ invertible matrix, $b$ is an $m \times 1$ column vector, and $c$ is $m \times 1$. Can I do something like this:

$A = c/b$

And what does that mean...

I just need to find matrix A, as I have b and c vectors.

P.S. Also, I know that $A^{-1}$ is diagonal, if that helps.

  • 2
    It's an understatement to say that knowing that $A$ is diagonal helps a lot. – 2011-11-28

4 Answers

5

Yes, having the inverse of $A$ be diagonal helps immensely!

If the inverse of $A$ is diagonal, $A^{-1} = \left(\begin{array}{ccc} \lambda_1 & \cdots & 0\\ \vdots & \ddots & \vdots\\ 0 & \cdots & \lambda_m\end{array}\right),$ then $A$ is also diagonal, with $A = \left(\begin{array}{ccc} \frac{1}{\lambda_1} & \cdots & 0\\ \vdots & \ddots & \vdots\\ 0 & \cdots & \frac{1}{\lambda_m} \end{array}\right).$ That means that if $\mathbf{b}= \left(\begin{array}{c}b_1\\b_2\\\vdots \\b_m\end{array}\right),\qquad \mathbf{c} = \left(\begin{array}{c}c_1\\c_2\\\vdots \\c_m\end{array}\right),$ then you need $\frac{1}{\lambda_i}b_i = c_i.$ For this to be possible, you need $b_i=0$ if and only if $c_i=0$; and if this is the case, then $A$ is determined uniquely in all rows corresponding to nonzero entries of $\mathbf{b}$ and $\mathbf{c}$, and its diagonal entry can be any nonzero value in the rows where $b_i=c_i=0$.
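
As a quick sanity check, here is a minimal NumPy sketch of that recipe, using made-up vectors `b` and `c` with no zero entries (an assumption of this sketch): the diagonal of $A$ is just the entrywise quotient $c_i/b_i$.

```python
import numpy as np

# Made-up example vectors; neither has a zero entry (assumed for this sketch).
b = np.array([2.0, 5.0, -1.0])
c = np.array([6.0, 10.0, 3.0])

# If A is diagonal, Ab = c forces the i-th diagonal entry to be c_i / b_i.
A = np.diag(c / b)

print(A)
print(np.allclose(A @ b, c))  # True: the recovered A indeed maps b to c
```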

  • 0
    Thanks, turns out I didn't actually need this... I think. But it's good to know. – 2011-12-05
4

Unfortunately, it is not easy to interpret $c/b$ where $c$ and $b$ are vectors.

Consider the following example: $b = c = \begin{pmatrix}3\\4 \end{pmatrix}$.

It is easy to see that $A = \begin{pmatrix}1 & 0\\0 & 1 \end{pmatrix}$ gives us $Ab = c$.

However note that $\tilde{A} = \begin{pmatrix}0 & \frac34\\\frac43 & 0 \end{pmatrix}$ also gives us $\tilde{A} b = c$.

Hence, $c/b$ is not well-defined if we want to interpret $c/b$ as $A$ such that $A b = c$.
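
For what it's worth, a quick numerical check of the two matrices above (plain NumPy arithmetic, nothing beyond the example itself):

```python
import numpy as np

b = np.array([3.0, 4.0])
c = np.array([3.0, 4.0])

A = np.eye(2)                                 # the identity matrix above
A_tilde = np.array([[0.0, 3/4], [4/3, 0.0]])  # the second matrix above

print(np.allclose(A @ b, c))        # True
print(np.allclose(A_tilde @ b, c))  # True: two different matrices send b to c
```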

  • 1
    @drozzy: If $A^{-1}$ is diagonal, then so is $A$. If $A$ is diagonal with diagonal entries $(a_1,a_2,\ldots,a_m)$, then $Ab=[a_1b_1,a_2b_2,\ldots,a_mb_m]^\textrm{T}$. Then if all of the entries of $b$ are nonzero, you can determine each $a_k$ with the equation $a_k=c_k/b_k$. So, yes, if $A$ is diagonal **and** $b$ has all nonzero entries, then $A$ is determined by setting the diagonal entries to the entrywise quotients of $c$ by $b$. – 2011-11-28
4

[Edit: The question was later modified to say that $A$ is diagonal, which changes things quite a bit. This answer is for the general case. See Arturo's answer for the diagonal case.]

Typically, no, and Sivaram has already given an illustrative example. In your situation, the only exception is when $m=1$ and $b\neq 0$. The reason is that if $b$ is an $m$-by-$1$ vector and $m>1$, then $Ab$ never determines $A$ completely. One way to see this is to note that there exists an $m$-by-$m$ matrix $B$ such that $B$ is not the zero matrix, but $Bb=0$. Then $A+B\neq A$, but $(A+B)b=Ab$. Thus, whatever "$c/b$" might mean, it would have to be equally valid that it is equal to $A$ and to $A+B$, which is impossible.

In order for the equation $Ab=c$ to uniquely determine $A$, $b$ must be right invertible. If we generalize a bit to allow $b$ to have sizes other than $m$-by-$1$, say $m$-by-$k$, this is possible when $b$ has at least as many columns as rows ($k\geq m$) and has maximal rank ($m$). This would correspond to $b$ representing a surjective linear transformation, and then $Ab$ determines $A$ because if you know $Ab$, then you know what $A$ does to everything in the range of $b$, namely everything. Algebraically speaking, if $d$ is a matrix such that $bd=I_m$, the $m$-by-$m$ identity matrix, then $A=AI_m=Abd=cd$. Thus, $cd$ plays the role of "$c/b$". Note however that unless $m=k$, $d$ is not uniquely determined by $b$. If $m=k$, then $d=b^{-1}$ is the inverse matrix of $b$, and $A=cb^{-1}$ looks more like the division you'd hope for. And again, if $k<m$, or for any other reason $b$ has rank less than $m$, $d$ does not exist.
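
Here is a small NumPy sketch of the $k\geq m$ case described above, with a randomly generated (hence, almost surely full-row-rank) $b$; the Moore–Penrose pseudoinverse is used as one convenient choice of right inverse $d$ (any right inverse would do):

```python
import numpy as np

rng = np.random.default_rng(0)

m, k = 3, 5
A = rng.standard_normal((m, m))   # the "unknown" matrix we hope to recover
b = rng.standard_normal((m, k))   # full row rank with probability 1
c = A @ b

d = np.linalg.pinv(b)             # one right inverse of b, so b @ d == I_m
A_recovered = c @ d               # c d plays the role of "c/b"

print(np.allclose(b @ d, np.eye(m)))  # True: d really is a right inverse
print(np.allclose(A_recovered, A))    # True: A = A b d = c d, up to rounding
```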

0

It might help to think about a matrix $M$ by which you could right-multiply both sides of your equation: $(Ab)M = cM$, that is, $A(bM) = cM$; then, if $bM$ is invertible, $A=(cM)(bM)^{-1}$.