Solving for a matrix

$$ \begin{bmatrix}1&2&3\\4&5&6\\7&8&9\end{bmatrix}X = \begin{bmatrix}3&2&1\\6&5&4\\9&8&7\end{bmatrix}. $$ I know how to solve this normally, but I'm confused by this case: it seems that neither matrix is invertible, and the second is the first with its columns in reverse order.

-
Indeed, that matrix is not invertible. Note that the third column is twice the second minus the first. – 2017-02-25
-
So is there a way to solve it? I was told to express the entries in terms of the free variables. – 2017-02-25
2 Answers
Recall that, when we multiply matrices $A$ and $B$, each column of $AB$ is a linear combination of the columns of $A$, with weights given by entries in the corresponding column of B.
In this case, you want the first column of $X$ to be something that says, "ignore the first two columns of $A$, and take 1 times the third." That column should be $\left[\begin{matrix} 0 \\ 0 \\ 1 \end{matrix}\right]$. Can you take it from there?
[EDIT]
This solution won't be unique: as we've noted, the columns of $A$ are linearly dependent, so there's more than one way to produce the third column.
[EDIT]
To obtain a general solution, we can work with the augmented matrix $[A \mid B]$ and see what happens:
$\begin{align}\left[\begin{array}{ccc|ccc}1 & 2 & 3 & 3 & 2 & 1 \\ 4 & 5 & 6 & 6 & 5 & 4 \\ 7 & 8 & 9 & 9 & 8 & 7\end{array}\right] &\sim \left[\begin{array}{ccc|ccc}1 & 2 & 3 & 3 & 2 & 1 \\ 4 & 5 & 6 & 6 & 5 & 4 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right] \\ &\sim \left[\begin{array}{ccc|ccc}1 & 2 & 3 & 3 & 2 & 1 \\ 0 & 1 & 2 & 2 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right] \\ &\sim \left[\begin{array}{ccc|ccc}1 & 0 & -1 & -1 & 0 & 1 \\ 0 & 1 & 2 & 2 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0\end{array}\right]\end{align}$
Now we just have to interpret this result. Each column on the right gives an independent system with one free variable. Call the free variable for the first column $t_1$. Then the first column of $X$ is $\left[\begin{matrix}-1+t_1 \\ 2- 2t_1 \\ t_1 \end{matrix}\right]$. Similarly, with free variable $t_2$, the second column is $\left[\begin{matrix}t_2 \\ 1- 2t_2 \\ t_2 \end{matrix}\right]$.
Do you see how this is working?
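If you want to check the row reduction by machine, here is a short SymPy sketch (it uses SymPy's `rref` in place of the by-hand elimination above; the variable names are mine):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = Matrix([[3, 2, 1], [6, 5, 4], [9, 8, 7]])

# Row-reduce the augmented matrix [A | B]
R, pivots = A.row_join(B).rref()

print(R)       # should match the final reduced matrix above
print(pivots)  # pivots in columns 0 and 1: two bound variables, one free
```

The pivot columns confirm the comment below: each of the three independent column problems has two bound variables and one free variable.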
-
Is there a way to express the entries in terms of free variables? – 2017-02-25
-
Yeah, there must be. Gimme a minute. – 2017-02-25
-
Answer updated. – 2017-02-25
-
I'm confused how you're interpreting the last matrix. And wouldn't there be 2 free variables? – 2017-02-25
-
If it helps, just consider one column at a time on the right. Obtaining the three columns consists of three totally independent problems. In each case, the matrix on the left has two columns with pivots, and one without. Thus, for each of the three independent problems, there are two bound variables and one free variable. – 2017-02-25
-
I'm still not understanding the process you're using for each of these independent problems to find the columns. I understand the left matrix has two pivot columns and one without, but I don't understand what you're doing with the individual columns on the right. – 2017-02-25
-
Ok, just look at the three left columns, and the first one on the right. Call the variables $x$, $y$, and $z$ (these are the three entries in column 1 of $X$), so $z$ is free. The first row represents the equation $x-z=-1$, hence $x=-1+z$. I was calling it $t_1$ instead of $z$. The second row tells us similarly that $y=2-2z$. Is that clearer? – 2017-02-25
-
I got it now. Thanks for the help. – 2017-02-25
-
Why complicate the 1st column? – 2017-02-25
-
What do you mean? – 2017-02-25
-
You picked a particular solution that is not a permutation matrix. – 2017-02-25
-
Um... I found the general solution. You get the permutation matrix solution by taking $t_1=1$ (and $t_2=t_3=0$). I wrote it that way because that's the most natural way to read it out of the reduced matrix. – 2017-02-25
-
Yes, but the RREF obscured it. Visual inspection would suggest another particular solution, namely, the permutation matrix with ones on the anti-diagonal. – 2017-02-25
-
Yeah, that was my initial answer, but then the OP asked how to get there using free variables. If you don't see the inspection answer, you want a method. – 2017-02-25
-
I guess you could also work out the free part by inspection, noting that $C_3 = 2C_2 - C_1$, but that assumes a good bit of matrix sense and intuition. – 2017-02-25
We have a linear matrix equation in $\mathrm X \in \mathbb R^{3 \times 3}$
$$\mathrm A \mathrm X = \mathrm B$$
where
$$\mathrm A = \begin{bmatrix} 1&2&3\\ 4&5&6\\ 7&8&9\end{bmatrix} \qquad \qquad \qquad \mathrm B = \begin{bmatrix} 3&2&1\\ 6&5&4\\ 9&8&7\end{bmatrix}$$
Visual inspection tells us that a particular solution, $\mathrm X_p$, is the permutation matrix with ones on the anti-diagonal, which reverses the columns of $\mathrm A$. What is the null space of $\mathrm A$? Using SymPy:
>>> from sympy import *
>>> A = Matrix([[1, 2, 3],
...             [4, 5, 6],
...             [7, 8, 9]])
>>> A.nullspace()
[Matrix([
[ 1],
[-2],
[ 1]])]
Hence, the $1$-dimensional null space of $\mathrm A$ is spanned by
$$\mathrm v := \begin{bmatrix} 1\\ -2\\ 1\end{bmatrix}$$
and the solution set is a $3$-dimensional affine matrix space parametrized as follows
$$\left\{ \mathrm X_p + \mathrm v \eta^{\top} : \eta \in \mathbb R^3 \right\} = \Bigg\{ \mathrm X_p + \begin{bmatrix} \eta_1 & \eta_2 & \eta_3\\ -2 \eta_1 & -2 \eta_2 & -2 \eta_3\\ \eta_1 & \eta_2 & \eta_3\end{bmatrix} : \eta \in \mathbb R^3 \Bigg\}$$
Note that
$$\mathrm A (\mathrm X_p + \mathrm v \eta^{\top}) = \underbrace{ \mathrm A \mathrm X_p }_{= \mathrm B} + \underbrace{\mathrm A \mathrm v}_{= 0_3} \eta^{\top} = \mathrm B + \mathrm O_3 = \mathrm B$$
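The whole family can be checked symbolically. A minimal SymPy sketch, assuming the anti-diagonal permutation matrix as $\mathrm X_p$ (the parameter names `eta1..eta3` are mine):

```python
from sympy import Matrix, symbols

A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = Matrix([[3, 2, 1], [6, 5, 4], [9, 8, 7]])

Xp = Matrix([[0, 0, 1], [0, 1, 0], [1, 0, 0]])  # particular solution: reverses A's columns
v = Matrix([1, -2, 1])                          # spans the null space of A
eta = Matrix(symbols('eta1:4'))                 # three arbitrary real parameters

X = Xp + v * eta.T                              # general solution
residual = (A * X - B).expand()
print(residual == Matrix.zeros(3, 3))           # True for every choice of eta
```

Since the residual vanishes identically in the parameters, every member of the affine family solves the equation, as the display above shows.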