I'm trying to solve the following question
Find a $3\times 3$ matrix $A$ such that $Null(A)=span\lbrace \begin{bmatrix}1 \\1 \\1 \\\end{bmatrix},\begin{bmatrix}1\\2\\3\end{bmatrix}\rbrace$.
My attempt was this. Let $A=\begin{bmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33} \end{bmatrix}$. Then I solved $A\begin{bmatrix}1 \\1 \\1 \\\end{bmatrix}=\begin{bmatrix}0\\0\\0\end{bmatrix}$ and $A\begin{bmatrix}1\\2\\3\end{bmatrix}=\begin{bmatrix}0\\0\\0\end{bmatrix}$, which gave me the conditions $a_{12}+2a_{13}=0$, $a_{22}+2a_{23}=0$ and $a_{32}+2a_{33}=0$ (together with $a_{i1}=a_{i3}$ for each row $i$). Solving these I can get a matrix that looks like this: $A=\begin{bmatrix}1&-2&1\\1&-2&1\\1&-2&1\\\end{bmatrix}$.
Is my approach correct? I'm asking because when I checked my answer by computing the null space of the matrix above, I found it was spanned by $\lbrace\begin{bmatrix}-1\\0\\1\end{bmatrix},\begin{bmatrix}2\\1\\0\end{bmatrix}\rbrace$ instead.
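One way to sanity-check the candidate matrix numerically is a short NumPy sketch (using NumPy is my assumption; any CAS would do). It confirms that both given vectors are sent to zero and, via rank-nullity, that the null space is exactly two-dimensional:

```python
import numpy as np

# Candidate matrix from the attempt above.
A = np.array([[1, -2, 1],
              [1, -2, 1],
              [1, -2, 1]])

u = np.array([1, 1, 1])
v = np.array([1, 2, 3])

# Both spanning vectors must be mapped to zero.
assert np.all(A @ u == 0)
assert np.all(A @ v == 0)

# rank(A) = 1, so by rank-nullity the null space has dimension 3 - 1 = 2;
# since u and v are independent vectors inside it, they span it.
assert np.linalg.matrix_rank(A) == 1
```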
Find a Matrix with a given null space
- Here is a somewhat easier solution: Let $\mathbf{u} := [1,1,1]^\intercal,\mathbf{v} := [1,2,3]^\intercal$. Let $\mathbf{w} := \mathbf{u} \times \mathbf{v}$. Then the covector $\mathbf{w}^\intercal$ has null space precisely equal to $\operatorname{span}(\mathbf{u},\mathbf{v})$. Hence you can fill all three rows of $\mathbf A$ with $\mathbf w$, or use $\mathbf w$ as a single row and fill the remaining rows with zeros. – 2017-02-11
- Your solution is correct, and it is no wonder that the null space can be generated by several different pairs of vectors. – 2017-02-13
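The cross-product suggestion from the comments can be sketched in a few lines of NumPy (NumPy is an assumption here); note that $\mathbf u \times \mathbf v = (1,-2,1)$, exactly the row used in the question:

```python
import numpy as np

u = np.array([1, 1, 1])
v = np.array([1, 2, 3])

# w is orthogonal to both u and v, so the row vector w annihilates span(u, v).
w = np.cross(u, v)  # array([ 1, -2,  1])

# Repeating w in every row gives a 3x3 matrix whose null space is span(u, v).
A = np.tile(w, (3, 1))
assert np.all(A @ u == 0)
assert np.all(A @ v == 0)
```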
3 Answers
Well, $(1,1,1)$ and $(1,2,3)$ are clearly in the null space of the matrix. And you can check that there is at least one vector not in the null space, so by rank-nullity the null space is at most two-dimensional. Then, since $(1,1,1)$ and $(1,2,3)$ are linearly independent, the null space must be their span.
It is also true that the null space is the span of $(-1,0,1)$ and $(2,1,0).$ This is okay, because their span is the same as the span of $(1,1,1)$ and $(1,2,3).$ Note that $(-1,0,1)+(2,1,0) = (1,1,1)$ and $2\,(2,1,0)+3\,(-1,0,1) = (1,2,3).$
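The arithmetic behind the span equality is easy to verify mechanically; here is a small NumPy check (NumPy assumed) of the two linear combinations, plus the fact that all four vectors together still span only a plane:

```python
import numpy as np

a = np.array([-1, 0, 1])
b = np.array([2, 1, 0])

# The combinations given above recover the original spanning vectors.
assert np.all(a + b == np.array([1, 1, 1]))
assert np.all(2 * b + 3 * a == np.array([1, 2, 3]))

# Stacking all four vectors still gives rank 2: both pairs span the same plane.
M = np.vstack([a, b, [1, 1, 1], [1, 2, 3]])
assert np.linalg.matrix_rank(M) == 2
```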
You did everything right. You have $$ \begin{bmatrix} 1\\1\\1\end{bmatrix}= \begin{bmatrix}-1\\0\\1\end{bmatrix}+\begin{bmatrix}2\\1\\0\end{bmatrix} $$ and $$ \begin{bmatrix} 1\\2\\3\end{bmatrix}= 3\,\begin{bmatrix}-1\\0\\1\end{bmatrix}+2\,\begin{bmatrix}2\\1\\0\end{bmatrix}. $$
Perhaps you could extend $\{\mathbf{u},\mathbf{v}\} = \left\{ \left( \begin{matrix} 1 \\ 1 \\ 1 \end{matrix} \right), \left( \begin{matrix} 1 \\ 2 \\ 3 \end{matrix} \right) \right\}$ to a basis by adjoining to it some vector $\mathbf{w}$ not in its span. Then you could send $\mathbf{u}$ and $\mathbf{v}$ to $\mathbf{0}$ and $\mathbf{w}$ to something nonzero, and extend linearly. In this way you would be defining a linear transformation $\lambda$ by specifying its values on the basis $\{\mathbf{u,v,w}\}$.
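This construction can be sketched concretely with NumPy (assumed here); the choices $\mathbf w = (0,0,1)$ and image $\mathbf e_1$ are arbitrary, just one vector outside the span and one nonzero target:

```python
import numpy as np

u = np.array([1.0, 1.0, 1.0])
v = np.array([1.0, 2.0, 3.0])
w = np.array([0.0, 0.0, 1.0])  # arbitrary choice: any vector outside span(u, v) works

# Demand A u = 0, A v = 0, A w = e1.  With P = [u v w] as columns this
# reads A P = [0 0 e1], hence A = [0 0 e1] P^{-1}.
P = np.column_stack([u, v, w])
images = np.column_stack([np.zeros(3), np.zeros(3), [1.0, 0.0, 0.0]])
A = images @ np.linalg.inv(P)

assert np.allclose(A @ u, 0)
assert np.allclose(A @ v, 0)
assert not np.allclose(A @ w, 0)  # w is not in the null space
```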