8

I've learned in linear algebra class that an $n \times m$ augmented matrix can be thought of as a collection of $n$ planes in $\mathbb{R}^m$. If the matrix is invertible, the planes all intersect at a single point. If the system has infinitely many solutions, two or more planes coincide, so their intersection is a line rather than a point. If the matrix is inconsistent, then there is no single point of intersection. When you do Gauss-Jordan elimination, you are adding scalar multiples of the planes to each other, which has the effect of rotating them about the line where they intersect. When they reach RREF, they are at right angles to each other in the dimensions corresponding to columns that contain leading ones.

But then we go on to another interpretation of matrices: as linear transformations that alter the magnitude and direction of vectors. I would like to have a geometric interpretation of matrix multiplication that is compatible with the intersecting-planes interpretation. Since matrix multiplication is built from row-by-column dot products, I guess the first step would be to visualize those.

First question: Am I correct in interpreting the dot product of two vectors as the cosine of the angle between them, scaled by the magnitudes of both vectors, with a value somewhere between the length of the shadow vector $A$ casts on vector $B$ and the length of vector $B$? Is there anything more specific I should associate with this quantity?
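The identity behind the question can be checked numerically. This is a minimal sketch with NumPy, using two vectors chosen purely for illustration: the dot product equals $|A|\,|B|\cos\theta$, and the length of the "shadow" (the scalar projection of $A$ onto $B$) is $|A|\cos\theta$, i.e. the dot product divided by $|B|$.

```python
import numpy as np

# Illustrative vectors (not from the question itself).
a = np.array([3.0, 4.0, 0.0])   # |a| = 5
b = np.array([1.0, 0.0, 0.0])   # |b| = 1

dot = np.dot(a, b)  # a . b = |a| |b| cos(theta)
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))

# Length of the shadow a casts on b (scalar projection of a onto b):
shadow = np.linalg.norm(a) * cos_theta  # same as dot / |b|

print(dot)        # 3.0
print(cos_theta)  # 0.6
print(shadow)     # 3.0
```

Note that the shadow length equals the dot product only because $|b| = 1$ here; in general the two differ by a factor of $|b|$.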

Second question: Now let's say we multiply a $3 \times 3$ matrix by a vector $\in \mathbb{R}^3$ and get a different vector $\in \mathbb{R}^3$. How do I interpret this operation graphically, relative to the three intersecting planes and the one line through the origin that gave rise to it?
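One concrete link between the two interpretations can be verified directly: each component of $Av$ is the dot product of one row of $A$ with $v$, and the rows are exactly the normal vectors of the planes in the first interpretation. A minimal sketch, with an arbitrary matrix and vector chosen for illustration:

```python
import numpy as np

# Illustrative matrix and vector (not from the question itself).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 1.0]])
v = np.array([1.0, 1.0, 1.0])

Av = A @ v  # the matrix-vector product

# Each component of Av is the dot product of one row of A with v,
# i.e. (up to scaling by the row's length) how far v sits along
# the normal direction of the corresponding plane.
row_dots = np.array([np.dot(row, v) for row in A])

print(Av)        # [3. 2. 3.]
print(row_dots)  # [3. 2. 3.]
```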

Third question: Can anybody recommend some software I can use for visualizing matrix operations, especially on Linux? At the moment I'm using wxMaxima with the draw package, but it's really awkward to use because draw cannot take matrices as arguments, and I don't see any way of updating an existing plot with new information.

Thank you all kindly.

  • 0
    I'm not sure exactly what you're asking in your first question, but it doesn't seem to me like it could be correct. For example, take $x=10\in \mathbb{R}=\mathbb{R}^1$. The dot product of $x$ with itself is 100, which is 100 times the cosine of the angle 0, not 10 times, and I think the only possible "length of a shadow" in this situation is 10. (2011-06-16)
  • 0
    I'd also be wary of the interpretation of a matrix as a collection of planes. There is a collection of $n$ planes associated to an $n\times m$ matrix $T$: the planes in $\mathbb{R}^m$ whose normal vectors are the rows of $T$ (provided no row is zero). But the same planes are determined by many different normal vectors, and hence many different matrices. For example, if $n=m=2$ then you get the same pair of planes from any diagonal matrix with non-zero diagonal entries. (2011-06-16)
  • 0
    @mac: the dot product of $x$ with itself is the magnitude of $x$ times $\cos(0)$ times the magnitude of the other vector $\in \mathbb{R}^1$, which happens to also be $x$. So, $10 \times 1 \times 10$. So it seems like the dot product collapses neatly into ordinary scalar multiplication in the degenerate case. Regarding the second comment, I forgot to mention that for my purposes I'm assuming all normal vectors pass through the origin. (2011-06-16)
  • 0
    I was going by [this post](http://math.stackexchange.com/questions/77/understanding-dot-and-cross-product/85#85). Perhaps I'm misinterpreting what the author said about the dot product being the part of one vector that's in the direction of the other vector? (2011-06-18)
  • 0
    I think that author was being a little sloppy. The dot product of $A$ and $B$ is *part of the formula* for the part of $A$ that's in the direction of $B$; the details are in the answer I posted yesterday. (2011-06-18)

3 Answers