
It's pretty clear to me that all vectors are matrices (either 1 x n or n x 1). But there is some discussion about this at work. We're wondering whether the expression "Vectors/Matrices" in a software GUI can be shortened to "Matrices". (The shorter the expression, the better, because of space restrictions in the GUI.)

So, is every vector a matrix or is there some subtlety in the definitions that I don't know about?

  • 1
    You might want to consider your target audience here. If the users of your application really have no clue what "vectors" or "matrices" actually are, you might be better off simply using the term "data" or some other generic term. The distinction between the two terms, if any, is superfluous to the uninitiated. 2011-06-14

4 Answers

-1

To my knowledge, all vectors are either $n\times 1$ or $1\times m$ matrices.

  • 4
    In its more advanced meaning, "vector" means "element of a vector space", and such elements may *or may not* be arrays of numbers. After all, this is a MATH board, not a PROGRAMMING board. 2011-06-14
17

Every finite-dimensional vector space of dimension $n$ is isomorphic to a vector space in which the vectors are $n$-tuples (whether written as "row vectors", $1\times n$ matrices, or "column vectors", $n\times 1$ matrices). That is, under a suitable interpretation, you can think of the vectors as matrices.

However,

  • It is often useful to think of the vectors differently for specific vector spaces; and
  • For infinite-dimensional vector spaces, you cannot do it unless you extend your definition of "matrix" to include matrices that have an infinite number of rows or an infinite number of columns.

For example: consider the vector space of all polynomials of degree at most $3$ with real coefficients, with the usual polynomial addition for vector addition, and usual multiplication by real numbers as the scalar multiplication. This is a vector space in which the vectors are not matrices: they are polynomials of the form $a+bx+cx^2+dx^3$ with $a,b,c,d\in\mathbb{R}$. However, you can "think of them" as tuples $(a,b,c,d)$ because the vector operations occur "componentwise" in any case: when you add two polynomials, $a+bx+cx^2+dx^3$ and $r+sx+tx^2+ux^3$, you get $(a+r) + (b+s)x + (c+t)x^2+(d+u)x^3$; when you multiply by scalars, again each coefficient gets multiplied independently. That is, there is what we call an "isomorphism" between the vector space of all polynomials of degree at most $3$ and the vector space $\mathbb{R}^4$ that makes the polynomial $a+bx+cx^2+dx^3$ correspond to the tuple $(a,b,c,d)$; under this correspondence, the image of a sum is the sum of the images, and the image of a scalar multiple is the scalar multiple of the image (it is what we call a linear transformation, and it is both one-to-one and onto).
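
If it helps to see the correspondence in code, here is a minimal sketch (plain Python, with helper names invented for illustration) of the isomorphism just described: a polynomial $a+bx+cx^2+dx^3$ is stored as the tuple $(a,b,c,d)$, and both vector-space operations act componentwise.

```python
# Sketch of the isomorphism: a + bx + cx^2 + dx^3  <->  (a, b, c, d).
# Addition and scalar multiplication are componentwise on either side.

def add_poly(p, q):
    """Sum of two degree-<=3 polynomials given as coefficient tuples."""
    return tuple(pi + qi for pi, qi in zip(p, q))

def scale_poly(alpha, p):
    """Scalar multiple of a polynomial given as a coefficient tuple."""
    return tuple(alpha * pi for pi in p)

p = (1, 2, 0, 5)   # 1 + 2x + 5x^3
q = (4, -1, 3, 2)  # 4 - x + 3x^2 + 2x^3

print(add_poly(p, q))    # (5, 1, 3, 7)  ->  5 + x + 3x^2 + 7x^3
print(scale_poly(2, p))  # (2, 4, 0, 10) ->  2 + 4x + 10x^3
```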

However, even though we can think of the polynomials as $4$-tuples (by doing the "translation" described above), there are situations where it is far more useful to think of the vectors as polynomials. For example, the function that takes a polynomial $p(x) = a+bx+cx^2+dx^3$ as an input and returns $p(3)$ as the output is a very nice function: we have that $(p+q)(3) = p(3)+q(3)$ and $(\alpha p)(3) = \alpha(p(3))$ (again, a linear transformation). It is very easy to think of it in terms of polynomials, but it is more annoying to think of it in terms of tuples. Or consider the function that sends a polynomial $p(x)$ to the polynomial you get by taking $p(x) + 3p'(x) + 2p''(5)$ (derivatives). Again, easy to handle as polynomials, more annoying as tuples (but certainly possible).
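
To see why evaluation at $3$ is so tame in the tuple picture, note that it is just a dot product with the fixed tuple $(1, 3, 9, 27)$. The sketch below (plain Python, illustrative names) checks the identity $(p+q)(3) = p(3) + q(3)$ on a concrete example.

```python
# Sketch: evaluation at x = 3 as a linear functional on coefficient tuples.
# eval_at_3(p) is the dot product of p = (a, b, c, d) with (1, 3, 9, 27).

POWERS_OF_3 = (1, 3, 9, 27)

def eval_at_3(p):
    return sum(coef * power for coef, power in zip(p, POWERS_OF_3))

def add_poly(p, q):
    return tuple(pi + qi for pi, qi in zip(p, q))

p = (1, 2, 0, 5)   # 1 + 2x + 5x^3
q = (4, -1, 3, 2)  # 4 - x + 3x^2 + 2x^3

# (p + q)(3) == p(3) + q(3): the hallmark of a linear map.
assert eval_at_3(add_poly(p, q)) == eval_at_3(p) + eval_at_3(q)
print(eval_at_3(p), eval_at_3(q), eval_at_3(add_poly(p, q)))  # 142 82 224
```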

Or consider the collection of all twice-differentiable functions $f\colon \mathbb{R}\to\mathbb{R}$ that satisfy the differential equation $xf''(x) + \cos(x)f'(x) + 2f(x) = 0$. The collection of all such functions forms a vector space under the usual addition and scalar multiplication of functions; but even though it is isomorphic to a vector space of tuples (in fact, to $\mathbb{R}^2$), it is better to think of it in terms of functions than in terms of tuples for purposes of actually dealing with the differential equation.

Going to point 2 above: consider the vector space of all polynomials with the usual polynomial addition. Unless you want to allow infinite tuples (which you can, but they tend to be difficult to work with), you cannot fix a single $n$ and still represent every polynomial with a $1\times n$ matrix. Or, to make things even worse, consider the vector space $V$ of all functions $f\colon \mathbb{R}\to\mathbb{R}$, with pointwise addition and scalar multiplication; you would need a tuple with as many entries as there are real numbers to represent an element as a "matrix". It is far better to think of the vectors as functions, not as row or column vectors.
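
One way to see the difficulty concretely (a plain-Python sketch, purely illustrative): if polynomials are stored as coefficient lists, the lists have no common length, so no single $1\times n$ shape fits all of them and addition has to pad on the fly.

```python
# Sketch: arbitrary polynomials as variable-length coefficient lists.
# There is no fixed width n that fits every polynomial, so addition pads
# the shorter list with zeros instead of assuming a common shape.
from itertools import zip_longest

def add_poly(p, q):
    return [pi + qi for pi, qi in zip_longest(p, q, fillvalue=0)]

p = [1, 2]             # 1 + 2x   (degree 1)
q = [0, 0, 0, 0, 7]    # 7x^4     (degree 4)
print(add_poly(p, q))  # [1, 2, 0, 0, 7] -> 1 + 2x + 7x^4
```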

So, as long as you stick to finite dimensional vector spaces, the answer is "up to suitable translation, yes, you can think of all vectors as tuples". However, there are in general lots of translations and no good reason to prefer one over the other, so that you would need to be careful to specify in each case how you are translating. And in many situations, it is better to think of the vectors as they are given rather than as tuples, because the kinds of things you want to do with them are easier to do if they are polynomials/functions/other kinds of objects than if they are tuples.

  • 0
    @Ryan: "Matrix transformation" is not used widely, so I would say that it applies only in the setting that David Lay is talking about. 2012-07-23
5

An analogous question might be whether all integers are rational numbers. It seems obvious, but if you define rational numbers to be pairs of integers $(a,b)$ with $b \neq 0$ (where $(a,b) = (c,d)$ iff $ad - bc = 0$), then what is true is that the integers can be embedded in the rationals (by sending the integer $n$ to $(n,1)$). The moral is that structures in mathematics can sometimes be represented in different ways, so that the real problem with the question "is a vector a matrix" is the word "is". The answer is no, strictly speaking, as Arturo pointed out, but you probably won't do much damage if you think of one as such.
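
To make the embedding concrete, here is a small sketch (plain Python, with invented helper names) that encodes rationals as pairs with exactly the equivalence above and checks that $n \mapsto (n,1)$ respects addition.

```python
# Sketch: rationals as pairs of integers (a, b), b != 0, with
# (a, b) "equal to" (c, d) exactly when a*d - b*c == 0.

def eq(r, s):
    (a, b), (c, d) = r, s
    return a * d - b * c == 0

def add(r, s):
    (a, b), (c, d) = r, s
    return (a * d + b * c, b * d)

def embed(n):
    """The embedding of the integers: n goes to (n, 1)."""
    return (n, 1)

# The embedding respects addition: 2 + 3 = 5 on both sides.
assert eq(add(embed(2), embed(3)), embed(5))
print(add(embed(2), embed(3)))  # (5, 1)
```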

  • 0
    I like this answer because it shows that the 1 is arbitrary, just as the choice of a row or column vector is arbitrary in constructing a vector. It's not arbitrary for operations, though, and that's where it gets interesting. The question also seems analogous to whether a circle is an ellipse. That one is even less interesting until you consider some operations and the meaning of "is", as discussed here: https://isocpp.org/wiki/faq/proper-inheritance#circle-ellipse 2016-02-27
1

The answer is absolutely no if you consider abstract vector spaces. There, a vector does not look anything like a row/column vector unless a basis is chosen. In fact, pick your three favorite numbers $a$, $b$, $c$, not all zero. Then for any nonzero vector $v$ in a 3-dimensional space, you can choose a basis so that, in coordinates, $v = [a\ b\ c]$.
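
Here is a numerical sketch of that claim (NumPy, with a concrete vector and concrete favorite numbers chosen for illustration; to keep the construction short it assumes the first favorite number and the first coordinate of $v$ are nonzero).

```python
# Sketch: given a nonzero v and "favorite" coordinates (a, b, c), not all
# zero, build a basis in which v has exactly those coordinates.
# The construction below assumes a != 0 and v[0] != 0 for brevity.
import numpy as np

v = np.array([7.0, -2.0, 5.0])   # the vector
a, b, c = 3.0, 1.0, 4.0          # three favorite numbers

e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])
b1 = (v - b * e2 - c * e3) / a   # chosen so that a*b1 + b*e2 + c*e3 == v

B = np.column_stack([b1, e2, e3])               # basis vectors as columns
assert abs(np.linalg.det(B)) > 1e-12            # they really form a basis
assert np.allclose(B @ np.array([a, b, c]), v)  # coordinates of v are [a b c]
print(B @ np.array([a, b, c]))                  # [ 7. -2.  5.]
```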

In general, vectors in a vector space aren't even coordinates or row/column coordinate vectors or anything, let alone matrices.

But if you choose a basis (or your vector space comes with one you choose to use, e.g. $[1\ 0\ 0]$, $[0\ 1\ 0]$, $[0\ 0\ 1]$), then the question is worth asking, and it has a reasonable answer, which is covered in the other responses.

One other important point: say we choose to use row matrices to represent vectors. Then column matrices represent linear forms, or covectors. For example, the row matrix $[1\ 2\ 3]$ is the vector with $x=1$, $y=2$, $z=3$, and $[6; 5; 4]$ (that's a column vector there :) ) is the linear function $f(x,y,z) = 6x + 5y + 4z$.
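
A small sketch of that pairing (NumPy, with the numbers from the example above): applying the covector to the vector is just the $1\times 3$ by $3\times 1$ matrix product.

```python
# Sketch of the pairing: the vector is the 1x3 row matrix [1 2 3], the
# linear form f(x, y, z) = 6x + 5y + 4z is the 3x1 column matrix [6; 5; 4],
# and applying f to the vector is the matrix product row @ column.
import numpy as np

v = np.array([[1, 2, 3]])        # vector as a 1x3 row matrix
f = np.array([[6], [5], [4]])    # covector as a 3x1 column matrix

print(v @ f)                     # [[28]]  ->  f(1, 2, 3) = 6 + 10 + 12 = 28
```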