Lecture 11
Vector Spaces
The idea of a vector space is a generalization of the concepts we’ve studied so far to other sets of objects.
Let $V$ be a set of objects (vectors) with addition ($+$) and scalar multiplication defined. $V$ is a vector space if

1. $V$ is closed under addition: If $\vec{u} \in V$ and $\vec{v} \in V$, then $\vec{u} + \vec{v} \in V$. In other words, $V$ is closed under vector addition.

2. $V$ is closed under scalar multiplication: If $c$ is a scalar in $\mathbb{R}$ or $\mathbb{C}$, and $\vec{v} \in V$, then $c\vec{v} \in V$.

3. Zero element: There exists some 'zero element' $\vec{0} \in V$ such that $\vec{v} + \vec{0} = \vec{v}$ for all $\vec{v} \in V$.

4. Additive inverses: For every $\vec{v} \in V$, there exists some $-\vec{v} \in V$ such that $\vec{v} + (-\vec{v}) = \vec{0}$.
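For instance (a quick check of our own, using the usual componentwise operations), in $\mathbb{R}^2$ each of the four requirements can be verified on concrete vectors:
$$(1, 2) + (3, 4) = (4, 6) \in \mathbb{R}^2, \qquad 3\,(1, 2) = (3, 6) \in \mathbb{R}^2,$$
$$(1, 2) + (0, 0) = (1, 2), \qquad (1, 2) + (-1, -2) = (0, 0).$$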
These concepts and rules apply to a great number of spaces, such as the familiar real numbers $\mathbb{R}$ and vectors $\mathbb{R}^n$, but they also apply to sets such as polynomials, continuous functions, and matrices.

| Space | Closure under $+$ | Closure under scalar mult. | Zero vector | Additive inverses |
|---|---|---|---|---|
| $\mathbb{R}$ | ✓ | ✓ | ✓ | ✓ |
| $\mathbb{R}^n$ | ✓ | ✓ | ✓ | ✓ |
| Continuous functions | ✓ | ✓ | ✓ | ✓ |
| Matrices | ✓ | ✓ | ✓ | ✓ |
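To make the matrix row concrete, here is a small illustrative check (not worked in the lecture) with $2 \times 2$ matrices under entrywise addition and scalar multiplication:
$$\begin{pmatrix} 1 & 0 \\ 2 & 3 \end{pmatrix} + \begin{pmatrix} 0 & 1 \\ -2 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 0 & 4 \end{pmatrix}, \qquad 2 \begin{pmatrix} 1 & 0 \\ 2 & 3 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 4 & 6 \end{pmatrix},$$
with the zero matrix serving as the zero element and $-A$ as the additive inverse of $A$.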
Not everything is a vector space – even small differences can matter. Consider the set of polynomials of degree exactly $n$,
$$P = \{\, p(x) : \deg p = n \,\}.$$
Consider, for $n = 2$, the polynomials $p(x) = x^2 + 1$ and $q(x) = x^2 + x$. Both are of degree $2$ and their sum $p + q = 2x^2 + x + 1$ is also of degree $2$. On the other hand, the polynomials $x^2$ and $-x^2$ are also both of degree $2$, and their sum is the zero polynomial, which is not of degree $2$. Thus the set $P$ is not closed under addition, and so it is not a vector space.
Consider next the matrix equation
$$A\vec{x} = \vec{0}.$$
Call the solution set
$$N = \{\, \vec{x} : A\vec{x} = \vec{0} \,\}.$$
Does this set have the properties of a vector space? That is, do the solutions to the above matrix equation form a vector space? We will soon see that they do.
Let $\vec{u}, \vec{v} \in N$ and take $c$ to be a scalar.

1. Additive closure: Observe that
$$A(\vec{u} + \vec{v}) = A\vec{u} + A\vec{v} = \vec{0} + \vec{0} = \vec{0}.$$
Thus $\vec{u} + \vec{v} \in N$.

2. Scalar multiplication:
$$A(c\vec{u}) = c\,A\vec{u} = c\,\vec{0} = \vec{0}.$$
Thus $c\vec{u} \in N$.

3. Zero element: If $\vec{x} = \vec{0}$, then $A\vec{x} = \vec{0}$ for any matrix $A$. Thus $\vec{0} \in N$, so $N$ will always have a zero element.

4. Additive inverses: If $\vec{u} \in N$, then $A(-\vec{u}) = -A\vec{u} = \vec{0}$, and so $-\vec{u} \in N$.
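As a concrete illustration, with a matrix chosen arbitrarily for this sketch (not taken from the lecture), let
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}, \qquad A\vec{x} = \vec{0} \iff x_1 + 2x_2 = 0.$$
The solutions are exactly the multiples of $(-2, 1)$; for example $(-2, 1)$ and $(-4, 2)$ are solutions, and so is their sum $(-6, 3)$, since $-6 + 2(3) = 0$.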
Operations and Properties of Vector Spaces
There are several additional properties, beyond the four above, that are important to the structure of vector spaces.
1. $V$ is closed under $+$: If $\vec{u} \in V$ and $\vec{v} \in V$, then $\vec{u} + \vec{v} \in V$. In other words, $V$ is closed under vector addition.

2. $V$ is closed under scalar multiplication: If $c$ is a scalar in $\mathbb{R}$ or $\mathbb{C}$, and $\vec{v} \in V$, then $c\vec{v} \in V$.

3. Zero element: There exists some $\vec{0} \in V$ such that $\vec{v} + \vec{0} = \vec{v}$ for all $\vec{v} \in V$.

4. Additive inverses: For every $\vec{v} \in V$, there exists some $-\vec{v} \in V$ such that $\vec{v} + (-\vec{v}) = \vec{0}$.

5. Commutativity of addition: $\vec{u} + \vec{v} = \vec{v} + \vec{u}$ for all $\vec{u}, \vec{v} \in V$.

6. Associativity: $(\vec{u} + \vec{v}) + \vec{w} = \vec{u} + (\vec{v} + \vec{w})$.

7. For scalars $a, b$: $a(b\vec{v}) = (ab)\vec{v}$.

8. Distributivity: $a(\vec{u} + \vec{v}) = a\vec{u} + a\vec{v}$.

9. Distributivity: $(a + b)\vec{v} = a\vec{v} + b\vec{v}$.
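As a quick numerical sanity check of the two distributivity properties in $\mathbb{R}^2$ (an illustrative computation, not from the lecture):
$$2\big((1, 0) + (0, 3)\big) = 2\,(1, 3) = (2, 6) = (2, 0) + (0, 6) = 2(1, 0) + 2(0, 3),$$
$$(2 + 3)(1, 2) = 5\,(1, 2) = (5, 10) = (2, 4) + (3, 6) = 2(1, 2) + 3(1, 2).$$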
Subspaces
Let $W \subseteq V$ be a subset such that

1. $W$ is non-empty.

2. $W$ is a vector space in its own right with the operations of $V$.

Then $W$ is called a subspace of $V$.
Given a properly selected subset of a vector space, many of the properties of the vector space carry over to the subspace. They carry over precisely because the operations are the same in both sets.
Let us look at some examples of subspaces. Without naming a particular vector space, we can say some things already:
Question:
Is $\{\vec{0}\}$ a subspace? Yes.
Question:
Is $V$ itself a subspace? Yes.
Let us consider a more complicated example. Consider $\mathbb{R}^2$. Is the $x$-axis a vector space? Call this set $W$:
$$W = \{\, (x, 0) : x \in \mathbb{R} \,\}.$$
Then if $\vec{u} = (u_1, 0) \in W$ and $\vec{v} = (v_1, 0) \in W$,
$$\vec{u} + \vec{v} = (u_1 + v_1,\ 0) \in W, \qquad c\vec{u} = (c u_1,\ 0) \in W.$$
So $W$ is a subspace of $\mathbb{R}^2$. The $y$-axis is also a subspace, by the same argument.
Consider the following sets:
$$L_1 = \{\, (x, y) : y = mx \,\}, \qquad L_2 = \{\, (x, y) : y = mx + b \,\},$$
where the slope $m$ and the intercept $b \neq 0$ are fixed. Are they vector spaces?
The set $L_1$ is simply a line. It has slope $m$ and passes through the origin. Let $\vec{u}, \vec{v} \in L_1$. Then for some $x_1, x_2 \in \mathbb{R}$,
$$\vec{u} = (x_1,\ m x_1), \qquad \vec{v} = (x_2,\ m x_2).$$
Then
$$\vec{u} + \vec{v} = \big(x_1 + x_2,\ m(x_1 + x_2)\big) \in L_1.$$
Thus $L_1$ is closed under addition. Furthermore,
$$c\vec{u} = \big(c x_1,\ m(c x_1)\big) \in L_1,$$
so $L_1$ is closed under scalar multiplication.
Next consider $L_2$. It is also a line, but we can note immediately that it does not contain the zero vector. What if we append a zero element to $L_2$? That is, let $L_2' = L_2 \cup \{\vec{0}\}$. We find that there are still problems – consider for instance the two vectors
$$\vec{u} = (0,\ b), \qquad \vec{v} = (1,\ m + b),$$
both are clearly members of $L_2'$, but their sum is not:
$$\vec{u} + \vec{v} = (1,\ m + 2b), \qquad m + 2b \neq m(1) + b \ \text{ since } b \neq 0.$$
Since $\vec{u} + \vec{v}$ is neither on the line $L_2$ nor equal to $\vec{0}$, we have $\vec{u} + \vec{v} \notin L_2'$. Thus $L_2'$ is not a vector space, and so it is not a subspace of $\mathbb{R}^2$.
Question:
What are all the subspaces of $\mathbb{R}^2$?
Guess: $\{\vec{0}\}$, $\mathbb{R}^2$ itself, and any line passing through the origin.
Question:
What about $\mathbb{R}^3$ and beyond?
A good place to start is with linear objects. What kind of linear objects exist in $\mathbb{R}^3$? We know of lines and planes.
Consider
$$W = \{\, (x, y, z) \in \mathbb{R}^3 : x + 2y + 3z = 0 \,\}.$$
Is $W$ a subspace of $\mathbb{R}^3$? If we observe that $W$ is the set of solutions to a homogeneous system, then it must be, from our earlier observations.
Next consider that the defining equation can be rewritten in terms of a dot product:
$$x + 2y + 3z = (1, 2, 3) \cdot (x, y, z) = 0.$$
Thus $W$ is equivalently
$$W = \{\, \vec{x} \in \mathbb{R}^3 : (1, 2, 3) \cdot \vec{x} = 0 \,\};$$
in other words, $W$ is the set of vectors orthogonal to $(1, 2, 3)$.
Note that

1. $(1, 2, 3) \cdot \vec{0} = 0$, so $\vec{0} \in W$.

2. Let $\vec{u}, \vec{v} \in W$. Then
$$(1, 2, 3) \cdot (\vec{u} + \vec{v}) = (1, 2, 3) \cdot \vec{u} + (1, 2, 3) \cdot \vec{v} = 0 + 0 = 0,$$
so $\vec{u} + \vec{v} \in W$.

3. $(1, 2, 3) \cdot (c\vec{u}) = c\,\big((1, 2, 3) \cdot \vec{u}\big) = c \cdot 0 = 0$, so $c\vec{u} \in W$.
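For this normal vector, a quick numeric check (the particular vectors below are illustrative choices, not from the lecture): $(2, -1, 0)$ and $(3, 0, -1)$ both lie in $W$, and so does their sum $(5, -1, -1)$:
$$(1, 2, 3) \cdot (2, -1, 0) = 0, \qquad (1, 2, 3) \cdot (3, 0, -1) = 0, \qquad (1, 2, 3) \cdot (5, -1, -1) = 5 - 2 - 3 = 0.$$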
The vector $(1, 2, 3)$ was chosen arbitrarily, and in fact any vector $\vec{a} \in \mathbb{R}^3$ would induce a vector space composed of the vectors orthogonal to it. Without proof, we seem to be finding that the subspaces of $\mathbb{R}^3$ are the zero subspace, lines through the origin, planes through the origin, and the whole of $\mathbb{R}^3$ itself.
Consider again the set of polynomials of degree $n$ or less:
$$P_n = \{\, a_0 + a_1 x + \cdots + a_n x^n : a_0, \ldots, a_n \in \mathbb{R} \,\}.$$
Consider the subset $W \subset P_n$ of polynomials with zero constant term,
$$W = \{\, p \in P_n : a_0 = 0 \,\}.$$
This set contains elements such as $x$, $x^2 - 2x$, and $x^n$.
Is $W$ a subspace of $P_n$? Let $p, q \in W$. Then $p$ and $q$ both have zero constant term, and hence so do $p + q$ and $cp$; thus $W$ is closed under addition and scalar multiplication, and it is a subspace.
Consider that we can identify a polynomial of degree $n$ or less with a vector of length $n + 1$, its list of coefficients:
$$a_0 + a_1 x + \cdots + a_n x^n \ \longleftrightarrow\ (a_0, a_1, \ldots, a_n) \in \mathbb{R}^{n+1}.$$
The set $W$ can similarly be represented:
$$W \ \longleftrightarrow\ \{\, (0, a_1, \ldots, a_n) : a_i \in \mathbb{R} \,\}.$$
The two spaces $P_n$ and $\mathbb{R}^{n+1}$ seem to behave very similarly. In fact, there is a term, isomorphism, which denotes that their structures are indeed essentially identical. The point of studying this is that these two seemingly different spaces behave in the same way. It is thus beneficial to study a generic model rather than only sticking to one space, so that the concepts learned about vector spaces can be more widely applied.
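For instance (an illustrative computation, not from the lecture), with $n = 1$ the identification sends polynomial addition to vector addition and scalar multiplication to scalar multiplication:
$$(1 + 2x) + (3 - x) = 4 + x \ \longleftrightarrow\ (1, 2) + (3, -1) = (4, 1), \qquad 3(1 + 2x) = 3 + 6x \ \longleftrightarrow\ 3\,(1, 2) = (3, 6).$$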