Lecture 12
Recall
1. Vector space: a set $V$ with two operations, addition and scalar multiplication, satisfying ten axioms (listed below for reference).
2. Subspace: a subset of a vector space that is a vector space on its own.
Recall that to ensure that a subset $W$ of a vector space $V$ is a subspace, we must have that
1. $W$ is nonempty.
2. $W$ is closed under addition.
3. $W$ is closed under scalar multiplication.
These last two items can be understood as closure under linear combinations. That is, $W$ is a subspace if for every $u, v \in W$ and $c, d$ (scalars), we have $cu + dv \in W$.
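For instance (a quick illustration, not an example from the lecture), take the line $W = \{(x, y) \in \mathbb{R}^2 : y = 3x\}$. It is nonempty, since $(0, 0) \in W$, and for any $(x_1, 3x_1), (x_2, 3x_2) \in W$ and scalars $c, d$,
$$c(x_1, 3x_1) + d(x_2, 3x_2) = \big(cx_1 + dx_2,\ 3(cx_1 + dx_2)\big) \in W,$$
so $W$ is closed under linear combinations and is therefore a subspace of $\mathbb{R}^2$.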
Defining Operations
The usual definitions for addition and multiplication are not the only ones possible. Let us look at some examples. Consider the set
with addition and scalar multiplication defined as
Question:
Is this set, with these operations, a vector space or not?
To answer this, we must check each of the requirements of a vector space.
Additive closure
Since this is the usual definition of addition, the set is closed under addition.
Scalar multiplication
To check whether the scalar multiplication defined here satisfies the axioms, let and . Then and since , then , hence is closed under scalar multiplication.
Zero vector
Claim 1.
The zero vector for the usual addition is also the zero vector here.
Proof.
Because the addition has not been changed, the usual zero vector $0$ still satisfies $v + 0 = v$ for every $v$ in the set. ∎
Since addition here is the same as the usual addition, if there is a problem with these definitions that prevents this set from being a vector space, it must lie with the scalar multiplication.
Distributivity
A vector space must satisfy distributivity:
To check whether this holds in this space, let and . Then
Since the two sides will not always be equal, the distributivity axiom fails, and we can conclude that this set with these operations is not a vector space.
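The particular scalar multiplication used in this example is not reproduced above; one standard operation that fails in exactly this way (an illustrative assumption, not necessarily the lecture's choice) is to keep the usual addition on $\mathbb{R}^2$ and define
$$c \cdot (x, y) = (c^2 x,\ c^2 y).$$
Closure and the zero vector $(0, 0)$ are fine, but
$$(c + d) \cdot (x, y) = \big((c + d)^2 x,\ (c + d)^2 y\big), \qquad c \cdot (x, y) + d \cdot (x, y) = \big((c^2 + d^2) x,\ (c^2 + d^2) y\big),$$
and $(c + d)^2 \neq c^2 + d^2$ in general (take $c = d = 1$), so distributivity over scalar addition fails.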
Though this set with these operations is not a vector space, might other unusual-looking operations still produce a vector space structure?
Consider the set
$$V = \{ x \in \mathbb{R} : x > 0 \},$$
the positive real numbers. For clarity, define addition and scalar multiplication as follows:
$$x \oplus y = xy, \qquad c \odot x = x^{c},$$
where $x, y \in V$ (the vector space) and $c \in \mathbb{R}$ (a scalar).
Question:
Is $V$, with these operations, a vector space?
1. Closure under $\oplus$
Let $x, y \in V$ (so $x, y > 0$). Then
$$x \oplus y = xy > 0.$$
Therefore $x \oplus y \in V$.
2. Closure under $\odot$
Let $x \in V$ and $c \in \mathbb{R}$. Then since $x > 0$,
$$c \odot x = x^{c} > 0.$$
Therefore $c \odot x \in V$.
3. Zero element
Suppose some element $z \in V$ is the zero element. Then for any $x \in V$,
$$x \oplus z = xz = x,$$
which forces $z = 1$. Thus $1$ would have to be the zero element. Since $1 > 0$, we have $1 \in V$, and so the zero element is in fact $1$.
4. Inverse of an element
Let $x \in V$. Then we want some $y \in V$ such that
$$x \oplus y = 1,$$
where $1$ is the zero vector. Remember that we have just found that $1$ is the zero element, rather than the usual $0$. Carrying out these operations, we have that
$$x \oplus y = xy = 1, \qquad \text{so } y = \tfrac{1}{x}.$$
Since $x > 0$, thus $\tfrac{1}{x} > 0$, so $\tfrac{1}{x} \in V$. That is, for each $x \in V$, the element $\tfrac{1}{x}$ is its additive inverse.
5. To check the axiom
$$(c + d) \odot x = (c \odot x) \oplus (d \odot x),$$
where $c, d \in \mathbb{R}$ and $x \in V$ (so $x > 0$), start with the left-hand side:
$$(c + d) \odot x = x^{c + d} = x^{c} \, x^{d} = (c \odot x) \oplus (d \odot x).$$
We arrive at the right-hand side, indicating the axiom holds.
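A similar one-line computation (not written out in these notes) verifies distributivity over $\oplus$:
$$c \odot (x \oplus y) = (xy)^{c} = x^{c} y^{c} = (c \odot x) \oplus (c \odot y).$$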
It turns out that this set does in fact satisfy all of the vector space axioms. That such strange operations work brings to light that our standard vector spaces are by no means the only ones.
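As a quick numerical sanity check (a sketch, not part of the lecture, assuming the operations $x \oplus y = xy$ and $c \odot x = x^{c}$ as written above), one can sample random elements and scalars and test several of the axioms directly:

```python
# Numerical spot-check of the axioms for V = (0, infinity) with
# x (+) y := x*y  and  c (.) x := x**c   (operations as reconstructed above).
import math
import random

def vadd(x, y):
    """Vector addition on V: x (+) y = x*y."""
    return x * y

def smul(c, x):
    """Scalar multiplication on V: c (.) x = x**c."""
    return x ** c

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(0.1, 10.0), random.uniform(0.1, 10.0)
    c, d = random.uniform(-3.0, 3.0), random.uniform(-3.0, 3.0)
    assert vadd(x, y) > 0                                    # closure under (+)
    assert smul(c, x) > 0                                    # closure under (.)
    assert math.isclose(vadd(x, y), vadd(y, x))               # commutativity
    assert math.isclose(vadd(x, 1.0), x)                      # 1 acts as the zero vector
    assert math.isclose(vadd(x, 1.0 / x), 1.0)                # 1/x is the additive inverse
    assert math.isclose(smul(c + d, x), vadd(smul(c, x), smul(d, x)))        # (c+d)(.)x
    assert math.isclose(smul(c, vadd(x, y)), vadd(smul(c, x), smul(c, y)))   # c(.)(x(+)y)
    assert math.isclose(smul(1.0, x), x)                      # 1 (.) x = x
print("All sampled axiom checks passed.")
```

Random sampling does not prove the axioms, of course; the algebraic checks above do.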
Question:
Is ?
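A useful observation here (not stated in the notes): the map $x \mapsto \ln x$ converts the operations of $V$ into the usual ones on $\mathbb{R}$, since
$$\ln(x \oplus y) = \ln(xy) = \ln x + \ln y, \qquad \ln(c \odot x) = \ln(x^{c}) = c \ln x,$$
so $V$ with $\oplus$ and $\odot$ behaves exactly like $\mathbb{R}$ with its usual addition and scalar multiplication, just relabeled.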
Subspaces of $\mathbb{R}^2$
You may recall that all the subspaces of $\mathbb{R}^2$ are:
1. The zero subspace $\{0\}$.
2. $\mathbb{R}^2$ itself.
3. All straight lines through the origin.
To see why this is the case, let us outline the proof:
Proof.
Let $H$ be a subspace of $\mathbb{R}^2$. Note that if $H = \{0\}$, then $H$ is indeed a subspace. Otherwise, let $v \in H$ be such that $v \neq 0$. Then, since $H$ is a subspace, $cv \in H$ for all $c \in \mathbb{R}$. Thus $H$ contains the straight line along $v$ that passes through the origin.
If $H$ contains no vector outside this line, then $H$ is simply that straight line. Otherwise, assume there exists some non-zero $w \in H$ that is linearly independent of $v$.
Note then that $\operatorname{Span}\{v, w\} = \mathbb{R}^2$, and hence $H = \mathbb{R}^2$. (Why?) ∎
The above proof is not quite complete. The question is: why can we guarantee that $\operatorname{Span}\{v, w\} = \mathbb{R}^2$?
What this amounts to is whether, for any vector $u \in \mathbb{R}^2$, there exist scalars $c_1, c_2$ such that
$$c_1 v + c_2 w = u.$$
This is a linear system with corresponding augmented matrix looking like
$$\left[\begin{array}{cc|c} v_1 & w_1 & u_1 \\ v_2 & w_2 & u_2 \end{array}\right],$$
where $v = (v_1, v_2)$, $w = (w_1, w_2)$, and $u = (u_1, u_2)$.
Since $w$ is not a scalar multiple of $v$, that is, they are linearly independent, the coefficient matrix portion of this augmented matrix is invertible. Thus its columns will span $\mathbb{R}^2$, so the system has a solution for every $u$.
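To fill in this last step (a detail not spelled out in the notes): linear independence of $v$ and $w$ means
$$\det \begin{bmatrix} v_1 & w_1 \\ v_2 & w_2 \end{bmatrix} = v_1 w_2 - v_2 w_1 \neq 0,$$
so for every $u$ the system has the unique solution
$$\begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \frac{1}{v_1 w_2 - v_2 w_1} \begin{bmatrix} w_2 & -w_1 \\ -v_2 & v_1 \end{bmatrix} \begin{bmatrix} u_1 \\ u_2 \end{bmatrix},$$
and hence every $u \in \mathbb{R}^2$ lies in $\operatorname{Span}\{v, w\}$.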