
Let $(\mathbb R^n,d)$ be a Euclidean space, where $d$ is the Euclidean distance function.

Let two vectors $(a_1,...,a_n)$ and $(b_1,...,b_n)$ be called orthogonal iff $\displaystyle \sum_{i=1}^n a_ib_i = 0$.

Let a vector $(v_1,...,v_n)$ be called a unit vector iff $d\big((0,...,0),(v_1,...,v_n)\big)=1$.

How can one prove that a set $S$ of $n$ vectors such that all $s \in S$ are pairwise orthogonal and have unit length forms a basis for this vector space?

A question like this has been asked already, but the answer there is a hint rather than a solution, I can't follow it, and it's pointless to expect a reply when responding to a three-year-old comment.

  • Under what conditions does a set of vectors form a basis? (2017-02-23)
  • @Student Linearly independent and spans the space. I'm not sure, but I also think that if it has $n$ linearly independent vectors, it's a basis too. (2017-02-23)
  • Yes, so if you can show that the set of orthogonal vectors (they do not even need to have length 1) forms a linearly independent set, then you are done. Note that the dimension of $\mathbb{R}^n$ over $\mathbb{R}$ is exactly $n$, so if you find a linearly independent set of $n$ vectors you are done. Also, could you edit your question? Orthogonality means that the sum you wrote equals zero, but I cannot add this myself since the edit is less than 6 characters. (2017-02-23)
  • @Student That much I could gather from the previous thread I mentioned. I don't know how to arrive at the proof itself. (2017-02-23)
  • I will write an answer for you. Are you aware of the inner product of vectors? (2017-02-23)
  • @Student Yes, although only that it equals $\displaystyle \sum_{i=1}^n x_iy_i$ and $d(0,x)\,d(0,y)\cos\theta$. (2017-02-23)
  • Suppose $\sum_{i=1}^n a_i v_i = 0$ for some $a_i$, and assume that $\langle v_i, v_j \rangle = \delta_{ij}$. Then by linearity you get $\langle \sum_{i=1}^n a_i v_i, v_j \rangle = \sum_{i=1}^n a_i \langle v_i, v_j \rangle = \sum_{i=1}^n a_i \delta_{ij} = a_j = 0$, hence proved. (2017-02-23)

2 Answers


Let us denote the set of $n$ orthonormal vectors in $\mathbb{R}^n$ as follows: $$S = \{e_1, e_2, \ldots, e_n\}$$ where $e_i = (e_{1i}, e_{2i}, \ldots, e_{ni}) \in \mathbb{R}^n$ and $e_i, e_j$ are orthogonal whenever $i \neq j$.

Consider the following linear combination: $$0 = a_1e_1 + a_2 e_2 + \ldots + a_ne_n$$ where the $a_i \in \mathbb{R}$ and $0 = (0,0, \ldots, 0)$. If we can show that all $a_i$ are zero, we find that the vectors in $S$ are linearly independent. Let us use the orthogonality of the vectors in this set to do so. Consider the following inner product (I will denote the inner product by $\cdot$): $$e_i \cdot (a_1e_1 + a_2 e_2 + \ldots + a_ne_n).$$ Because of linearity, we find that this is equal to $$a_1 (e_i \cdot e_1) + \ldots + a_i (e_i \cdot e_i) + \ldots + a_n (e_i \cdot e_n).$$ Because $e_i$ is orthogonal to every $e_j$ with $i \neq j$, we have that $e_i \cdot e_j = 0$ if and only if $i \neq j$. For $e_i \cdot e_i$, we have that this is non-zero (since the vectors in $S$ are orthonormal, we have that $e_i \cdot e_i = 1$; try to verify this yourself). Hence we find that $$a_1 (e_i \cdot e_1) + \ldots + a_i (e_i \cdot e_i) + \ldots + a_n (e_i \cdot e_n) = 0 + \ldots + a_i + \ldots + 0 = a_i.$$

However, we have that $e_i \cdot (0,0, \ldots, 0) = 0$, so we find that $a_i = 0$ and this holds for every $i \in \{1, \ldots, n\}$. So the vectors in $S$ are linearly independent.

Since the dimension of $\mathbb{R}^n$ is $n$, any $n$ linearly independent vectors automatically span the space, so $S$ forms a basis of this vector space.

$\textbf{Edit:}$ since this proof only uses that the $e_i$ have norm $1$ to show that $e_i \cdot e_i \neq 0$, the same argument shows that any set of nonzero orthogonal vectors is linearly independent!
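
To make the computation concrete, here is a minimal numerical sketch (not part of the proof above; it uses NumPy, and the orthonormal set is generated via a QR factorization purely for illustration):

```python
import numpy as np

# Build an orthonormal set in R^3 for the demo: the columns of Q from a QR
# factorization of a random matrix are orthonormal.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
e = [Q[:, i] for i in range(3)]  # the orthonormal vectors e_1, e_2, e_3

# Form a linear combination v = a_1 e_1 + a_2 e_2 + a_3 e_3 ...
a = np.array([2.0, -1.0, 0.5])
v = a[0] * e[0] + a[1] * e[1] + a[2] * e[2]

# ... and recover each coefficient as an inner product, exactly as in the
# proof: e_i . v = a_i, because e_i . e_j is 1 when i = j and 0 otherwise.
recovered = np.array([ei @ v for ei in e])
print(np.allclose(recovered, a))  # True

# In particular, if v were the zero vector, every a_i = e_i . v = 0.
```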

  • "we have that $e_i \cdot (0,0,\ldots,0) = 0$, so we find that $a_i = 0$" How do we find that exactly? (2017-02-23)
  • Use your definition of the inner product: you find that $e_i \cdot (0,0, \ldots, 0) = \sum_{j=1}^n e_{ji} \times 0 = 0$. (2017-02-23)
  • I know how to get this result; I don't know how $a_i = 0$ follows. (2017-02-23)
  • Well, $0 = e_i \cdot (0,0,\ldots,0) = a_i$. (2017-02-23)
  • I computed $e_i \cdot (\text{linear combination})$, which was $a_i$, but the linear combination was zero, so $a_i = 0$. (2017-02-23)

Here’s another way to show that an orthonormal set of vectors $\{u_i\}$ is linearly independent. Assemble them into a matrix $U$. The elements of the product $U^TU$ are the pairwise inner products $(u_i,u_j)=\delta_{ij}$ of these vectors, so $U^TU=I$ and thus $\det(U^TU)=(\det{U^T})(\det U)=(\det U)^2=1$. This means that $U$ is nonsingular, so its rows and columns are linearly independent.
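
A quick numerical illustration of this argument (a NumPy sketch; the orthonormal columns are generated via a QR factorization, an assumption of convenience for the demo):

```python
import numpy as np

# Put an orthonormal set of vectors in the columns of U
# (generated here via a QR factorization of a random matrix).
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# The entries of U^T U are the pairwise inner products (u_i, u_j) = delta_ij,
# so U^T U is the identity matrix ...
print(np.allclose(U.T @ U, np.eye(4)))  # True

# ... hence det(U)^2 = det(U^T U) = 1, so U is nonsingular and its columns
# (our orthonormal vectors) are linearly independent.
print(np.isclose(np.linalg.det(U) ** 2, 1.0))  # True
```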