2

Let $V$ be a finite-dimensional vector space over $\mathbb{C}$ or $\mathbb{R}$.

Let $B=(v_{1},v_{2},\ldots,v_{n})$ be a basis of $V$.

How can I prove that there is an inner product $\langle\cdot,\cdot\rangle$ on $V$ with respect to which $B$ is an orthonormal basis?

Thank you

  • 4
    **Hint:** By bilinearity (or sesquilinearity), defining an inner product on the basis (where the definition is forced upon us by the orthonormality condition) already defines it on the whole space. Now check that this forced expression actually gives an inner product. (2011-05-12)
  • 0
    @Theo: I think you should post it as an answer. (2011-05-12)
  • 0
    @Dennis: Actually, I was going to elaborate on this hint in case no one else did, but since you asked me to post it, I'm doing that now. (2011-05-12)
  • 0
    I'd really appreciate an answer that doesn't involve bilinearity, even though I know what it is. (2011-05-12)
  • 1
    Nir: I'm not *invoking* bilinearity in my answer below (if you read it closely), but it is the *whole point*! (2011-05-12)
  • 0
    Why the vote to close as not a real question? I'd really like to see an explanation. (2011-05-12)
  • 3
    The simplest way is to fix an isomorphism $T\colon V\to F^n$, where $F$ is the ground field, that maps $B$ to the standard basis of $F^n$. Then define the inner product on $V$ by $\langle v,w\rangle_V = \langle T(v),T(w)\rangle_{F^n}$. Because $B$ is mapped to an orthonormal basis of $F^n$, this inner product makes $B$ into an orthonormal basis. A small coordinate-based sketch of this construction is given below. (2011-05-12)
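A minimal NumPy sketch of the construction in the last comment, assuming we take $V = \mathbb{C}^n$ with the $v_i$ given as concrete column vectors; the names `basis`, `coords`, and `pulled_back_inner` are illustrative, not part of the thread:

```python
import numpy as np

# Columns of `basis` are the basis vectors v_1, ..., v_n of V = C^n
# (any invertible matrix will do for this illustration).
basis = np.array([[1, 1],
                  [0, 1j]], dtype=complex)

def coords(v):
    """T(v): the coordinate vector of v with respect to the basis B."""
    return np.linalg.solve(basis, v)

def pulled_back_inner(v, w):
    """<v, w>_V := <T(v), T(w)>_{F^n}, the standard inner product of coordinates."""
    return np.vdot(coords(w), coords(v))   # np.vdot conjugates its first argument

# B is orthonormal for this inner product: its Gram matrix is the identity.
n = basis.shape[1]
gram = np.array([[pulled_back_inner(basis[:, k], basis[:, l]) for l in range(n)]
                 for k in range(n)])
assert np.allclose(gram, np.eye(n))
```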

2 Answers

12

Meta: I'm adding a second answer that only performs the necessary calculations and doesn't mention any sophisticated words. Nir, I'm sorry for having aimed a bit too high in the hope of telling you something useful.

Let $x = x_{1} v_{1} + \cdots + x_{n} v_{n}$ and $y = y_{1} v_{1} + \cdots + y_{n} v_{n}$ be any two vectors of $V$, written in the basis $(v_{1},\ldots,v_{n})$ (I'll treat the complex case). Define the expression $$\langle x, y \rangle = \sum_{i = 1}^{n} x_{i} \, \overline{y_i}.$$ I claim that this defines a scalar product such that $(v_{1},\ldots,v_{n})$ is orthonormal.
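As an aside, here is a minimal NumPy sketch of this formula acting on coordinate vectors (the name `ip` is just for illustration):

```python
import numpy as np

def ip(x, y):
    """<x, y> = sum_i x_i * conj(y_i), applied to coordinate vectors
    with respect to the basis (v_1, ..., v_n)."""
    x, y = np.asarray(x, dtype=complex), np.asarray(y, dtype=complex)
    return np.sum(x * np.conj(y))

# Example usage with two arbitrary coordinate vectors:
x = [1 + 2j, 3, -1j]
y = [2, 1 - 1j, 4]
print(ip(x, y))   # (5+3j)
```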

To see that $\langle \cdot, \cdot \rangle$ is an inner product, we need to check that for all $x,y,z \in V$ and $\lambda \in \mathbb{C}$:

  1. $\overline{\langle y, x \rangle} = \langle x, y \rangle$
  2. $\langle x + z, y \rangle = \langle x, y \rangle + \langle z, y \rangle$.
  3. $\langle \lambda x, y \rangle = \lambda\,\langle x,y \rangle$.
  4. $\langle x, x \rangle \geq 0$.
  5. $\langle x, x \rangle = 0$ if and only if $x = 0$.

I'm going to do 1., 2. and 4., and leave 3. and 5. to you. Let's do 1. now: $$ \overline{\langle y, x \rangle} = \overline{\sum_{i = 1}^{n} y_{i} \, \overline{x_{i}}} = \sum_{i = 1}^{n} \overline{y_{i} \, \overline{x_{i}}} = \sum_{i = 1}^{n} \overline{y_{i}} \, x_{i} = \sum_{i = 1}^{n} x_{i} \, \overline{y_{i}} = \langle x, y \rangle.$$ Let's do 2.: $$ \langle x + z, y \rangle = \sum_{i =1}^{n} (x_{i} + z_{i})\,\overline{y_{i}} = \sum_{i=1}^{n} \left( x_{i}\,\overline{y_{i}} + z_{i}\,\overline{y_{i}} \right) = \sum_{i=1}^{n} x_{i}\,\overline{y_{i}} + \sum_{i = 1}^{n} z_{i} \, \overline{y_{i}} = \langle x,y \rangle + \langle z, y \rangle.$$ Let's do 4.: $$ \langle x, x \rangle = \sum_{i = 1}^{n} x_{i} \, \overline{x_{i}} = \sum_{i=1}^{n} |x_{i}|^2 \geq 0$$ since $x_{i} \, \overline{x_{i}} = |x_{i}|^2 \geq 0$.
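These identities can also be spot-checked numerically for random complex vectors; a small sketch, assuming the formula above (a sanity check, not a proof):

```python
import numpy as np

ip = lambda a, b: np.sum(a * np.conj(b))   # the formula from above

rng = np.random.default_rng(0)
n = 5
x, y, z = (rng.normal(size=n) + 1j * rng.normal(size=n) for _ in range(3))
lam = 2.0 - 3.0j

assert np.isclose(np.conj(ip(y, x)), ip(x, y))               # 1. conjugate symmetry
assert np.isclose(ip(x + z, y), ip(x, y) + ip(z, y))         # 2. additivity in the first slot
assert np.isclose(ip(lam * x, y), lam * ip(x, y))            # 3. homogeneity in the first slot
assert ip(x, x).real >= 0 and np.isclose(ip(x, x).imag, 0)   # 4. positivity
```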

Then we need to check that $(v_{1},\ldots,v_{n})$ is an orthonormal basis. I'll do two cases: $$\langle v_1, v_1 \rangle = 1 \cdot \overline{1} + 0 \cdot \overline{0}+ \cdots + 0 \cdot \overline{0} = 1$$ $$\langle v_1, v_{2}\rangle = 1 \cdot \overline{0} + 0 \cdot \overline{1} + 0 \cdot \overline{0} + \cdots + 0 \cdot \overline{0} = 0.$$ This should be enough for you to check that $\langle v_{k}, v_{k} \rangle = 1$ and $\langle v_{k}, v_{l} \rangle = 0$ if $k \neq l$ (actually $k \lt l$ suffices in view of condition 1. above).
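In coordinates with respect to $(v_{1},\ldots,v_{n})$, the vector $v_{k}$ is the $k$-th standard unit vector, so the whole orthonormality check says that the matrix of values $\langle v_{k}, v_{l} \rangle$ is the identity; a short sketch of that check (names illustrative):

```python
import numpy as np

ip = lambda a, b: np.sum(a * np.conj(b))

n = 4
E = np.eye(n, dtype=complex)   # row k is the coordinate vector of v_{k+1}
gram = np.array([[ip(E[k], E[l]) for l in range(n)] for k in range(n)])
assert np.allclose(gram, np.eye(n))   # <v_k, v_l> = 1 if k == l, else 0
```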

  • 0
    Thank you so much. Sometimes simplicity is the answer. It was very helpful, and when I get to bilinearity or one of the terms you used in the previous answer, I'll be sure to go back and understand it more. (2011-05-12)
  • 0
    @Nir: I let myself get a little carried away. Don't worry if you don't understand the other answer yet, but in some time I hope you'll see what I was trying to tell you. (2011-05-12)
  • 0
    @Nir: One more thing: note that the formula I wrote down for the scalar product is the one you obtain if you make Arturo's comment on your question explicit (by taking the isomorphism that sends $(v_{1},\ldots,v_{n})$ to the standard basis of $\mathbb{C}^{n}$). (2011-05-13)
12

I was asked to post my hint as an answer. I'm going to elaborate a little, though.

The first thing to observe is that an inner product is completely determined by its values on a basis. Let me restrict to complex vector spaces (for me an inner product is linear in the first variable and anti-linear in the second one). So let $\langle\cdot,\cdot\rangle$ be an inner product on $\mathbb{C}^n$ and let $(v_1,\ldots,v_n)$ be a basis. Now if $x = x_{1} v_{1} + \cdots + x_{n}v_{n}$ and $y = y_{1} v_{1} + \cdots + y_{n} v_{n}$ are two arbitrary vectors, we get by using sesquilinearity that $$\langle x,y \rangle = \sum_{i,j = 1}^{n} \langle v_{i}, v_{j}\rangle\, x_{i} \, \bar{y}_{j}.$$ This shows that the values $a_{ij} = \langle v_{i},v_{j} \rangle$, $i,j = 1,\ldots,n$, determine the inner product entirely.
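A small sketch of this expansion, assuming we are handed the matrix of values $a_{ij} = \langle v_{i}, v_{j} \rangle$ together with the coordinate vectors of $x$ and $y$ (the name `ip_from_values` is illustrative):

```python
import numpy as np

def ip_from_values(A, x, y):
    """<x, y> = sum_{i,j} a_ij * x_i * conj(y_j), where A[i, j] = <v_i, v_j>
    and x, y are coordinate vectors with respect to (v_1, ..., v_n)."""
    A = np.asarray(A, dtype=complex)
    x = np.asarray(x, dtype=complex)
    y = np.asarray(y, dtype=complex)
    return x @ A @ np.conj(y)

# With A = identity this reduces to the standard formula sum_i x_i * conj(y_i):
n = 3
assert np.isclose(ip_from_values(np.eye(n), [1, 2j, 0], [1, 2j, 0]), 5)
```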

Let us consider the $n \times n$ matrix $A = (a_{ij})_{i,j =1}^{n}$ a bit more closely. First of all, notice that $$a_{ij} = \langle v_{i},v_{j} \rangle = \overline{\langle v_{j},v_{i} \rangle} = \overline{a_{ji}}$$ by the conjugate symmetry of the inner product. Written in matrix form, this is equivalent to $A^{\ast} = A$, where $A^{\ast}$ is the conjugate-transpose of $A$. In general, a matrix $A$ satisfying $A^{\ast} = A$ is called Hermitian. The second observation is that we can write the inner product as $$\langle x, y \rangle = x^{T} \cdot A \cdot \overline{y},$$ where $\cdot$ denotes the matrix product, $x^{T}$ is the row vector of coordinates of $x$, and $\overline{y}$ is the entrywise conjugate of $y$; expanding this product gives back the double sum above. Now the positive definiteness condition $\langle x, x \rangle \gt 0$ for all $x \neq 0$ can't be expressed as a simple entrywise condition on $A$ (but, for instance, the combination of diagonal dominance, positive diagonal entries and "Hermitian-ness" is sufficient - off-topic: is the ugly concoction "Hermitian-ness" used in the present case or even "Hermitianity"?! - be that as it may, the literature on positive definiteness contains a number of good conditions).

The point I'm heading at is the following exercise you may or may not want to do (I've done one half of it in the above and left you the easier part):

Exercise: If $A = (a_{ij})_{i,j = 1}^{n}$ is an $n \times n$ Hermitian positive definite matrix, then the expression $$ \langle x, y \rangle_{A} := x^{T} \cdot A \cdot \overline{y} = \sum_{i,j = 1}^{n} a_{ij}\,x_{i}\,\overline{y_{j}}$$ defines an inner product on $\mathbb{C}^{n}$, and conversely, given an inner product, we get a Hermitian positive definite matrix $A$ by the procedure described above.
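Here is a numerical sketch of the first half of the exercise for a randomly generated Hermitian positive definite matrix (illustrative names; a spot check of the defining properties, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Build a Hermitian positive definite A: M M^* is positive semidefinite,
# and adding the identity makes it strictly positive definite.
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = M @ M.conj().T + np.eye(n)
assert np.allclose(A, A.conj().T)            # Hermitian
assert np.all(np.linalg.eigvalsh(A) > 0)     # positive definite

def ip_A(x, y):
    """<x, y>_A = sum_{i,j} a_ij * x_i * conj(y_j)."""
    return np.asarray(x) @ A @ np.conj(np.asarray(y))

x, y = (rng.normal(size=n) + 1j * rng.normal(size=n) for _ in range(2))
assert np.isclose(np.conj(ip_A(y, x)), ip_A(x, y))              # conjugate symmetry
assert ip_A(x, x).real > 0 and np.isclose(ip_A(x, x).imag, 0)   # positivity
```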

Finally, I'm addressing your actual question, so let $(v_{1},\ldots,v_{n})$ be a basis of $\mathbb{C}^{n}$. The condition that this basis should be orthonormal with respect to a stipulated inner product $\langle \cdot, \cdot\rangle$ states that the matrix $A = (a_{ij})_{i,j=1}^{n} = (\langle v_{i},v_{j} \rangle)_{i,j=1}^{n}$ must be the $n \times n$-identity matrix $I_{n}$ (why?). Now it is rather straightforward to check that the expression

$$\langle x, y \rangle_{I_{n}} = \sum_{i = 1}^{n} x_{i} \overline{y_{i}}$$

actually is an inner product on $\mathbb{C}^{n}$ for which $v_{1},\dots,v_{n}$ is an orthonormal basis (note that I'm expressing $x,y$ in terms of the basis $(v_{1},\ldots,v_{n})$!).

The case of $\mathbb{R}^{n}$ is very similar: simply omit all the complex conjugates, replace the conjugate-transpose by the ordinary transpose, and replace Hermitian by symmetric.

  • 2
    I don't know of any standards, but if I were in your shoes, I'd go for "Hermitian-ness" (unless something better-sounding comes up). I'll upvote in 12 hours... (2011-05-12)
  • 0
    @J.M. Thanks for both things! (meta: this rep-cap is a bit annoying...) (2011-05-12)
  • 0
    Is sesquilinearity another word for bilinearity? (2011-05-12)
  • 1
    @Nir: Sesquilinear literally means $1 \frac{1}{2}$-linear (in Latin). It means linear in the first variable and *anti-linear* in the second one, where anti-linearity in the second variable means *additivity* $\langle x, y + z \rangle = \langle x,y \rangle + \langle x, z \rangle$ together with *anti-homogeneity* $\langle x, \lambda y \rangle = \bar{\lambda} \, \langle x, y \rangle$ - note the bar! It is the appropriate replacement for bilinearity of inner products in the complex case (so that $\langle i\cdot v_{1}, i\cdot v_{1} \rangle = 1$ instead of the $-1$ that bilinearity would give). (2011-05-12)
  • 0
    @Nir: Can you be a bit more specific as to what is unclear? Let's discuss it; I'm here for a while and I'll try to answer your queries. (2011-05-12)
  • 0
    @Theo: Thank you for the answer, but I have to admit it's too complicated and hard for me to understand. (2011-05-12)
  • 0
    @Theo: Ok, first, I'm not familiar with the terms you just wrote; I was looking, if it exists, for something much simpler. (2011-05-12)
  • 0
    @Nir: Okay, I understand. You asked about the complex case, so I did that one. May I ask you: if you think of the real case, replace sesquilinear by bilinear, read the first longer paragraph (up to the word "entirely"), and then skip to the part *after* the exercise. Do you understand what I'm saying there? By the way: what I call sesquilinear is sometimes called Hermitian. (2011-05-12)
  • 0
    Regarding sesquilinear vs. Hermitian, I got that. But, as I wrote, I'm asking for something that doesn't use bilinearity. If I find one, I'll let you know here. Thanks. (2011-05-12)
  • 0
    @Nir: But as I said, I *insist* that I'm *not using* bilinearity for the actual answer! The answer you're looking for *is* in the last paragraph (starting with Now...). I'm doing *two* things: first, I *start* with an inner product and produce a matrix (yes, here I'm using bilinearity). The second thing is just writing down a formula (the last one) which is *motivated* by all the considerations above. But the formula works directly (and I'm arguing why it is the *only possible one*), and you just need to check that it *defines an inner product* s.t. the basis $(v_{i})$ *is* orthonormal wrt it. (2011-05-12)