I am a little stuck on the following problem:
By using the Gram-Schmidt Orthogonalization, find an orthonormal basis for the subspace of $L^2[0,1]$ spanned by $1,x, x^2, x^3$.
OK, so I have defined:
$e_1 = 1$
I would then assume that we proceed as follows:
$e_2 = x - \frac{\langle x,1 \rangle}{\langle 1,1 \rangle} \cdot 1$
And we have:
$\langle x,1 \rangle = \int_{0}^{1} x \, dx = \frac{1}{2}$
And:
$\langle 1,1 \rangle = \int_{0}^{1} 1 \, dx = 1$
So:
$e_2 = x - \frac{1}{2}$
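In case it helps, here is a quick SymPy check I ran to make sure my arithmetic is right (the `inner` helper is just my own shorthand for the standard $L^2[0,1]$ inner product, not anything from the book):

```python
# Sanity check of the inner products above, assuming <f, g> = int_0^1 f(x) g(x) dx.
from sympy import symbols, integrate

x = symbols('x')

def inner(f, g):
    # <f, g> on L^2[0, 1]
    return integrate(f * g, (x, 0, 1))

print(inner(x, 1))   # 1/2, matches <x, 1> above
print(inner(1, 1))   # 1,   matches <1, 1> above

e2 = x - inner(x, 1) / inner(1, 1)   # x - 1/2
print(inner(e2, 1))  # 0, so my e2 really is orthogonal to e1 = 1
```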
But already at this point, my answer differs from the one in my book. According to the book, the correct answer here should be:
$e_2 = \sqrt{12}(x - \frac{1}{2})$
This, of course, makes all of my subsequent answers incorrect as well.
So what am I doing wrong here? I simply don't see where the factor of $\sqrt{12}$ comes from. Any help would be greatly appreciated!