Linear Algebra with Applications by Steven J. Leon, p.257:
Theorem 5.5.2: Let $\{\textbf{u}_1, \textbf{u}_2, \ldots, \textbf{u}_n\}$ be an orthonormal basis for an inner product space $V$. If $\textbf{v} = \sum_{i=1}^{n} c_i \textbf{u}_i$, then $c_i = \langle \textbf{v}, \textbf{u}_i \rangle$.
I don't know if I need to include the proof, but it's short, so here it is:
Definition: $\delta_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$
Proof: $\langle \textbf{v}, \textbf{u}_i \rangle = \left< \sum_{j=1}^{n} c_j \textbf{u}_j, \textbf{u}_i \right> = \sum_{j=1}^{n} c_j \langle \textbf{u}_j, \textbf{u}_i \rangle = \sum_{j=1}^{n} c_j \delta_{ij} = c_i$
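To make sure I'm reading the theorem correctly, here's a quick numerical sanity check I wrote (a minimal sketch in NumPy, not from the book; the random orthonormal basis comes from a QR factorization, which is just my choice): build an orthonormal basis, pick coefficients $c_i$, form $\textbf{v} = \sum_i c_i \textbf{u}_i$, and confirm that $\langle \textbf{v}, \textbf{u}_i \rangle$ recovers each $c_i$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random orthonormal basis for R^n: the columns of Q from a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
U = [Q[:, i] for i in range(n)]           # u_1, ..., u_n

# Pick arbitrary coefficients and form v = sum_i c_i * u_i.
c = rng.standard_normal(n)
v = sum(c_i * u_i for c_i, u_i in zip(c, U))

# Theorem 5.5.2 (with the standard inner product): <v, u_i> should recover c_i.
recovered = np.array([v @ u_i for u_i in U])
print(np.allclose(recovered, c))          # True
```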
Now here's the example given in the book. The author seems to be using the theorem's logic backwards, and I don't get it.
Example: The vectors $ \textbf{u}_1 = \left(\dfrac{1}{\sqrt{2}},\dfrac{1}{\sqrt{2}}\right)^T \text{ and } \textbf{u}_2 = \left(\dfrac{1}{\sqrt{2}},-\dfrac{1}{\sqrt{2}}\right)^T$ form an orthonormal basis for $\mathbb{R}^2$. If $\textbf{x} \in \mathbb{R}^2$, then $\textbf{x}^T \textbf{u}_1 = \dfrac{x_1 + x_2}{\sqrt{2}} \text{ and } \textbf{x}^T \textbf{u}_2 = \dfrac{x_1 - x_2}{\sqrt{2}}$. It follows from Theorem 5.5.2 that $\textbf{x} = \dfrac{x_1 + x_2}{\sqrt{2}}\, \textbf{u}_1 + \dfrac{x_1 - x_2}{\sqrt{2}}\, \textbf{u}_2$.
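The example does check out numerically, for what it's worth (again a minimal sketch of mine; the test vector $\textbf{x} = (3, 1)^T$ is an arbitrary choice, not from the book):

```python
import numpy as np

u1 = np.array([1, 1]) / np.sqrt(2)
u2 = np.array([1, -1]) / np.sqrt(2)

x = np.array([3.0, 1.0])                  # any x in R^2

c1 = x @ u1                               # (x1 + x2) / sqrt(2)
c2 = x @ u2                               # (x1 - x2) / sqrt(2)

print(np.allclose(c1 * u1 + c2 * u2, x))  # True: x = c1*u1 + c2*u2
```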
Isn't "$\textbf{x} = \dfrac{x_1 + x_2}{\sqrt{2}} \textbf{u}_1 + \dfrac{x_1 - x_2}{\sqrt{2}} \textbf{u}_2$" referring to "$\textbf{v} = \sum_{i=1}^{n} c_i \textbf{u}_i$", and "$\textbf{x}^T \textbf{u}_1 = \dfrac{x_1 + x_2}{\sqrt{2}} \text{ and } \textbf{x}^T \textbf{u}_2 = \dfrac{x_1 - x_2}{\sqrt{2}}$" referring to "$c_i = \langle \textbf{v}, \textbf{u}_i \rangle$"? Shouldn't the latter follow from the former? Or should the theorem state if and only if? Or am I just confused?