Suppose $\mathbf{V}$ is an inner product space and $\mathbf{W}$ is a subspace. If $\beta=\{\mathbf{w}_1,\ldots,\mathbf{w}_k\}$ is an orthonormal basis for $\mathbf{W}$, then the orthogonal projection onto $\mathbf{W}$ can be computed using $\beta$: given a vector $\mathbf{v}$, the orthogonal projection onto $\mathbf{W}$ is $\pi_{\mathbf{W}}(\mathbf{v}) = \langle \mathbf{v},\mathbf{w}_1\rangle \mathbf{w}_1+\cdots + \langle \mathbf{v},\mathbf{w}_k\rangle \mathbf{w}_k.$
If you only have an orthogonal basis, then you need to divide each coefficient by the squared norm of the corresponding basis vector. That is, if you have an orthogonal basis $\gamma = \{\mathbf{z}_1,\ldots,\mathbf{z}_k\}$, then the projection is given by: $\pi_{\mathbf{W}}(\mathbf{v}) = \frac{\langle\mathbf{v},\mathbf{z}_1\rangle}{\langle \mathbf{z}_1,\mathbf{z}_1\rangle}\mathbf{z}_1 + \cdots + \frac{\langle\mathbf{v},\mathbf{z}_k\rangle}{\langle\mathbf{z}_k,\mathbf{z}_k\rangle}\mathbf{z}_k.$
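To see the orthogonal-basis formula in action, here is a minimal sketch in $\mathbb{R}^3$ with the standard dot product; the vectors are made up for illustration, not taken from your problem:

```python
# Orthogonal projection onto W = span{z1, z2} using the formula
#   proj(v) = sum over z of (<v,z>/<z,z>) z,
# which requires the basis {z1, z2} to be orthogonal (not necessarily unit).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(v, basis):
    """Projection of v onto span(basis); basis vectors must be mutually orthogonal."""
    out = [0.0] * len(v)
    for z in basis:
        c = dot(v, z) / dot(z, z)          # coefficient <v,z>/<z,z>
        out = [o + c * zi for o, zi in zip(out, z)]
    return out

z1 = [1.0, 1.0, 0.0]
z2 = [1.0, -1.0, 0.0]    # orthogonal to z1, and together they span the xy-plane
v  = [3.0, 4.0, 5.0]
print(proj(v, [z1, z2]))  # → [3.0, 4.0, 0.0], i.e. v with its z-component removed
```

If the basis were orthonormal, each denominator $\langle \mathbf{z},\mathbf{z}\rangle$ would equal $1$ and the code would reduce to the first formula.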
Here, you already have an orthogonal basis for your subspace, and your vector is $\mathbf{v} = x$. So all you have to do is apply the formula above with these vectors and your inner product, $\langle f,g\rangle = f(0)g(0) + f(1)g(1) + f(2)g(2)$. For example, with $\mathbf{v}=x$ and $\mathbf{z}_1 = -x + 1$, we have: $\langle x,-x+1\rangle = (0)(-0+1) + (1)(-1+1) + (2)(-2+1) = 0+0-2 = -2.$
Compute $\langle \mathbf{z}_1,\mathbf{z}_1\rangle$ and the analogous inner products for the remaining basis vectors in the same way, then plug everything into the formula to assemble $\pi_{\mathbf{W}}(\mathbf{v})$.
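The same computation can be sketched in code, treating each polynomial as a function and assuming (as the arithmetic in the worked step suggests) that the inner product is evaluation at $t = 0, 1, 2$:

```python
# Assumed inner product on polynomials: <f,g> = f(0)g(0) + f(1)g(1) + f(2)g(2).

def ip(f, g, points=(0, 1, 2)):
    return sum(f(t) * g(t) for t in points)

v  = lambda t: t          # v = x
z1 = lambda t: -t + 1     # z1 = -x + 1

print(ip(v, z1))          # → -2, matching the hand computation above
print(ip(z1, z1))         # → 2, the squared norm <z1, z1>

# Coefficient of z1 in the projection: <v,z1>/<z1,z1> = -2/2 = -1.
coeff = ip(v, z1) / ip(z1, z1)
```

Repeating this for each remaining basis vector gives all the coefficients in the projection formula.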