There is no such thing as the eigenvector of a matrix, or the orthonormal basis of eigenvectors. There are usually many choices.
Remember that an eigenvector $\mathbf{v}$ of eigenvalue $\lambda$ is a nonzero vector such that $T\mathbf{v}=\lambda\mathbf{v}$. That means that if you take any nonzero multiple of $\mathbf{v}$, say $\alpha\mathbf{v}$, then $T(\alpha\mathbf{v}) = \alpha T\mathbf{v} = \alpha(\lambda\mathbf{v}) = \alpha\lambda\mathbf{v}=\lambda(\alpha\mathbf{v}),$ so $\alpha\mathbf{v}$ is also an eigenvector corresponding to $\lambda$. More generally, if $\mathbf{v}_1,\ldots,\mathbf{v}_k$ are all eigenvectors corresponding to $\lambda$, then any linear combination $\alpha_1\mathbf{v}_1+\cdots+\alpha_k\mathbf{v}_k\neq \mathbf{0}$ is also an eigenvector corresponding to $\lambda$.
So, of course, since $\left(\begin{array}{r}-1\\1\end{array}\right)$ is an eigenvector (corresponding to $\lambda=0$), so is $\alpha\left(\begin{array}{r}-1\\1\end{array}\right)$ for any $\alpha\neq 0$; in particular, for $\alpha=-1$, as you took.
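If you want to see this numerically, here is a quick NumPy sanity check. The matrix $T$ below is chosen only for illustration (it is not necessarily the one from your problem); it does have $(-1,1)$ as an eigenvector for eigenvalue $0$. The $3\times 3$ matrix $A$ is likewise a made-up example with a repeated eigenvalue, to check the linear-combination claim.

```python
import numpy as np

# Symmetric matrix chosen for illustration: (-1, 1) is an eigenvector
# for eigenvalue 0, just like the vector discussed above.
T = np.array([[1.0, 1.0],
              [1.0, 1.0]])
v = np.array([-1.0, 1.0])
lam = 0.0
alpha = -1.0                       # any nonzero scalar works

# alpha*v is still an eigenvector for the same eigenvalue:
assert np.allclose(T @ (alpha * v), lam * (alpha * v))

# A 3x3 example with a repeated eigenvalue: e1 and e2 are both
# eigenvectors of A for eigenvalue 2, so any nonzero combination is too.
A = np.diag([2.0, 2.0, 5.0])
w = 3.0 * np.array([1.0, 0.0, 0.0]) - 4.0 * np.array([0.0, 1.0, 0.0])
assert np.allclose(A @ w, 2.0 * w)
```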
Now, a set of vectors $\mathbf{w}_1,\ldots,\mathbf{w}_k$ is orthogonal if and only if $\langle \mathbf{w}_i,\mathbf{w}_j\rangle = 0$ whenever $i\neq j$. If you have an orthogonal set and you replace, say, $\mathbf{w}_i$ with $\alpha\mathbf{w}_i$ for any scalar $\alpha$, then the result is still an orthogonal set: $\langle\mathbf{w}_\ell,\mathbf{w}_j\rangle=0$ still holds when $\ell\neq j$ and neither equals $i$, and for $j\neq i$ we have $\langle \alpha\mathbf{w}_i,\mathbf{w}_j\rangle = \alpha\langle\mathbf{w}_i,\mathbf{w}_j\rangle = \alpha\cdot 0 = 0$ by the properties of the inner product. As a consequence, if you take an orthogonal set and any scalars $\alpha_1,\ldots,\alpha_k$, then $\alpha_1\mathbf{w}_1,\ldots,\alpha_k\mathbf{w}_k$ is also an orthogonal set.
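The same fact, checked numerically (the three vectors are made up for illustration; the standard dot product plays the role of $\langle\cdot,\cdot\rangle$):

```python
import numpy as np

# An orthogonal (not yet normalized) set in R^3, chosen for illustration.
w1 = np.array([1.0, 1.0, 0.0])
w2 = np.array([1.0, -1.0, 0.0])
w3 = np.array([0.0, 0.0, 2.0])

# Scale each vector by an arbitrary scalar; pairwise dot products stay 0.
scaled = [2.0 * w1, -5.0 * w2, 0.25 * w3]
for i in range(3):
    for j in range(i + 1, 3):
        assert np.isclose(scaled[i] @ scaled[j], 0.0)
```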
A vector $\mathbf{n}$ is normal if $||\mathbf{n}||=1$. If $\alpha$ is any scalar, then $||\alpha\mathbf{n}|| = |\alpha|\,||\mathbf{n}|| = |\alpha|$. So if you multiply a normal vector $\mathbf{n}$ by any scalar $\alpha$ of absolute value $1$ (modulus $1$, in the complex case), then $\alpha\mathbf{n}$ is also a normal vector.
A set of vectors is orthonormal if it is orthogonal and every vector in it is normal. By the above, if you take an orthonormal set and multiply each vector by a scalar of absolute value $1$, the resulting set is also orthonormal.
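Again as a numerical sketch (the orthonormal pair below, the columns of a rotation matrix, is made up for illustration; the scalars $+1$ and $-1$ both have absolute value $1$):

```python
import numpy as np

# An orthonormal set in R^2: the columns of a rotation matrix.
q1 = np.array([np.cos(0.3), np.sin(0.3)])
q2 = np.array([-np.sin(0.3), np.cos(0.3)])

# Multiply each by a scalar of absolute value 1 (here +1 and -1).
u1, u2 = 1.0 * q1, -1.0 * q2

assert np.isclose(np.linalg.norm(u1), 1.0)   # still unit length
assert np.isclose(np.linalg.norm(u2), 1.0)
assert np.isclose(u1 @ u2, 0.0)              # still orthogonal
```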
In summary: you have an orthonormal set of two eigenvectors, and you multiply one of them by $-1$. This does not affect the fact that both are eigenvectors; the set was orthogonal, so multiplying one vector by a scalar does not affect the fact that it is orthogonal; and the vectors were normal, and you multiplied one by a scalar of absolute value $1$, so they are still normal. So you still have an orthonormal set of two eigenvectors. I leave it to you to verify that if you have a linearly independent set and multiply each vector by a nonzero scalar, the result is still linearly independent.
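For that last exercise, here is a numerical check you can compare your proof against (the vectors and scalars are made up for illustration; linear independence is tested via the rank of the matrix whose columns are the vectors):

```python
import numpy as np

# A linearly independent pair in R^2, chosen for illustration.
v1 = np.array([1.0, 2.0])
v2 = np.array([0.0, 1.0])
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

# Scale each by a nonzero scalar; the rank, hence independence, is unchanged.
M = np.column_stack([-3.0 * v1, 0.5 * v2])
assert np.linalg.matrix_rank(M) == 2
```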