Remember that if $\mathbf{V},\mathbf{W}$ are vector spaces, $\mathbf{v}_1,\ldots,\mathbf{v}_n$ is a basis of $\mathbf{V}$, and $f\colon\{\mathbf{v}_1,\ldots,\mathbf{v}_n\}\to \mathbf{W}$ is any function, then $f$ uniquely determines a linear transformation $T\colon\mathbf{V}\to\mathbf{W}$ such that $T(\mathbf{v}_i) = f(\mathbf{v}_i)$ for each $i$. That is: we can define a linear transformation by specifying what it does to a basis, and then "extending linearly". The definition of $T$ is that, given any $\mathbf{v}\in \mathbf{V}$, we express $\mathbf{v}$ in terms of the basis, $\mathbf{v} = \alpha_1\mathbf{v}_1+\cdots+\alpha_n\mathbf{v}_n$, and then let $T(\mathbf{v}) = \alpha_1f(\mathbf{v}_1)+\cdots+\alpha_nf(\mathbf{v}_n).$
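Here is a small numeric sketch of "extending linearly" (the basis of $\mathbb{R}^2$ and the prescribed values in $\mathbb{R}^3$ below are arbitrary choices for illustration, not anything from the question):

```python
import numpy as np

# A (non-standard) basis of R^2, chosen for illustration.
x1 = np.array([1.0, 1.0])
x2 = np.array([1.0, -1.0])
B = np.column_stack([x1, x2])  # columns are the basis vectors

# f is only prescribed on the basis; its values live in W = R^3.
f_vals = [np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, -1.0])]

def T(v):
    """The linear extension: T(v) = alpha_1 f(x1) + alpha_2 f(x2),
    where alpha solves v = alpha_1 x1 + alpha_2 x2."""
    alpha = np.linalg.solve(B, v)  # uniquely determined coordinates
    return sum(a * fv for a, fv in zip(alpha, f_vals))
```

By construction $T$ agrees with $f$ on the basis (solving for the coordinates of $x_1$ gives $\alpha = (1,0)$, etc.) and is linear, since the coordinate map $v \mapsto \alpha$ is linear.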
In the definition you give, we are specifying the function $x_i^*$ by saying what it does on the basis $\{x_1,\ldots,x_n\}$. The actual function will be given by linearity.
So given an element $\mathbf{v}\in\mathbf{V}$, the way to compute $x_i^*(\mathbf{v})$ is as follows:
First, express $\mathbf{v}$ in terms of the basis $x_1,\ldots,x_n$: $\mathbf{v}=\alpha_1x_1+\cdots+\alpha_nx_n$ for some (uniquely determined) scalars $\alpha_1,\ldots,\alpha_n$.
Then we compute $x_i^*$ on $\mathbf{v}$ by the rule: $x_i^*(\mathbf{v}) = \alpha_1x_i^*(x_1)+\alpha_2x_i^*(x_2)+\cdots + \alpha_nx_i^*(x_n).$
This gives: $x_i^*(\mathbf{v}) = \alpha_1\delta_{i1} + \alpha_2\delta_{i2}+\cdots+\alpha_n\delta_{in} = \alpha_i.$
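The two-step recipe above can be checked numerically. In this sketch the basis of $\mathbb{R}^2$ is an arbitrary (non-standard) choice for illustration; the point is that $x_i^*$ is computed by first solving for the coordinates, not by comparing $\mathbf{v}$ with $x_i$:

```python
import numpy as np

# An arbitrary basis of R^2 (illustrative assumption; any basis works alike).
x1 = np.array([1.0, 1.0])
x2 = np.array([1.0, -1.0])
B = np.column_stack([x1, x2])  # columns are the basis vectors

def dual(i, v):
    """x_i^*(v): express v in the basis (solve B @ alpha = v), return alpha_i."""
    alpha = np.linalg.solve(B, v)
    return alpha[i]

# On the basis itself, x_i^*(x_j) = delta_ij:
print(dual(0, x1), dual(0, x2))  # 1.0 0.0
# On a general vector, x_i^* picks off the i-th coordinate:
v = 2 * x1 + 5 * x2
print(dual(1, v))  # 5.0
```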
(Instead, you seem to think that the way to compute it is by taking $x_i^*(\mathbf{v})$, and the value will be $1$ if $\mathbf{v}=x_i$ and $0$ if $\mathbf{v}\neq x_i$. You are absolutely correct that such a function is not linear, let alone a linear functional, but luckily that is not the definition of $x_i^*$...)
In the case of $V=\mathbb{R}$ and $x_1 = 1$, the functional $x_1^*$ is completely determined by its value at $1$, which is $1$. However, the actual definition of the function as a linear transformation is:
Given $\alpha\in\mathbb{R}$, write $\alpha$ as a linear combination of $x_1$, $\alpha = \alpha x_1$. Then $x_1^*(\alpha) = \alpha x_1^*(1) = \alpha$.
So your computation is incorrect: $x_1^*(1+(-1)) = x_1^*(0) = x_1^*(0\cdot x_1) = 0x_1^*(x_1) = 0(1) = 0,$ as you found; but to compute $x_1^*(1)$ and $x_1^*(-1)$, you need to first express these vectors in terms of the basis $\{x_1\}$ (which you did not do). You then get $x_1^*(1) + x_1^*(-1) = x_1^*(1x_1) + x_1^*(-1x_1) = 1x_1^*(x_1)+(-1)x_1^*(x_1) = 1\delta_{11}+(-1)\delta_{11} = 0$ as well, so there is no conflict with linearity.
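The one-dimensional case can be checked directly. As the computation shows, $x_1^*$ is just the identity on $\mathbb{R}$ when $x_1 = 1$ (a minimal sketch of the arithmetic above):

```python
# V = R with basis {x1}, where x1 = 1.  Writing alpha = alpha * x1, linearity
# gives x_1^*(alpha) = alpha * x_1^*(x1) = alpha * delta_11 = alpha.
def x1_star(alpha):
    return alpha * 1  # alpha * x_1^*(x1), with x_1^*(x1) = 1

print(x1_star(1) + x1_star(-1))  # 0
print(x1_star(0))                # 0
```

Both routes give $0$, so additivity holds, exactly as in the hand computation.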