Let me first illustrate an alternate approach. You're looking at $\left[\begin{array}{ccc} \alpha & 1 & \alpha^2\\ 1 & \alpha & 1\\ 1 & \alpha^2 & 2\alpha\\ 1 & 1 & \alpha^2 \end{array}\right]\left[\begin{array}{c} x_1\\ x_2\\ x_3\end{array}\right]=\left[\begin{array}{c} -\alpha\\ 1\\ 2\alpha\\ -\alpha\end{array}\right].$ We can use row reduction on the augmented matrix $\left[\begin{array}{ccc|c} \alpha & 1 & \alpha^2 & -\alpha\\ 1 & \alpha & 1 & 1\\ 1 & \alpha^2 & 2\alpha & 2\alpha\\ 1 & 1 & \alpha^2 & -\alpha \end{array}\right].$ In particular, for the system to be solvable, it is necessary and sufficient that no row of the reduced matrix has all $0$s in the coefficient columns but a nonzero entry in the last column. Subtract the bottom row from the other rows, yielding $\left[\begin{array}{ccc|c} \alpha-1 & 0 & 0 & 0\\ 0 & \alpha-1 & 1-\alpha^2 & 1+\alpha\\ 0 & \alpha^2-1 & 2\alpha-\alpha^2 & 3\alpha\\ 1 & 1 & \alpha^2 & -\alpha \end{array}\right].$
It's clear then that if $\alpha=1$, the second row has all $0$s except in the last column, so $\alpha=1$ doesn't give us a solvable system. Suppose that $\alpha\neq 1$, multiply the top row by $\frac1{\alpha-1}$, and subtract the new top row from the bottom row, giving us $\left[\begin{array}{ccc|c} 1 & 0 & 0 & 0\\ 0 & \alpha-1 & 1-\alpha^2 & 1+\alpha\\ 0 & \alpha^2-1 & 2\alpha-\alpha^2 & 3\alpha\\ 0 & 1 & \alpha^2 & -\alpha \end{array}\right].$
Swap the second and fourth rows and add the new second row to the last two rows, giving us $\left[\begin{array}{ccc|c} 1 & 0 & 0 & 0\\ 0 & 1 & \alpha^2 & -\alpha\\ 0 & \alpha^2 & 2\alpha & 2\alpha\\ 0 & \alpha & 1 & 1 \end{array}\right],$ whence subtracting $\alpha$ times the fourth row from the third row gives us $\left[\begin{array}{ccc|c} 1 & 0 & 0 & 0\\ 0 & 1 & \alpha^2 & -\alpha\\ 0 & 0 & \alpha & \alpha\\ 0 & \alpha & 1 & 1 \end{array}\right].$
Note that $\alpha=0$ readily gives us the solution $x_1=x_2=0$, $x_3=1$. Assume that $\alpha\neq 0,$ multiply the third row by $\frac1\alpha$, subtract the new third row from the fourth row, and subtract $\alpha^2$ times the new third row from the second row, yielding $\left[\begin{array}{ccc|c} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & -\alpha^2-\alpha\\ 0 & 0 & 1 & 1\\ 0 & \alpha & 0 & 0 \end{array}\right],$ whence subtracting $\alpha$ times the second row from the fourth row yields $\left[\begin{array}{ccc|c} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & -\alpha^2-\alpha\\ 0 & 0 & 1 & 1\\ 0 & 0 & 0 & \alpha^3+\alpha^2 \end{array}\right].$ The bottom right entry has to be $0$, so since $\alpha\neq 0$ by assumption, we need $\alpha=-1$, giving us $\left[\begin{array}{ccc|c} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 1\\ 0 & 0 & 0 & 0 \end{array}\right].$
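If you'd like to double-check the row reduction mechanically, the same sequence of elementary row operations can be replayed symbolically, e.g. with SymPy (just a verification sketch, assuming SymPy is available; it performs the divisions by $\alpha-1$ and $\alpha$ formally, so it implicitly assumes $\alpha\neq 0,1$):

```python
import sympy as sp

a = sp.symbols('alpha')

# Augmented matrix [A | b] of the system
M = sp.Matrix([
    [a, 1,    a**2, -a  ],
    [1, a,    1,     1  ],
    [1, a**2, 2*a,   2*a],
    [1, 1,    a**2, -a  ],
])

# Subtract the bottom row from each of the other rows
for i in range(3):
    M[i, :] -= M[3, :]

# Assuming alpha != 1: scale the top row, then clear the bottom row's leading entry
M[0, :] /= a - 1
M[3, :] -= M[0, :]

# Swap the second and fourth rows, then add the new second row to the last two rows
M = M.elementary_row_op('n<->m', row1=1, row2=3)
M[2, :] += M[1, :]
M[3, :] += M[1, :]

# Subtract alpha times the fourth row from the third
M[2, :] -= a * M[3, :]

# Assuming alpha != 0: scale the third row, then eliminate above and below it
M[2, :] /= a
M[3, :] -= M[2, :]
M[1, :] -= a**2 * M[2, :]

# Finally, subtract alpha times the second row from the fourth
M[3, :] -= a * M[1, :]

print(sp.simplify(M))
```

The final matrix printed should match the last augmented matrix above, with fourth row $(0,0,0,\alpha^3+\alpha^2)$.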
Hence, the two values of $\alpha$ that give the system a solution are $\alpha=0$ and $\alpha=-1$, and in both cases, the system has solution $x_1=x_2=0$, $x_3=1$. (I think all my calculations are correct, but I'd recommend double-checking them.)
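One convenient way to double-check the conclusion is to solve the system directly for each candidate value of $\alpha$ (a SymPy sketch; the symbols `x1`, `x2`, `x3` are just placeholders for the unknowns):

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')

def solution_set(alpha):
    """Solution set of the system for a concrete value of alpha."""
    a = sp.Integer(alpha)
    A = sp.Matrix([
        [a, 1,    a**2],
        [1, a,    1   ],
        [1, a**2, 2*a ],
        [1, 1,    a**2],
    ])
    b = sp.Matrix([-a, 1, 2*a, -a])
    # linsolve returns the empty set when the system is inconsistent
    return sp.linsolve((A, b), x1, x2, x3)

for alpha in (0, -1, 1):
    print(alpha, solution_set(alpha))
```

This should report the single solution $(0,0,1)$ for $\alpha=0$ and $\alpha=-1$, and the empty set for $\alpha=1$.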
The major upside of the determinant approach is that it saves you time and effort, since you've already calculated the determinant. If $\alpha$ is a value for which the system has a solution, then since we're dealing with $4$ equations in only $3$ variables, at least one row of the reduced echelon form of the augmented matrix must be all $0$s; we simply don't have enough degrees of freedom otherwise. The determinant of that reduced $4\times 4$ matrix is then $0$, and since we obtain it from the original augmented matrix by invertible row operations, the determinant of the original augmented matrix must also be $0$.
By your previous work, then, $-\alpha^3(\alpha-1)(1+\alpha)=0$, so the only possible values of $\alpha$ that can give us a solvable system are $\alpha=0$, $\alpha=-1$, and $\alpha=1$. We simply check the system in each case to see if it actually is solvable. If $\alpha=0$, we readily get $x_1=x_2=0$, $x_3=1$ as the unique solution; similarly for $\alpha=-1$. However, if we put $\alpha=1$, then the second equation becomes $x_1+x_2+x_3=1,$ but the fourth equation becomes $x_1+x_2+x_3=-1,$ so $\alpha=1$ does not give us a solvable system.
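If you also want to double-check the determinant itself, a quick symbolic expansion (again a SymPy sketch) confirms the factorization:

```python
import sympy as sp

a = sp.symbols('alpha')

# Determinant of the 4x4 augmented matrix [A | b]
M = sp.Matrix([
    [a, 1,    a**2, -a  ],
    [1, a,    1,     1  ],
    [1, a**2, 2*a,   2*a],
    [1, 1,    a**2, -a  ],
])
print(sp.factor(M.det()))  # should factor as -alpha**3*(alpha - 1)*(alpha + 1)
```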