I recommend keeping the matrix-vector product style for these problems. Here is the problem again:
$ \pmatrix{1&1&2\\2&3&-1\\3&4&1}\pmatrix{x_1\\x_2\\x_3} = \pmatrix{2\\5\\a} $ This is the standard $Ax=b$ form of a set of linear equations. If the matrix were invertible, we could multiply from the left by $A^{-1}$ and obtain the solution $x=A^{-1}b$. But we cannot do that here, because this matrix is not invertible. That tells us that some of the rows/columns are linearly dependent on the others.
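As a quick sanity check (a minimal sketch in plain Python; the helper `det3` is just an illustration, not part of the problem), the determinant of the coefficient matrix comes out to zero, confirming it is singular:

```python
# Verify that the coefficient matrix is singular by computing its
# 3x3 determinant via cofactor expansion along the first row.
A = [[1, 1, 2],
     [2, 3, -1],
     [3, 4, 1]]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

print(det3(A))  # 0, so A is not invertible
```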
As AMPerrine pointed out, we are lucky to see that the sum of the first two rows equals the third. This means there are in fact 3 unknowns but only 2 independent equations. Let us encode this observation by multiplying from the left by a suitable matrix: $ \pmatrix{1&0&0\\0&1&0\\-1&-1&1}\pmatrix{1&1&2\\2&3&-1\\3&4&1}\pmatrix{x_1\\x_2\\x_3} = \pmatrix{1&0&0\\0&1&0\\-1&-1&1}\pmatrix{2\\5\\a} $ This leads to $ \pmatrix{1&1&2\\2&3&-1\\0&0&0}\pmatrix{x_1\\x_2\\x_3} = \pmatrix{2\\5\\a-7} $ Now the last equation reads $0=a-7$, and this is the requirement for consistency: $a=7$.
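The left-multiplication above can be checked numerically. This sketch (the helper `matmul` is mine, not from the original) confirms that the third row of the product vanishes, so the third equation reduces to $0 = a - 7$:

```python
# Apply the left-multiplication E @ A and confirm the third row vanishes.
A = [[1, 1, 2], [2, 3, -1], [3, 4, 1]]
E = [[1, 0, 0], [0, 1, 0], [-1, -1, 1]]  # subtracts rows 1 and 2 from row 3

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

print(matmul(E, A)[2])  # [0, 0, 0]: the third equation has no left-hand side
# With b = (2, 5, a), the third entry of E @ b is -2 - 5 + a = a - 7,
# so consistency forces a = 7.
```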
After obtaining a consistent set of equations, we can discard the last equation and continue with the remaining ones, since, as we showed, it carries no extra information: $ \pmatrix{1&1&2\\2&3&-1}\pmatrix{x_1\\x_2\\x_3} = \pmatrix{2\\5} $ This leads to the famous Gaussian elimination method, where we simplify things as much as possible by producing zero entries in the matrix: $ \pmatrix{1&0\\-2&1}\pmatrix{1&1&2\\2&3&-1}\pmatrix{x_1\\x_2\\x_3} = \pmatrix{1&0\\-2&1}\pmatrix{2\\5} $ This gives $ \pmatrix{1&1&2\\0&1&-5}\pmatrix{x_1\\x_2\\x_3} = \pmatrix{2\\1} $ This makes it possible to obtain the solution family: the second equation gives $x_2 = 1+5x_3$, and plugging this into the first gives $x_1+7x_3=1$, so $x_1 = 1-7x_3$ with $x_3$ free.
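Finally, the one-parameter family of solutions can be verified against the full original system. This sketch (the helper `solves` and the tested range of the free parameter $t = x_3$ are my choices for illustration) substitutes $x = (1-7t,\ 1+5t,\ t)$ with $a=7$:

```python
# Check that x = (1 - 7*t, 1 + 5*t, t) solves the full system when a = 7.
A = [[1, 1, 2], [2, 3, -1], [3, 4, 1]]
b = [2, 5, 7]  # a = 7 for consistency

def solves(t):
    x = [1 - 7 * t, 1 + 5 * t, t]  # x1 = 1 - 7*x3, x2 = 1 + 5*x3
    return all(sum(A[i][j] * x[j] for j in range(3)) == b[i] for i in range(3))

print(all(solves(t) for t in range(-3, 4)))  # True for every tested t
```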