Let's consider the system $ \begin{pmatrix} 1 & 1 \\ 2 & 3\end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 5 \\ 8\end{pmatrix}.$
The shorthand step in Gaussian elimination tells you to subtract 2 times the first row from the second, yielding $ \begin{pmatrix} 1 & 1 \\ 0 & 1\end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 5 \\ -2\end{pmatrix}.$
It's not so hard to see that this system is equivalent, but what are we actually doing behind the handwaving?
What's really happening is that we're left multiplying both sides of the equation by a new matrix:
$\begin{pmatrix} 1 & 0 \\ -2 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 2 & 3\end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ -2 & 1 \end{pmatrix} \begin{pmatrix} 5 \\ 8\end{pmatrix}.$
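To see this concretely, here is a minimal numpy sketch (the names `A`, `b`, and `E` are labels of my own choosing) that carries out the left multiplication explicitly:

```python
import numpy as np

# The original system Ax = b, and the elementary matrix E that
# subtracts 2 times row 1 from row 2.
A = np.array([[1.0, 1.0],
              [2.0, 3.0]])
b = np.array([5.0, 8.0])
E = np.array([[ 1.0, 0.0],
              [-2.0, 1.0]])

print(E @ A)  # [[1. 1.], [0. 1.]] -- upper triangular, as in the eliminated system
print(E @ b)  # [ 5. -2.]          -- the new right-hand side
```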
You might notice that in this case, the new matrix is lower triangular, and when we carry out the Gaussian elimination step, the resulting matrix is upper triangular.
A $2 \times 2$ matrix has a single sub-diagonal element, and you may notice that the Gaussian elimination above required only a single "step". If we extend this to $3 \times 3$ matrices, we need three elimination steps -- and there happen to be three sub-diagonal elements in any $3 \times 3$ matrix. This is not a coincidence: Gaussian elimination is innately linked to LU decomposition -- the factorization of a square matrix (full rank, and with suitable row ordering so that every pivot is nonzero) into the product of a lower triangular matrix and an upper triangular matrix. This is very useful, because triangular systems are spectacularly easy to solve by forward and back substitution.
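A short sketch can make the link explicit: perform Gaussian elimination while recording each multiplier, and the multipliers assemble into the lower triangular factor. This assumes no row exchanges are needed (every pivot is nonzero); `lu_no_pivot` is a hypothetical helper written for illustration, not a library routine (production code would use something like `scipy.linalg.lu`, which pivots).

```python
import numpy as np

def lu_no_pivot(A):
    """LU decomposition by recording Gaussian elimination steps.

    A sketch assuming every pivot is nonzero, so no row exchanges
    are needed.
    """
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for j in range(n - 1):          # for each pivot column
        for i in range(j + 1, n):   # eliminate each sub-diagonal entry
            m = U[i, j] / U[j, j]   # the multiplier of the elimination step
            U[i, :] -= m * U[j, :]  # the Gaussian elimination row operation
            L[i, j] = m             # record it: L collects the multipliers
    return L, U

A = np.array([[1.0, 1.0], [2.0, 3.0]])
L, U = lu_no_pivot(A)
print(L)      # [[1. 0.], [2. 1.]]
print(U)      # [[1. 1.], [0. 1.]]
print(L @ U)  # reproduces A
```

For a $3 \times 3$ input the inner elimination runs exactly three times, once per sub-diagonal element, matching the count above. Note also that $L$ records the multiplier $+2$, whereas the elimination matrix we applied earlier contains $-2$: the two matrices are inverses of each other, which is exactly the relationship used below.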
Write the decomposition as $A = LU$. The elimination matrix we applied above is then precisely $L^{-1}$, so the shorthand method for Gaussian elimination gives you the system $L^{-1}Ax = L^{-1}b$, i.e. $Ux = L^{-1}b$.
But we could just as easily start from the factorization itself: $Ax=b \Longrightarrow LUx = b \Longrightarrow Ux = L^{-1}b.$
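In practice you never form $L^{-1}$ explicitly: a forward substitution solves $Ly = b$ (giving $y = L^{-1}b$), and a back substitution then solves $Ux = y$. Here is a sketch using the factors from the worked example and scipy's triangular solver:

```python
import numpy as np
from scipy.linalg import solve_triangular

b = np.array([5.0, 8.0])
L = np.array([[1.0, 0.0], [2.0, 1.0]])
U = np.array([[1.0, 1.0], [0.0, 1.0]])

# Forward substitution: y = L^{-1} b, without ever forming L^{-1}.
y = solve_triangular(L, b, lower=True)
# Back substitution: solve Ux = y.
x = solve_triangular(U, y, lower=False)

print(y)  # [ 5. -2.] -- the same right-hand side the elimination step produced
print(x)  # [ 7. -2.] -- so x1 = 7, x2 = -2
```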
Which route you choose depends on whether you're trying to invert $A$ or just solve $Ax=b$. For solving $Ax=b$, the shorthand Gaussian elimination turns out to be easiest.