$\ker A=0$ implies $A$ has full column rank, i.e. $\mathrm{rank}(A)=n$, which implies $m\geq n$; that is, $A$ is a tall matrix. The reduced (or thin) QR decomposition of $A$ is thus $A=QR$, where $Q$ is a (semi-)orthogonal $m\times n$ matrix satisfying $Q^{T}Q=I$ and $R$ is an $n\times n$ upper triangular matrix. So $Ax=b$ becomes $QRx=b$; multiplying both sides by $Q^{T}$ gives $Rx=Q^{T}b$. Let $x=[x_1,\dots,x_n]^{T}$ and look at the last row of $Rx=Q^{T}b$: since $R$ is upper triangular, it reads $R_{n,n}x_n=[Q^{T}b]_n$. The second-to-last row reads $R_{n-1,n-1}x_{n-1}+R_{n-1,n}x_{n}=[Q^{T}b]_{n-1}$. Clearly you can solve for $x_n$ from the former and "back substitute" into the latter to find $x_{n-1}$, and so on: moving upward, back substituting and solving for one variable at each step.
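
The procedure above can be sketched in a few lines of NumPy. This is a minimal illustration, not production code: the function name `solve_via_qr` and the example matrix are my own, and it assumes $A$ has full column rank and $b$ lies in the range of $A$ (so $Rx=Q^{T}b$ solves the system exactly).

```python
import numpy as np

def solve_via_qr(A, b):
    # Thin QR: Q is m x n with Q^T Q = I, R is n x n upper triangular.
    Q, R = np.linalg.qr(A)
    y = Q.T @ b                      # right-hand side of R x = Q^T b
    n = R.shape[0]
    x = np.zeros(n)
    # Back substitution: start from the last row R[n-1,n-1] x[n-1] = y[n-1]
    # and move upward, substituting the already-solved entries.
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

# Example: a tall 3 x 2 system with a known solution.
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
x_true = np.array([1.0, -1.0])
b = A @ x_true
print(np.allclose(solve_via_qr(A, b), x_true))  # True
```

In practice one would call a library triangular solver (e.g. `scipy.linalg.solve_triangular`) instead of the explicit loop, but the loop makes the row-by-row back substitution visible.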