Example $\mathbf{Q} \mathbf{R}$
$$
\begin{align}
\mathbf{A} &= \mathbf{Q} \, \mathbf{R} \\
% A
\left[
\begin{array}{ccc}
a & 0 & 0 \\
0 & a & 0 \\
0 & 0 & a \\
b & 0 & 0 \\
0 & b & 0 \\
0 & 0 & b \\
\end{array}
\right]
%
&=
% Q
\left( a^{2}+b^{2} \right)^{-\frac{1}{2}}
\left[
\begin{array}{ccc}
a & 0 & 0 \\
0 & a & 0 \\
0 & 0 & a \\
b & 0 & 0 \\
0 & b & 0 \\
0 & 0 & b \\
\end{array}
\right]
% R
\left( a^{2}+b^{2} \right)^{\frac{1}{2}}
\left[
\begin{array}{ccc}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1 \\
\end{array}
\right]
%
\end{align}
$$
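As a quick numerical sanity check of this factorization, here is a minimal NumPy sketch; the sample values $a = 2$, $b = 3$ are arbitrary placeholders, not taken from the text.
```python
import numpy as np

a, b = 2.0, 3.0  # arbitrary sample values, not from the text

# Target matrix A (6 x 3): a*I stacked on b*I
A = np.vstack([a * np.eye(3), b * np.eye(3)])

# Closed-form factors from the example above
Q = (a**2 + b**2) ** -0.5 * A
R = (a**2 + b**2) ** 0.5 * np.eye(3)

print(np.allclose(Q @ R, A))            # A = Q R
print(np.allclose(Q.T @ Q, np.eye(3)))  # Q has orthonormal columns
```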
Example SVD
The singular value decomposition uses both the $\color{blue}{range}$ and $\color{red}{null}$ spaces.
$$
\begin{align}
\mathbf{A} &=
\left[ \begin{array}{c|c}
\color{blue}{\mathbf{U}_{\mathcal{R}}} &
\color{red} {\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
\, \Sigma \, \color{blue}{\mathbf{V}^{*}} \\
%
&=
% U
\left( a^{2}+b^{2} \right)^{-\frac{1}{2}}
\left[
\begin{array}{ccc|ccc}
\color{blue}{0} & \color{blue}{0} & \color{blue}{a} & \color{red}{0} & \color{red}{0} & \color{red}{-b} \\
\color{blue}{0} & \color{blue}{a} & \color{blue}{0} & \color{red}{0} & \color{red}{-b} & \color{red}{0} \\
\color{blue}{a} & \color{blue}{0} & \color{blue}{0} & \color{red}{-b} & \color{red}{0} & \color{red}{0} \\
\color{blue}{0} & \color{blue}{0} & \color{blue}{b} & \color{red}{0} & \color{red}{0} & \color{red}{a} \\
\color{blue}{0} & \color{blue}{b} & \color{blue}{0} & \color{red}{0} & \color{red}{a} & \color{red}{0} \\
\color{blue}{b} & \color{blue}{0} & \color{blue}{0} & \color{red}{a} & \color{red}{0} & \color{red}{0} \\
\end{array}
\right]
% S
\left( a^{2}+b^{2} \right)^{\frac{1}{2}}
\left[
\begin{array}{ccc}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1 \\\hline
0 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 0 \\
\end{array}
\right]
% V
\left[
\begin{array}{ccc}
\color{blue}{0} & \color{blue}{0} & \color{blue}{1} \\
\color{blue}{0} & \color{blue}{1} & \color{blue}{0} \\
\color{blue}{1} & \color{blue}{0} & \color{blue}{0} \\
\end{array}
\right]
%
\end{align}
$$
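A direct check that these blocks do form an SVD; a sketch under the same assumed sample values $a = 2$, $b = 3$, where the reversal matrix $J$ is just a compact way to build the permuted columns shown above.
```python
import numpy as np

a, b = 2.0, 3.0                  # arbitrary sample values
s = (a**2 + b**2) ** 0.5
A = np.vstack([a * np.eye(3), b * np.eye(3)])

J = np.fliplr(np.eye(3))         # column-reversal permutation

# Range block U_R and null-space block U_N, as in the example above
U_R = np.vstack([a * J, b * J]) / s
U_N = np.vstack([-b * J, a * J]) / s
U = np.hstack([U_R, U_N])        # full 6 x 6 factor

Sigma = s * np.vstack([np.eye(3), np.zeros((3, 3))])  # 6 x 3
V = J                            # here V* = V (real and symmetric)

print(np.allclose(U.T @ U, np.eye(6)))   # U is unitary
print(np.allclose(U @ Sigma @ V.T, A))   # A = U Sigma V*
```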
The pseudoinverse matrix is
$$
\begin{align}
\mathbf{A}^{+} &=
\color{blue}{\mathbf{V}} \,
\Sigma^{+}
\left[ \begin{array}{c}
\color{blue}{\mathbf{U}_{\mathcal{R}}^{*}} \\
\color{red} {\mathbf{U}_{\mathcal{N}}^{*}}
\end{array} \right]
%
=
\left( a^{2}+b^{2} \right)^{-1}
\left[
\begin{array}{cccccc}
a & 0 & 0 & b & 0 & 0 \\
0 & a & 0 & 0 & b & 0 \\
0 & 0 & a & 0 & 0 & b \\
\end{array}
\right]
%
\end{align}
$$
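The closed form can be compared against a library pseudoinverse; a minimal sketch, again assuming the sample values $a = 2$, $b = 3$.
```python
import numpy as np

a, b = 2.0, 3.0  # arbitrary sample values
A = np.vstack([a * np.eye(3), b * np.eye(3)])

# Closed form from the text: A+ = (a^2 + b^2)^(-1) A*
A_pinv = A.T / (a**2 + b**2)

print(np.allclose(A_pinv, np.linalg.pinv(A)))  # agrees with numpy's pseudoinverse
print(np.allclose(A_pinv @ A, np.eye(3)))      # left inverse: A has full column rank
```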
Derivation $\mathbf{Q}\, \mathbf{R}$
The $\mathbf{Q}\, \mathbf{R}$ decomposition is computationally cheaper than the SVD and provides an orthonormal basis for the column space. The column vectors of the target matrix are already orthogonal; they only need normalization:
$$
\mathbf{Q} = \left( a^{2} + b^{2} \right)^{-\frac{1}{2}} \mathbf{A}.
$$
This lets us bypass the usual Gram-Schmidt process.
The $\mathbf{R}$ matrix is upper triangular. The supradiagonal terms describe the projections of the column vectors of $\mathbf{A}$ onto the column vectors of $\mathbf{Q}$. For example,
$$
\mathbf{R}_{1,2} = \mathbf{Q}^{*}_{1} \mathbf{A}_{2}
$$
Because the columns of $\mathbf{A}$ were already orthogonal, the supradiagonal terms are $0$. The diagonal terms hold the lengths of the column vectors of $\mathbf{A}$:
$$
\mathbf{R}_{k,k} = \lVert \mathbf{A}_{k} \rVert, \quad k = 1, n.
$$
The target matrix $\mathbf{A}$ has column vectors of uniform length $\sqrt{a^{2} + b^{2}}$, hence $\mathbf{R} = \left( a^{2} + b^{2} \right)^{\frac{1}{2}} \mathbf{I}_{3}$.
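The shortcut can be seen numerically: normalizing the columns of $\mathbf{A}$ gives $\mathbf{Q}$, and $\mathbf{Q}^{*}\mathbf{A}$ comes out diagonal with the column lengths on the diagonal. A sketch, assuming the same sample values as before.
```python
import numpy as np

a, b = 2.0, 3.0  # arbitrary sample values
A = np.vstack([a * np.eye(3), b * np.eye(3)])

# Q: the columns of A, normalized (they are already mutually orthogonal)
col_norms = np.linalg.norm(A, axis=0)   # each equals sqrt(a^2 + b^2)
Q = A / col_norms

# R = Q* A: supradiagonal terms vanish, diagonal holds ||A_k||
R = Q.T @ A
print(np.allclose(R, np.diag(col_norms)))
```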
Derivation SVD
The SVD starts by resolving the eigensystem of the product matrix:
$$
\mathbf{A}^{*} \mathbf{A} =
\left( a^{2} + b^{2} \right)
\left[
\begin{array}{ccc}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1 \\
\end{array}
\right]
$$
The eigenvalue spectrum is
$$
\lambda_{k} = a^{2} + b^{2}, \quad k = 1, n
$$
There are no zero eigenvalues and no need to order the spectrum, so we can harvest the singular values directly:
$$
\sigma_{k} = \sqrt{\lambda_{k}\left( \mathbf{A}^{*} \mathbf{A} \right)}
= \sqrt{ a^{2} + b^{2} }, \quad k=1,n
$$
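A numerical cross-check of the spectrum and the singular values; a sketch with the assumed sample values $a = 2$, $b = 3$.
```python
import numpy as np

a, b = 2.0, 3.0  # arbitrary sample values
A = np.vstack([a * np.eye(3), b * np.eye(3)])

lam = np.linalg.eigvalsh(A.T @ A)   # eigenvalues of A* A
sigma = np.sqrt(lam)                # singular values of A

print(np.allclose(lam, (a**2 + b**2) * np.ones(3)))   # lambda_k = a^2 + b^2
print(np.allclose(np.sort(sigma),
                  np.sort(np.linalg.svd(A, compute_uv=False))))  # matches SVD routine
```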
The algebraic multiplicity $(n)$ of the eigenvalue matches its geometric multiplicity. For the eigenvectors we choose the simplest set, the unit vectors of the identity matrix; any ordering of these columns works, and the example above uses the reverse ordering. Using the relationship
$$
\mathbf{A} \, \mathbf{V} = \mathbf{U} \, \Sigma
$$
we find that
$$
\color{blue}{\mathbf{U}_{\mathcal{R}}} = \left( a^{2} + b^{2} \right)^{-\frac{1}{2}} \mathbf{A}
$$
To summarize the SVD:
$$
\mathbf{V} = \mathbf{I}_{n}, \quad \Sigma = \left( a^{2} + b^{2} \right)^{\frac{1}{2}}
\left[ \begin{array}{c}
\mathbf{I}_{n} \\
\mathbf{0}
\end{array} \right], \quad
\color{blue}{\mathbf{U}_{\mathcal{R}}} = \left( a^{2} + b^{2} \right)^{-\frac{1}{2}} \mathbf{A}
$$
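These choices satisfy $\mathbf{A}\,\mathbf{V} = \mathbf{U}\,\Sigma$ directly; a short check, a sketch with the same assumed sample values.
```python
import numpy as np

a, b = 2.0, 3.0  # arbitrary sample values
s = (a**2 + b**2) ** 0.5
A = np.vstack([a * np.eye(3), b * np.eye(3)])

V = np.eye(3)              # V = I_n
U_R = A / s                # U_R = (a^2 + b^2)^(-1/2) A
Sigma_R = s * np.eye(3)    # top (nonzero) block of Sigma

print(np.allclose(A @ V, U_R @ Sigma_R))    # A V = U_R Sigma_R (zero block drops out)
print(np.allclose(U_R.T @ U_R, np.eye(3)))  # orthonormal basis for the range
```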
The pseudoinverse is
$$
\mathbf{A}^{+} = \mathbf{V} \, \Sigma^{+} \, \mathbf{U}^{*}
= \left( a^{2} + b^{2} \right)^{-\frac{1}{2}} \mathbf{V} \, \color{blue}{\mathbf{U}_{\mathcal{R}}^{*}}
= \left( a^{2} + b^{2} \right)^{-1} \mathbf{A}^{*}
$$
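The pseudoinverse assembled from these factors reproduces the closed form and satisfies the Moore-Penrose condition $\mathbf{A}\,\mathbf{A}^{+}\mathbf{A} = \mathbf{A}$; a sketch with the same assumed sample values.
```python
import numpy as np

a, b = 2.0, 3.0  # arbitrary sample values
s = (a**2 + b**2) ** 0.5
A = np.vstack([a * np.eye(3), b * np.eye(3)])

V = np.eye(3)
U_R = A / s

# A+ = V Sigma_R^{-1} U_R*  (only the range block survives Sigma+)
A_pinv = V @ (np.eye(3) / s) @ U_R.T

print(np.allclose(A_pinv, A.T / s**2))  # A+ = (a^2 + b^2)^(-1) A*
print(np.allclose(A @ A_pinv @ A, A))   # Moore-Penrose: A A+ A = A
```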