Let's say I have a point $\mathbf{x}$ in $n$-dimensional space. For any basis $(\mathbf{u}_1, ..., \mathbf{u}_n)$, $\mathbf{x}$ can be written as a linear combination of the basis vectors:
$\mathbf{x} = x_1 \mathbf{u}_1 + x_2 \mathbf{u}_2 + ... + x_n \mathbf{u}_n$, and when the basis is orthonormal, each coefficient $x_i = \mathbf{u}_i^T \mathbf{x}$ is the projection of $\mathbf{x}$ onto $\mathbf{u}_i$.
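To make this concrete, here is a small numerical sketch of what I mean (NumPy is just my choice for illustration): build an orthonormal basis, take the projection coefficients, and recover $\mathbf{x}$ from them.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random orthonormal basis of R^n via QR; columns of U are u_1, ..., u_n.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))

x = rng.standard_normal(n)

# Coefficients x_i = u_i^T x, i.e. the projections of x onto each u_i.
coeffs = U.T @ x

# Reconstruct x as the linear combination x_1 u_1 + ... + x_n u_n.
x_rec = U @ coeffs
print(np.allclose(x, x_rec))  # True
```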
Now I want to generalize this to a matrix $\mathbf{X} \in \mathbb{R}^{m\times n}$. The Singular Value Decomposition (SVD) guarantees that any matrix $\mathbf{X}$ of rank $r$ can be written as $\mathbf{X} = \sum_{i=1}^r \sigma_i \mathbf{u}_i \mathbf{v}^T_i$, where $\mathbf{u}_1, ..., \mathbf{u}_r\in\mathbb{R}^m$ are orthonormal (each has length 1 and every pair is orthogonal) and $\mathbf{v}_1, ..., \mathbf{v}_r\in\mathbb{R}^n$ are also orthonormal. Each pair $\mathbf{u}_i$ and $\mathbf{v}_i$ forms a pair of left and right singular vectors with singular value $\sigma_i$.
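Again just to fix ideas, here is a sketch that checks this rank-1 sum form numerically (using NumPy's `np.linalg.svd`, whose thin factors hold the $\mathbf{u}_i$, $\sigma_i$, and $\mathbf{v}_i^T$):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 5, 3

X = rng.standard_normal((m, n))  # generically rank r = min(m, n) = 3

# Thin SVD: columns of U are u_i, rows of Vt are v_i^T, s holds the sigma_i.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Rebuild X as the sum of rank-1 terms sigma_i * u_i * v_i^T.
X_rec = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
print(np.allclose(X, X_rec))  # True
```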
Note that $\mathbf{u}_1, ..., \mathbf{u}_r$ is not the only orthonormal set in $\mathbb{R}^m$. In fact, any $r$ vectors picked from an orthonormal basis $\mathbf{u}'_1, ..., \mathbf{u}'_m \in \mathbb{R}^m$ could be a candidate. The same holds for the right singular vectors $\mathbf{v}_i$.
The question, then, is: can I write an SVD-like decomposition using an arbitrary orthonormal set in place of the left / right singular vectors? (The SVD would then be a special case of this decomposition, namely the one that uses the left / right singular vectors.) That is, can I write $\mathbf{X} = \sum_{i=1}^r \alpha_i \mathbf{s}_i \mathbf{t}^T_i$ for arbitrary orthonormal vectors $\mathbf{s}_1, ..., \mathbf{s}_r\in\mathbb{R}^m$ and $\mathbf{t}_1, ..., \mathbf{t}_r\in\mathbb{R}^n$? If so, how can I compute the $\alpha_i$'s?
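In case it helps to make the question concrete, here is the numerical experiment I have in mind. The choice $\alpha_i = \mathbf{s}_i^T \mathbf{X} \mathbf{t}_i$ below is only my guess by analogy with the vector case; whether it (or any choice of $\alpha_i$) actually works is exactly what I am asking.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 5, 3, 3

X = rng.standard_normal((m, n))  # generically rank r = 3

# Arbitrary orthonormal vectors s_1..s_r and t_1..t_r, unrelated to the
# singular vectors of X (columns of S and T, from QR of random matrices).
S, _ = np.linalg.qr(rng.standard_normal((m, r)))
T, _ = np.linalg.qr(rng.standard_normal((n, r)))

# My guess by analogy with the vector case: alpha_i = s_i^T X t_i.
alpha = np.array([S[:, i] @ X @ T[:, i] for i in range(r)])

X_try = sum(alpha[i] * np.outer(S[:, i], T[:, i]) for i in range(r))
print(np.allclose(X, X_try))  # comes out False for such a random choice
```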