Let $X \in \mathbb{R}^{m\times n}$ be a matrix with rank $r$.
How can we find the optimal $\tilde{X} \in \mathbb{R}^{m\times n}$ of rank $k$, where $k\leq r$, that minimizes the reconstruction error in the maximum absolute column sum norm? That is,
$$e = \|X-\tilde{X}\|_1 = \max_{j} \sum_{i} |X_{ij}-\tilde{X}_{ij}|$$
is minimized.
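For concreteness, this is the induced (operator) 1-norm of the residual. A minimal NumPy sketch of the quantity I mean (the function name and test matrix are just my own placeholders):

```python
import numpy as np

def induced_1_norm(A):
    """Maximum absolute column sum: max_j sum_i |A_ij|."""
    return np.abs(A).sum(axis=0).max()

# Sanity check: NumPy's matrix norm with ord=1 is the max column sum.
A = np.random.randn(5, 3)
assert np.isclose(induced_1_norm(A), np.linalg.norm(A, 1))
```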
I know that the optimal solution is the rank-$k$ truncated SVD when the Frobenius norm or the spectral norm is minimized (the Eckart–Young theorem). But I'm not familiar with optimization involving the $\ell_1$-norm, and I want to use it to find sparse bases for an audio processing project.
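To be clear about the baseline I have in mind, here is a sketch of the truncated SVD I would like an $\ell_1$-norm analogue of (`k` and the test matrix are arbitrary; the induced 1-norm error printed at the end is not necessarily minimal for this $\tilde{X}$):

```python
import numpy as np

def truncated_svd(X, k):
    """Rank-k approximation via the SVD (optimal in Frobenius/spectral norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

X = np.random.randn(8, 6)
Xk = truncated_svd(X, k=2)
print(np.linalg.norm(X - Xk, 'fro'))  # Frobenius error (minimal by Eckart-Young)
print(np.linalg.norm(X - Xk, 1))      # induced 1-norm error (no such guarantee)
```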
Is there an analytic closed-form solution to this problem? Or can the rank-$k$ approximation be found with an iterative numerical method?