3

In the Tikhonov regularization problem, $\min_x \Vert Ax-b\Vert^{2}+\Vert\Gamma x\Vert^{2}$, with $\Gamma=\alpha I$, the solution in terms of the SVD is $x=VDU^\top b$, where $A=U\Sigma V^\top$ and $D_{ii}=\dfrac{\sigma_i}{\sigma_i^2+\alpha^2}$, with $\sigma_i$ the singular values from $\Sigma$.
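
For concreteness, here is a minimal NumPy sketch of that formula (the function name and the random test problem are just illustrative), checked against the normal equations $(A^\top A+\alpha^2 I)x=A^\top b$:

```python
import numpy as np

def tikhonov_svd(A, b, alpha):
    """Standard-form Tikhonov solution x = V D U^T b with
    D_ii = sigma_i / (sigma_i^2 + alpha^2)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    d = s / (s**2 + alpha**2)        # the D_ii filter factors
    return Vt.T @ (d * (U.T @ b))    # x = V D U^T b

# quick check against the normal equations (A^T A + alpha^2 I) x = A^T b
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
alpha = 0.1
x_svd = tikhonov_svd(A, b, alpha)
x_ne = np.linalg.solve(A.T @ A + alpha**2 * np.eye(5), A.T @ b)
print(np.allclose(x_svd, x_ne))      # True
```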

The question: is there a similar solution (in terms of the SVD) when $\Gamma$ is diagonal with entries $\alpha_{i}$, $i=1,\dots,n$, i.e. a different $\alpha_{i}$ for each row?

  • 0
    I don't understand the solution; can you explain it in more detail? (2011-09-20)

2 Answers

2

Tikhonov regularization in its most general form is the solution of the problem

$\min_{\mathbf x}\|\mathbf A\mathbf x-\mathbf b\|^2+\alpha^2\|\mathbf L\mathbf x\|^2$

or the problem

$\min_{\mathbf x}\left\|\begin{pmatrix}\mathbf A\\ \alpha \mathbf L\end{pmatrix}\mathbf x-\begin{pmatrix}\mathbf b\\ \mathbf 0\end{pmatrix}\right\|$

Lars Elden gave a method for converting this general Tikhonov problem into an equivalent "standard form" regularization problem. I assume in the sequel that $\mathbf L$ is invertible; for singular $\mathbf L$ (and in fact for rectangular $\mathbf L$ as well), refer to Elden's paper for the required transformations (which involve the use of the QR decomposition).

In particular, if we let $\mathbf y=\mathbf L\mathbf x$, one can see that an equivalent standard Tikhonov problem is

$\min_{\mathbf y}\|\mathbf A\mathbf L^{-1}\mathbf y-\mathbf b\|^2+\alpha^2\|\mathbf y\|^2$

from which you can use the usual formulae for Tikhonov regularization to solve for $\mathbf y$. From this, one obtains the solution to the general problem as $\mathbf x=\mathbf L^{-1}\mathbf y$.
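
Below is a small NumPy sketch of this reduction (the helper name and random test data are illustrative; $\mathbf L$ is taken diagonal so that, with $\alpha=1$, it plays the role of the $\Gamma$ in the question), checked against the normal equations $(\mathbf A^\top\mathbf A+\alpha^2\mathbf L^\top\mathbf L)\mathbf x=\mathbf A^\top\mathbf b$:

```python
import numpy as np

def general_tikhonov(A, b, L, alpha):
    """min ||Ax - b||^2 + alpha^2 ||Lx||^2 for invertible L, via y = Lx."""
    L_inv = np.linalg.inv(L)            # for diagonal L this is just 1/diagonal
    U, s, Vt = np.linalg.svd(A @ L_inv, full_matrices=False)
    d = s / (s**2 + alpha**2)
    y = Vt.T @ (d * (U.T @ b))          # standard-form Tikhonov solution in y
    return L_inv @ y                    # map back: x = L^{-1} y

# diagonal case from the question: L = diag(alpha_1, ..., alpha_n), take alpha = 1
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
L = np.diag(rng.uniform(0.5, 2.0, size=5))
x = general_tikhonov(A, b, L, alpha=1.0)
x_ne = np.linalg.solve(A.T @ A + L.T @ L, A.T @ b)   # normal equations check
print(np.allclose(x, x_ne))             # True
```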

  • 1
    As I said, you can now use the formula you gave. The only difference is that you need to compute the SVD of $\mathbf A\mathbf L^{-1}$ instead of just $\mathbf A$ (and it is easy to invert a diagonal matrix). After all the Tikhonov machinery, multiply the result by $\mathbf L^{-1}$. (2011-09-21)
0

Observe that

$\min_x\left\|b - Ax \right\|_2^2 + \left\|\Gamma x\right\|_2^2= \min_{x} \left\| \left[ \begin{array}{c} b \\ 0 \end{array} \right] - \left[\begin{array}{c} A \\ \Gamma \end{array}\right]x\right\|_2^2$

which is your original problem reformulated as a least squares problem. $x$ is a solution of the minimization problem if and only if it satisfies the normal equations $\left(A^TA + \Gamma^T \Gamma\right) x = A^T b$. This is true for any $\Gamma$. For $\Gamma$ of full column rank we have a unique solution $x = \left(A^TA + \Gamma^T \Gamma\right)^{-1} A^Tb$. Using this formula you can invoke the SVD of $A$ and study the effect of your regularization matrix $\Gamma^T \Gamma$ for various structures of $\Gamma$.
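
As a quick illustration (the diagonal $\Gamma$ and the random data are made up), the following NumPy sketch solves the normal equations and confirms that the stacked least-squares formulation gives the same $x$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
Gamma = np.diag(rng.uniform(0.1, 1.0, size=5))   # one alpha_i per row, as in the question

# normal equations: (A^T A + Gamma^T Gamma) x = A^T b
x = np.linalg.solve(A.T @ A + Gamma.T @ Gamma, A.T @ b)

# the same x from the stacked least-squares formulation
A_aug = np.vstack([A, Gamma])
b_aug = np.concatenate([b, np.zeros(5)])
x_lstsq, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
print(np.allclose(x, x_lstsq))                   # True
```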