
How can I solve the following optimization problem, and with which algorithm?

$\min\limits_{B}\ \lVert ABC-AC\rVert_F^2+\lambda\lVert B\rVert_{2,1}$

s.t. $B_{ii}=0$ (Diagonal elements of $B$ must be zero)

where:

$\lVert .\rVert_F^2 \ $ denotes the squared Frobenius norm

$\lVert .\rVert_{2,1}$ denotes the $L_{2,1}$-norm (the sum of the $\ell_2$-norms of the rows)

$A$ is a $k\times m$ matrix ($A$ is given)

$B$ is an $m\times m$ matrix ($B$ should be found)

$C$ is an $m\times l$ matrix ($C$ is given)

$\lambda$ is a scalar ($\lambda$ is given)
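As a sanity check, the objective can be written directly in NumPy; the dimensions and the random data below are made-up placeholders, not part of the problem:

```python
import numpy as np

# Hypothetical small dimensions, for illustration only.
k, m, l = 4, 6, 5
rng = np.random.default_rng(0)
A = rng.standard_normal((k, m))
C = rng.standard_normal((m, l))
lam = 0.1

def objective(B, A, C, lam):
    """||ABC - AC||_F^2 + lam * ||B||_{2,1}, where the L2,1-norm
    is the sum of the 2-norms of the rows of B."""
    fit = np.linalg.norm(A @ B @ C - A @ C, "fro") ** 2
    l21 = np.sum(np.linalg.norm(B, axis=1))
    return fit + lam * l21
```

For instance, $B=0$ gives $\lVert AC\rVert_F^2$, and $B=I$ gives $\lambda m$ (zero fit term, $m$ unit rows).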

I know the gradient of the objective with respect to $B$ is:

$2A^{\top}ABCC^{\top}-2A^{\top}ACC^{\top}+2\lambda DB$

where $D$ is an $m\times m$ diagonal matrix with $D_{ii}=\frac{1}{2\lVert B_i \rVert_2}$, and $\lVert B_i \rVert_2$ is the $\ell_2$-norm of the $i^{th}$ row of $B$. The gradient-descent update rule for $B$ is:

$B^{n+1}= B^{n} - \eta (A^{\top}AB^n CC^{\top}-A^{\top}ACC^{\top}+\lambda D^nB^n)$

$B^{n+1}_{ii} = 0$

This algorithm is very sensitive to the initialization of $B$ and to the learning rate $\eta$, and it converges very slowly.
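The update rule above can be sketched as a short NumPy routine. The `eps` floor on the row norms and the initialization scale are my own assumptions (the subgradient of the $L_{2,1}$-norm is set-valued at a zero row, so $D_{ii}$ must be guarded against division by zero):

```python
import numpy as np

def grad_descent(A, C, lam=0.1, eta=1e-3, iters=500, eps=1e-8, seed=0):
    """Projected (sub)gradient descent for
    min_B ||ABC - AC||_F^2 + lam*||B||_{2,1}  s.t.  B_ii = 0.
    eps guards D_ii = 1/(2*||B_i||_2) when a row of B is near zero."""
    k, m = A.shape
    rng = np.random.default_rng(seed)
    B = 0.01 * rng.standard_normal((m, m))   # small random init (assumption)
    np.fill_diagonal(B, 0.0)
    AtA = A.T @ A
    CCt = C @ C.T
    G0 = AtA @ CCt                            # constant term A^T A C C^T
    for _ in range(iters):
        row_norms = np.linalg.norm(B, axis=1)
        D = np.diag(1.0 / (2.0 * np.maximum(row_norms, eps)))
        step = AtA @ B @ CCt - G0 + lam * D @ B
        B = B - eta * step
        np.fill_diagonal(B, 0.0)              # project back onto B_ii = 0
    return B
```

The diagonal constraint is handled by projection after each step, matching the $B^{n+1}_{ii}=0$ rule above.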

Is there another algorithm for solving this problem?
