My question is about gradient algorithms.
Let's take a function $f$ such as $f(x) = \|Ax-b\|^2$, whose minimum (with respect to $x$) I want to find. For that I can use some gradient method, for instance gradient descent (see http://en.wikipedia.org/wiki/Gradient_descent).
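(For reference, here is a minimal NumPy sketch of what I mean by plain gradient descent on this $f$, using the gradient $\nabla_x f = 2A^T(Ax-b)$; the step size and tolerance are just illustrative values I picked:)

```python
import numpy as np

def grad_descent_x(A, b, x0, lr=1e-3, tol=1e-8, max_iter=10000):
    """Minimize f(x) = ||Ax - b||^2 over x by plain gradient descent."""
    x = x0.astype(float)
    for _ in range(max_iter):
        grad = 2.0 * A.T @ (A @ x - b)   # gradient of ||Ax - b||^2 w.r.t. x
        if np.linalg.norm(grad) < tol:   # stop once the gradient is tiny
            break
        x -= lr * grad                   # step along the negative gradient
    return x
```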
I would like to know whether there is an algorithm that solves the above expression with $A$ as the unknown parameter, $\min_A f(A) = \|Ax-b\|^2$ (or some other functions like $\|x_i-x_j\|_A = \sqrt{(x_i-x_j)^TA(x_i-x_j)}$), and what it would look like.
I could only find solutions where $x$ is the argument. Thanks.
EDIT: For clarification, here is an example. From the wiki, a solution exists for the problem $f(x) = \|Ax-b\|^2$: we can use (for instance) the conjugate gradient algorithm as follows (from the wiki; a runnable sketch follows the pseudocode):
- $\mathbf{r}_0 := \mathbf{b} - \mathbf{A x}_0 \,$
- $\mathbf{p}_0 := \mathbf{r}_0 \,$
- $k := 0 \, $
- repeat
- $\alpha_k := \frac{\mathbf{r}_k^\mathrm{T} \mathbf{r}_k}{\mathbf{p}_k^\mathrm{T} \mathbf{A p}_k} \, $
- $\mathbf{x}_{k+1} := \mathbf{x}_k + \alpha_k \mathbf{p}_k \, $
- $\mathbf{r}_{k+1} := \mathbf{r}_k - \alpha_k \mathbf{A p}_k \, $
- if $\mathbf{r}_{k+1}$ is sufficiently small then exit loop end if
- $\beta_k := \frac{\mathbf{r}_{k+1}^\mathrm{T} \mathbf{r}_{k+1}}{\mathbf{r}_k^\mathrm{T} \mathbf{r}_k} \, $
- $\mathbf{p}_{k+1} := \mathbf{r}_{k+1} + \beta_k \mathbf{p}_k \,$
- $k := k + 1 \, $
- end repeat
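For concreteness, a direct NumPy transcription of the steps above might look like this (note that this textbook CG solves $Ax = b$ for symmetric positive definite $A$; for the general least-squares problem one would apply it to the normal equations $A^TAx = A^Tb$):

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Solve Ax = b (A symmetric positive definite) via the steps above."""
    x = x0.astype(float)
    r = b - A @ x                         # r_0 := b - A x_0
    p = r.copy()                          # p_0 := r_0
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # alpha_k
        x = x + alpha * p                 # x_{k+1} := x_k + alpha_k p_k
        r_new = r - alpha * Ap            # r_{k+1} := r_k - alpha_k A p_k
        if np.linalg.norm(r_new) < tol:   # exit when r_{k+1} is small
            return x
        beta = (r_new @ r_new) / (r @ r)  # beta_k
        p = r_new + beta * p              # p_{k+1} := r_{k+1} + beta_k p_k
        r = r_new
    return x
```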
Then my question is: what if the unknown parameter is $A$, not $x$ ($A$ is a numeric matrix; $x$ and $b$ are column vectors)? How would the problem be defined, and how would the algorithm change?
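For illustration, here is what I imagine a naive gradient step over $A$ could look like, using $\nabla_A \|Ax-b\|^2 = 2(Ax-b)x^T$ (the step size is a guess on my part, and with a single pair $(x, b)$ the problem looks underdetermined to me, since the gradient has rank one):

```python
import numpy as np

def grad_descent_A(x, b, A0, lr=1e-3, tol=1e-8, max_iter=10000):
    """Naive gradient descent on f(A) = ||Ax - b||^2 over the matrix A."""
    A = A0.astype(float)
    for _ in range(max_iter):
        r = A @ x - b
        grad = 2.0 * np.outer(r, x)   # gradient of ||Ax - b||^2 w.r.t. A
        if np.linalg.norm(grad) < tol:
            break
        A -= lr * grad
    return A
```

Is this the right direction, and what would the analogue be for a conjugate gradient method or for the $\|x_i-x_j\|_A$ metric above?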