
My question is about gradient algorithms.

Let's have a function $f$ like $f(x) = \|Ax-b\|^2$, and suppose I want to find its minimum with respect to $x$. Then I can use some gradient method, for instance gradient descent (see http://en.wikipedia.org/wiki/Gradient_descent).
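For concreteness, here is a minimal NumPy sketch of what I mean by a gradient method in $x$; the step size and stopping rule are just illustrative choices:

```python
import numpy as np

def grad_descent_x(A, b, x0, step=1e-3, tol=1e-8, max_iter=10000):
    """Minimize f(x) = ||Ax - b||^2 over x by plain gradient descent."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = 2.0 * A.T @ (A @ x - b)    # gradient of ||Ax - b||^2 w.r.t. x
        if np.linalg.norm(g) < tol:    # stop once the gradient is tiny
            break
        x = x - step * g
    return x
```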

I would like to know whether there is an algorithm that solves the above expression with $A$ as the parameter, $\min_A f(A) = \|Ax-b\|^2$ (or some other functions, like $\|x_i-x_j\|_A = \sqrt{(x_i-x_j)^T A (x_i-x_j)}$), and what it would look like.

I could only find solutions with $x$ as the argument. Thanks.

EDIT: For clarification, here is an example. From the Wikipedia article, a solution exists for this problem: $f(x) = \|Ax-b\|^2$. We can use, for instance, the conjugate gradient algorithm as follows (from the wiki; a runnable transcription follows the listing):

  1. $\mathbf{r}_0 := \mathbf{b} - \mathbf{A x}_0 \,$
  2. $\mathbf{p}_0 := \mathbf{r}_0 \,$
  3. $k := 0 \, $
  4. repeat
    • $\alpha_k := \frac{\mathbf{r}_k^\mathrm{T} \mathbf{r}_k}{\mathbf{p}_k^\mathrm{T} \mathbf{A p}_k} \, $
    • $\mathbf{x}_{k+1} := \mathbf{x}_k + \alpha_k \mathbf{p}_k \, $
    • $\mathbf{r}_{k+1} := \mathbf{r}_k - \alpha_k \mathbf{A p}_k \, $
    • if $\mathbf{r}_{k+1}$ is sufficiently small then exit loop end if
    • $\beta_k := \frac{\mathbf{r}_{k+1}^\mathrm{T} \mathbf{r}_{k+1}}{\mathbf{r}_k^\mathrm{T} \mathbf{r}_k} \, $
    • $\mathbf{p}_{k+1} := \mathbf{r}_{k+1} + \beta_k \mathbf{p}_k \,$
    • $k := k + 1 \, $
  5. end repeat
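Here is a direct NumPy transcription of the steps above; note that conjugate gradient assumes $A$ is symmetric positive definite, in which case minimizing $\|Ax-b\|^2$ coincides with solving $Ax = b$:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Conjugate gradient for Ax = b, with A symmetric positive definite.
    Direct transcription of the steps listed above."""
    x = x0.astype(float)
    r = b - A @ x                            # step 1: initial residual
    p = r.copy()                             # step 2: initial search direction
    for _ in range(max_iter):                # step 4: main loop
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)           # step length along p
        x = x + alpha * p                    # update the iterate
        r_next = r - alpha * Ap              # update the residual
        if np.linalg.norm(r_next) < tol:     # residual small enough: done
            break
        beta = (r_next @ r_next) / (r @ r)
        p = r_next + beta * p                # new conjugate direction
        r = r_next
    return x
```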

Then my question is: what if the unknown parameter is $A$, not $x$ ($A$ is a numeric matrix; $x$ and $b$ are column vectors)? How would the problem be defined, and how would the algorithm change?

2 Answers


If I understand your question, you are asking how to write the linear map $A \mapsto A x$ in the usual manner of a matrix multiplied by a parameter vector.

Represent $A$ by a vector $\hat A \in \mathbb{R}^{n^2}$, where $\hat A = (A_{1,1},\dots,A_{1,n},A_{2,1},\dots,A_{2,n},\dots,A_{n,1},\dots,A_{n,n})^T$ (the rows of $A$ stacked in order), and let $\hat x \in \mathbb{R}^{n \times n^2}$ be

$$\hat x = \left( \begin{array}{cccc} x^T & 0 & \cdots & 0 \\ 0 & x^T & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & x^T \end{array} \right),$$

then you can apply your gradient method to solve $\min_{\hat A} \|\hat x \hat A - b\|^2$.
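A minimal NumPy sketch of this construction; $\hat x$ is exactly the Kronecker product $I_n \otimes x^T$. Note that a single pair $(x, b)$ gives only $n$ equations for the $n^2$ unknowns, so the problem is underdetermined and the least-squares call below returns the minimum-norm solution (the names are illustrative):

```python
import numpy as np

# example: recover some A satisfying Ax = b from a single pair (x, b)
n = 3
rng = np.random.default_rng(0)
A_true = rng.standard_normal((n, n))
x = rng.standard_normal(n)
b = A_true @ x

# x_hat = I_n ⊗ x^T: block-diagonal matrix of shape (n, n²)
x_hat = np.kron(np.eye(n), x.reshape(1, -1))

# minimum-norm least-squares solution of min ||x_hat A_hat - b||²;
# gradient descent on the same objective would also work
A_hat, *_ = np.linalg.lstsq(x_hat, b, rcond=None)
A = A_hat.reshape(n, n)        # un-stack the rows back into A

assert np.allclose(A @ x, b)   # the recovered A reproduces b exactly
```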

  • Oh, I'm sorry, I couldn't see it yesterday, but I get it now! This can be applied, thank you very much :] (2012-04-24)

It's called weighted least squares.
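For reference, a minimal sketch of the problem weighted least squares solves: minimizing $(Xw - y)^T W (Xw - y)$ for a given symmetric positive-definite weight matrix $W$, which generalizes $\|Ax-b\|^2$ (the names here are illustrative):

```python
import numpy as np

def weighted_least_squares(X, y, W):
    """Minimize (Xw - y)^T W (Xw - y) over w via the normal equations
    X^T W X w = X^T W y, assuming W is symmetric positive definite."""
    # with W = I this reduces to ordinary least squares, min ||Xw - y||^2
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```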

  • Thank you for the answer. I looked at it, but I'm not sure it solves my question. I added some clarification to the question above. (2012-04-23)