
[image: snapshot of the Taylor-expansion passage from the book]

Additional context: $H = \left[\dfrac{\partial^2 f}{\partial x_i\,\partial x_j}\right]$ is the Hessian matrix. $(3)$

[image: second snapshot from the book]

From my previous question, *What are the functionality of δ symbol and $δr^T$?*, I have a few questions:

  1. I have read more about the Hessian matrix and understand that it uses second-order partial derivatives to describe how a function's rate of change itself varies. The definition above, however, is for a function $f(x_1, x_2, \dots, x_n)$, i.e. many variables along a single index. In image processing I have rows and columns, say $x_1$ to $x_n$ and $y_1$ to $y_n$. How should I approach the Hessian matrix in that case? Going by the additional context $(3)$ above and by Wikipedia, can I simply treat the whole collection $x_1, \dots, x_n, y_1, \dots, y_n$ as the variables indexed by $x_i$ and $x_j$? I just want to be very sure this is correct.

  2. In equation (1) in the snapshot, what is the definition of the gradient $∇_0$ in this case? I have tried reading about it here, but I am not sure what I should be looking for.

  3. From my previous question, I know that the superscript T stands for the transpose of a matrix. But from the description in the image, $r_0$ is just a point and $δr$ is a small change to $r_0$. Shouldn't the change $δr$ be just a scalar? How can we transpose $δr$ in that case? Or am I missing something here?

If I appear to make any mistake above, please correct me.

  • The Taylor expansion in the book is wrong: the second-order term should be $\frac{1}{2}\delta r^T H_0 \delta r$, where $H_0$ is the Hessian at $r = r_0$. (2018-04-29)
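A quick numeric check of the commenter's point, using a made-up one-variable function (not from the book): with the $\frac{1}{2}$ factor the second-order expansion error shrinks like $|\delta r|^3$; without it, the error only shrinks like $|\delta r|^2$.

```python
import numpy as np

f = lambda x: np.exp(x)   # any smooth function works
x0 = 0.5
# For exp, the value, first derivative, and second derivative at x0 coincide.
f0 = g0 = H0 = np.exp(x0)

for dx in (1e-1, 1e-2, 1e-3):
    with_half    = f0 + g0 * dx + 0.5 * H0 * dx**2   # correct expansion
    without_half = f0 + g0 * dx + H0 * dx**2         # the book's version
    print(dx,
          abs(f(x0 + dx) - with_half),      # shrinks ~ dx^3
          abs(f(x0 + dx) - without_half))   # shrinks only ~ dx^2
```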

1 Answer

  1. Yes. $f(x_1,\dots,x_n,y_1,\dots,y_n)$ is simply a function of $2n$ variables.

  2. $\nabla_0$ means the gradient of the function $I$, evaluated at the point $\mathbf{r}_0$ (as it says in the text).

  3. $\mathbf{r}_0$ (and $\delta \mathbf{r}$ too) is viewed as a vector in $\mathbf{R}^n$, if $I$ is a function of $n$ variables.
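To make points 1–3 concrete, here is a small sketch (the function `I` and the helpers are my own illustration, not from the book): it treats an intensity $I$ as a function of two variables, builds the gradient and the $2\times 2$ Hessian from finite-difference partial derivatives, and shows that $\delta\mathbf{r}$ is a vector, so $\delta\mathbf{r}^T H_0\, \delta\mathbf{r}$ is a scalar.

```python
import numpy as np

# Hypothetical smooth "intensity" as a function of two variables (x, y).
def I(r):
    x, y = r
    return np.sin(x) * np.cos(y)

def gradient(f, r, h=1e-5):
    """Central-difference gradient: the vector of first partials at r."""
    r = np.asarray(r, dtype=float)
    g = np.zeros_like(r)
    for i in range(r.size):
        e = np.zeros_like(r); e[i] = h
        g[i] = (f(r + e) - f(r - e)) / (2 * h)
    return g

def hessian(f, r, h=1e-4):
    """Central-difference Hessian: H[i, j] ~ d^2 f / (dx_i dx_j) at r."""
    r = np.asarray(r, dtype=float)
    n = r.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(r + ei + ej) - f(r + ei - ej)
                       - f(r - ei + ej) + f(r - ei - ej)) / (4 * h * h)
    return H

r0 = np.array([0.3, 0.7])
dr = np.array([1e-3, -2e-3])   # a small *vector* displacement, not a scalar

g0 = gradient(I, r0)           # "nabla_0": gradient evaluated at r0
H0 = hessian(I, r0)            # Hessian evaluated at r0

# Second-order Taylor expansion: dr^T H0 dr contracts to a single number.
approx = I(r0) + g0 @ dr + 0.5 * dr @ H0 @ dr
print(abs(I(r0 + dr) - approx))   # tiny residual: the expansion is accurate
```

The same pattern extends directly to question 1: an image sampled on rows and columns is just $I(x, y)$, a function of two variables, so its Hessian is the $2\times 2$ matrix of second partials in $x$ and $y$.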

  • @Karl: Sorry if I was unclear. I thought that if you knew what the Hessian is, you would already be familiar with the gradient (which is the $n \times 1$ matrix of first partial derivatives, just like the Hessian is the $n \times n$ matrix of second partials). (2011-08-21)