
My class has recently learnt the BFGS method for unconstrained optimisation. In this procedure, a positive definite matrix receives a rank-two update (the sum of two rank-one corrections) at each step.

This is specified as:

$H_{k+1} = H_k + \frac{\eta\eta^T}{\delta^T\eta}-\frac{H_k\delta\delta^TH_k^T}{\delta^TH_k\delta}$

where $\eta, \delta \in \mathbb{R}^n$.

Show that for any symmetric positive definite matrix $H_k$, the updated matrix $H_{k+1}$ is positive definite provided $\delta^T\eta > 0$. Don't assume anything about $H_k$ other than that it is symmetric positive definite.
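For concreteness, here is a small numerical sketch of the update itself (NumPy; the variable names `H_k`, `delta`, `eta` are my own, chosen to match the formula, and the data is random rather than coming from an actual BFGS iteration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build a symmetric positive definite H_k by construction.
A = rng.standard_normal((n, n))
H_k = A @ A.T + n * np.eye(n)

# Pick delta, eta with delta^T eta > 0 (the curvature condition).
delta = rng.standard_normal(n)
eta = rng.standard_normal(n)
if delta @ eta < 0:
    eta = -eta

# The update formula from above.
H_k1 = (H_k
        + np.outer(eta, eta) / (delta @ eta)
        - (H_k @ np.outer(delta, delta) @ H_k.T) / (delta @ H_k @ delta))
```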

1 Answer

$\def\skp#1{\left<#1\right>}$Since $H_k$ is symmetric positive definite, the form $(x,y) \mapsto \skp{x,y}_k := x^TH_ky$ defines an inner product. By the Cauchy-Schwarz inequality,
\[ \skp{x,y}_k^2 \le \skp{x,x}_k\skp{y,y}_k \]
for all $x,y \in \mathbb{R}^n$, with strict inequality whenever $x$ and $y$ are not linearly dependent.

Now let $x \in \mathbb{R}^n\setminus\{0\}$. Then
\begin{align*}
\skp{x,x}_{k+1} &= x^TH_{k+1}x\\
&= x^TH_k x + x^T\frac{\eta\eta^T}{\delta^T\eta}x - x^T\frac{H_k\delta\delta^TH_k^T}{\delta^TH_k\delta}x\\
&= \skp{x,x}_k + \frac{1}{\delta^T\eta}x^T\eta\eta^Tx - \frac{1}{\skp{\delta,\delta}_k}\skp{x,\delta}_k\skp{\delta,x}_k\\
&= \skp{x,x}_k + \frac{1}{\delta^T\eta}(x^T\eta)^2 - \frac{1}{\skp{\delta,\delta}_k}\skp{x,\delta}_k^2.
\end{align*}
The middle term is nonnegative because $\delta^T\eta > 0$. If $x$ and $\delta$ are not linearly dependent, we can estimate further using the strict Cauchy-Schwarz inequality:
\begin{align*}
\skp{x,x}_{k+1} &> \skp{x,x}_k + 0 - \frac{1}{\skp{\delta,\delta}_k}\skp{x,x}_k\skp{\delta,\delta}_k\\
&= 0.
\end{align*}
Otherwise $x = \lambda\delta$ for some $\lambda \ne 0$ (note $\delta \ne 0$ because $\delta^T\eta > 0$). Then $x^T\eta = \lambda\,\delta^T\eta \ne 0$, Cauchy-Schwarz holds with equality, and
\begin{align*}
\skp{x,x}_{k+1} &= \skp{x,x}_k + \frac{1}{\delta^T\eta}(x^T\eta)^2 - \frac{1}{\skp{\delta,\delta}_k}\skp{x,x}_k\skp{\delta,\delta}_k\\
&= \frac{1}{\delta^T\eta}(x^T\eta)^2\\
&> 0.
\end{align*}
So $\skp{x,x}_{k+1} > 0$ for every $x \ne 0$, and $H_{k+1}$ is positive definite.
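As a sanity check of the decomposition used above, here is a small numerical sketch (NumPy; names and random data are my own): it verifies that $x^TH_{k+1}x$ equals the three-term expression $\skp{x,x}_k + (x^T\eta)^2/(\delta^T\eta) - \skp{x,\delta}_k^2/\skp{\delta,\delta}_k$ and stays strictly positive for the sampled $x \ne 0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Random symmetric positive definite H_k and vectors with delta^T eta > 0.
A = rng.standard_normal((n, n))
H_k = A @ A.T + n * np.eye(n)
delta = rng.standard_normal(n)
eta = rng.standard_normal(n)
if delta @ eta < 0:
    eta = -eta

# BFGS-style update from the question.
H_k1 = (H_k
        + np.outer(eta, eta) / (delta @ eta)
        - (H_k @ np.outer(delta, delta) @ H_k.T) / (delta @ H_k @ delta))

for _ in range(1000):
    x = rng.standard_normal(n)
    # <x,x>_k + (x^T eta)^2 / (delta^T eta) - <x,delta>_k^2 / <delta,delta>_k
    decomposition = (x @ H_k @ x
                     + (x @ eta) ** 2 / (delta @ eta)
                     - (x @ H_k @ delta) ** 2 / (delta @ H_k @ delta))
    assert np.isclose(x @ H_k1 @ x, decomposition)
    assert x @ H_k1 @ x > 0

# All eigenvalues of the updated matrix should be positive.
print("smallest eigenvalue of H_{k+1}:", np.linalg.eigvalsh(H_k1).min())
```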