
Let $f: \mathbb{R}^n \rightarrow \mathbb{R}$ be a differentiable, strictly convex function, and suppose $x^*$ is its unique minimizer. One can show that for any $x \in \mathbb{R}^n$ with $x \ne x^*$, the inner product $\langle \frac{\partial f}{\partial x}, x^* - x \rangle$ is strictly negative. Write $\tilde{x} = x^* - x$, and suppose also that the gradient $\frac{\partial f}{\partial x}$ depends linearly on $x$.

Is it possible to show that there exist constants $\alpha > 0$ and $\beta > 0$ such that $\langle \frac{\partial f}{\partial x}, x^* - x \rangle \le - \alpha \| \tilde{x} \|^{\beta}$, and, in particular, that $\beta=2$ works when the gradient depends linearly on $x$?
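A quick numerical sketch (my own illustration, not from the question) suggesting why the linear-gradient assumption matters for $\beta = 2$: the function $f(x) = x_1^4 + x_2^2$ is strictly convex with unique minimizer $x^* = 0$, and $\langle \nabla f(x), x^* - x \rangle = -(4x_1^4 + 2x_2^2) < 0$ for $x \ne 0$, but along the $x_1$-axis this equals $-4t^4 = o(t^2)$, so no bound $-\alpha\|\tilde{x}\|^2$ with $\alpha > 0$ can hold near the origin:

```python
import numpy as np

# f(x) = x1^4 + x2^2: strictly convex, unique minimizer x* = 0.
# <grad f(x), x* - x> = -(4 x1^4 + 2 x2^2), strictly negative off 0,
# but of order t^4 along the x1-axis, so beta = 2 fails there.

def grad_f(x):
    """Gradient of f(x) = x[0]**4 + x[1]**2."""
    return np.array([4 * x[0]**3, 2 * x[1]])

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=2)
    assert grad_f(x) @ (-x) < 0  # strictly negative away from x* = 0

t = 1e-3
x = np.array([t, 0.0])
print(grad_f(x) @ (-x))  # roughly -4 * t**4, far smaller than t**2
```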

1 Answer


Without loss of generality $x^\ast = 0$. If $\frac{\partial f}{\partial x}$ depends linearly on $x$ as you assume, then it has the form $Ax$, where $A$ is symmetric because it is the Hessian of $f$. Integrating, $f(x) = \frac{1}{2}x^TAx$ up to an additive constant, which does not affect the gradient. Since $f$ is strictly convex, $A$ is positive definite. Then $$ \left\langle \frac{\partial f}{\partial x}, -x \right\rangle = -x^TAx \le - \alpha \|x\|^2, $$ where $\alpha > 0$ is the smallest eigenvalue of $A$.
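The bound above is easy to check numerically. Below is a sketch with an arbitrary symmetric positive definite $A$ (chosen for illustration): for $f(x) = \frac{1}{2}x^TAx$ with $x^* = 0$, the inner product $-x^TAx$ is bounded above by $-\alpha\|x\|^2$ with $\alpha = \lambda_{\min}(A)$:

```python
import numpy as np

# Build an arbitrary symmetric positive definite A for the check.
rng = np.random.default_rng(1)
M = rng.normal(size=(3, 3))
A = M @ M.T + np.eye(3)           # symmetric positive definite
alpha = np.linalg.eigvalsh(A)[0]  # smallest eigenvalue, > 0

for _ in range(1000):
    x = rng.normal(size=3)
    lhs = (A @ x) @ (-x)          # <grad f(x), x* - x> with x* = 0
    # -x^T A x <= -alpha ||x||^2 (small tolerance for float error)
    assert lhs <= -alpha * (x @ x) + 1e-9
```

The inequality $x^TAx \ge \lambda_{\min}(A)\,\|x\|^2$ used here is the standard Rayleigh quotient bound for symmetric matrices.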