Consider the simple optimization problem $$\min_x \|b-Ax\|_2 \ \ \ \ \ \ \text{subject to} \ \ \ \ \ \ \|x\| = 1$$ where $b\in\mathbb{R}^{N}$, $A\in\mathbb{R}^{N\times 3}$, $x\in\mathbb{R}^{3}$ with $N > 3$ (overdetermined system). I need to solve this millions of times, so computational efficiency is important.
I am aware of the following aspects:
- The norm equality constraint $\|x\| = 1$ makes the problem non-convex.
- A relaxation to $\|x\| \leq 1$ leads to a convex optimization problem.
- I can employ a two-dimensional parametrization of the unit vector $x\in\mathbb{R}^{3}$, e.g. via azimuth angle $\phi$ and polar angle $\theta$, and use a gradient search (or similar) to solve the unconstrained problem $\min_{\phi,\theta} \|b-Ax(\phi,\theta)\|_2$ instead.
- The problem can be written in quadratic fashion $$\min_x \ x^\text{T}A^\text{T}Ax - 2b^\text{T}Ax \ \ \ \ \ \ \text{s.t.} \ \ \ \ \ \ x^\text{T}x = 1 \ .$$
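For concreteness, here is a small NumPy sketch (easily translated to Matlab) checking that the quadratic form above agrees with the original objective on the unit sphere; the angle values $\phi,\theta$ and the random $A$, $b$ are arbitrary placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10
A = rng.standard_normal((N, 3))
b = rng.standard_normal(N)

# a unit vector x built from (arbitrary) azimuth phi and polar angle theta
phi, theta = 0.7, 1.2
x = np.array([np.sin(theta) * np.cos(phi),
              np.sin(theta) * np.sin(phi),
              np.cos(theta)])
assert np.isclose(np.linalg.norm(x), 1.0)

f_norm = np.linalg.norm(b - A @ x) ** 2       # squared residual ||b - Ax||^2
f_quad = x @ (A.T @ A) @ x - 2 * b @ (A @ x)  # quadratic reformulation

# the two objectives differ only by the constant ||b||^2,
# which is irrelevant for the argmin
assert np.isclose(f_norm, f_quad + b @ b)
```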
My questions are:
- Is there a super smart and super efficient way to solve my problem? If yes, a minimal Matlab code example that does so would be amazing.
- Is "quadratic programming" applicable here?
- Does a relaxation to $\|x\| \geq 1$ also lead to a convex problem? Would it be meaningful to first solve subject to $\|x\| \leq 1$ and then subject to $\|x\| \geq 1$, in the hope that one of the two solutions satisfies $\|x\| = 1$?
- A minimal Matlab example for the relaxed problem s.t. $\|x\| \leq 1$ would be amazing (this should be easy, but I never used CVX or similar).
- I found that a normalized pseudo-inverse solution, i.e. $x \approx y\,/\,\|y\|$ with $y = (A^\text{T}A)^{-1}A^\text{T} b$, is quite often a decent approximation. This can be the case even if $\|y\|$ is far from $1$. Can you identify a condition under which this approximation is accurate/meaningful?
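To make the last point precise, this is the approximation I currently use, as a NumPy sketch (in Matlab I use `A\b`; `lstsq` is the numerically preferable equivalent of forming $(A^\text{T}A)^{-1}A^\text{T}b$ explicitly; the random $A$, $b$ are placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10
A = rng.standard_normal((N, 3))
b = rng.standard_normal(N)

# unconstrained least-squares solution y = pinv(A) @ b
y, *_ = np.linalg.lstsq(A, b, rcond=None)

# project onto the unit sphere to (approximately) satisfy ||x|| = 1
x_approx = y / np.linalg.norm(y)

assert np.isclose(np.linalg.norm(x_approx), 1.0)
```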
Info: I never really studied the fine arts of convex optimization and linear/quadratic programming. So, when writing your response, please do not omit details that might be obvious to experts.
Thank you very much!