I've read that you should avoid computing an explicit matrix inverse, since you generally don't need one, but I don't know the best way to avoid it here. I need to compute:
$x = \mathbf v \mathbf A^{-1}\mathbf v^\top$
where $x$ is a scalar, $\mathbf v$ is a row vector, $\mathbf A$ is a symmetric positive definite matrix (but perhaps with eigenvalues close to $0$) and ${}^\top$ means transpose.
I'm using numpy/scipy, so feel free to express an answer using their functions.
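For concreteness, here's a toy version of what I'm doing now (the explicit-inverse form I'd like to avoid), next to an inverse-free alternative I've seen suggested: factor $\mathbf A$ once and solve $\mathbf A \mathbf y = \mathbf v^\top$, then take $x = \mathbf v \mathbf y$. The `cho_factor`/`cho_solve` calls assume $\mathbf A$ really is SPD; the test matrix below is just made-up example data.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T + 1e-6 * np.eye(5)   # toy SPD matrix (small shift to keep it definite)
v = rng.standard_normal(5)       # treated as a row vector

# What I'm computing now: forms the explicit inverse.
x_inv = v @ np.linalg.inv(A) @ v

# Inverse-free: Cholesky-factor A once, solve A y = v^T, then x = v y.
c, low = cho_factor(A)
y = cho_solve((c, low), v)
x_solve = v @ y
```

For a general (non-SPD) matrix the same pattern would use `scipy.linalg.solve(A, v)` in place of the Cholesky pair.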
EDIT:
What are the pros and cons of the least-squares approach versus an eigendecomposition?
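To make the comparison concrete, here's a sketch of the two alternatives on a toy SPD matrix. The eigendecomposition version uses the identity $\mathbf v \mathbf A^{-1}\mathbf v^\top = \sum_i (\mathbf Q^\top \mathbf v^\top)_i^2 / \lambda_i$ for $\mathbf A = \mathbf Q\,\mathrm{diag}(\lambda)\,\mathbf Q^\top$, which also makes it easy to inspect or threshold the near-zero eigenvalues. The example data is made up.

```python
import numpy as np
from scipy.linalg import lstsq, eigh

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = B @ B.T                      # toy SPD matrix
v = rng.standard_normal(5)

# Least squares: minimize ||A y - v||, then x = v y.
# Robust when A is near-singular, but doesn't exploit symmetry.
y, *_ = lstsq(A, v)
x_lstsq = v @ y

# Eigendecomposition: A = Q diag(w) Q^T, so x = sum((Q^T v)^2 / w).
# Small eigenvalues w[i] are visible here and could be clipped explicitly.
w, Q = eigh(A)
z = Q.T @ v
x_eig = np.sum(z**2 / w)
```

Both should agree with the inverse-based result up to conditioning; it's the near-zero eigenvalues that would make them differ.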