You can partition the matrix as $ \begin{bmatrix} 0 & v^\top \\ v & A \end{bmatrix} $, where $A$ is a symmetric, invertible $2\times2$ matrix and $v$ a $2\times1$ vector. Provided $v^\top A^{-1} v \neq 0$, the inverse is $ \begin{bmatrix} 0 & v^\top \\ v & A \end{bmatrix}^{-1} = \begin{bmatrix} -(v^\top A^{-1} v)^{-1} & T^\top \\ T & A^{-1}-T v^\top A^{-1} \end{bmatrix}, $ where $T = A^{-1} v (v^\top A^{-1} v)^{-1} $ is the weighted pseudoinverse of $v$: it satisfies $T^\top v = v^\top T = 1$.
Proof: $ \begin{bmatrix} -(v^\top A^{-1} v)^{-1} & T^\top \\ T & A^{-1}-T v^\top A^{-1} \end{bmatrix} \begin{bmatrix} 0 & v^\top \\ v & A \end{bmatrix} = \begin{bmatrix} T^\top v & -(v^\top A^{-1} v)^{-1} v^\top+T^\top A \\ A^{-1}v-T v^\top A^{-1} v & T v^\top + A^{-1} A - T v^\top \end{bmatrix} $
Using $T^\top v = 1$, the simplification $T^\top A = (v^\top A^{-1} v)^{-1} v^\top$ (so the top-right block vanishes), and $T\, v^\top A^{-1} v = A^{-1} v$ (so the bottom-left block vanishes), the above becomes
$ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} $
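As a sanity check, the formula can also be verified numerically. Here is a small NumPy sketch; the random seed and the choice of a generic symmetric $A$ are arbitrary, not taken from the derivation above:

```python
import numpy as np

# Numerical check of the block-inverse formula for M = [[0, v^T], [v, A]].
rng = np.random.default_rng(0)
B = rng.standard_normal((2, 2))
A = B + B.T                        # symmetric 2x2 (invertible for this seed)
v = rng.standard_normal((2, 1))

Ainv = np.linalg.inv(A)
s = float(v.T @ Ainv @ v)          # the scalar v^T A^{-1} v (must be nonzero)
T = Ainv @ v / s                   # weighted pseudoinverse of v

M = np.block([[np.zeros((1, 1)), v.T],
              [v, A]])
Minv = np.block([[np.array([[-1.0 / s]]), T.T],
                 [T, Ainv - T @ v.T @ Ainv]])

assert np.allclose(Minv @ M, np.eye(3))   # the claimed inverse works
assert np.isclose(float(T.T @ v), 1.0)    # T^T v = 1
```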
Example:
$A = \begin{pmatrix} \lambda & -1 \\ -1 & 0 \end{pmatrix} $, $v = \begin{pmatrix} 0 \\ \lambda \end{pmatrix}$, $T = \begin{pmatrix} \lambda & -1 \\ -1 & 0 \end{pmatrix}^{-1} \begin{pmatrix} 0 \\ \lambda \end{pmatrix} \left(\begin{pmatrix} 0 & \lambda \end{pmatrix} \begin{pmatrix} \lambda & -1 \\ -1 & 0 \end{pmatrix}^{-1} \begin{pmatrix} 0 \\ \lambda \end{pmatrix} \right)^{-1} = \begin{pmatrix} \frac{1}{\lambda^2} \\ \frac{1}{\lambda} \end{pmatrix}$
$ \begin{bmatrix} -(v^\top A^{-1} v)^{-1} & T^\top \\ T & A^{-1}-T v^\top A^{-1} \end{bmatrix} = \begin{bmatrix} \begin{pmatrix} \frac{1}{\lambda^3} \end{pmatrix} & \begin{pmatrix} \frac{1}{\lambda^2} & \frac{1}{\lambda} \end{pmatrix} \\ \begin{pmatrix} \frac{1}{\lambda^2} \\ \frac{1}{\lambda} \end{pmatrix} & \begin{pmatrix} \frac{1}{\lambda} & 0 \\ 0 & 0 \end{pmatrix} \end{bmatrix} $
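The worked example can likewise be checked numerically; the concrete value $\lambda = 2$ below is an arbitrary choice (any $\lambda \neq 0$ works):

```python
import numpy as np

lam = 2.0                          # arbitrary nonzero lambda
A = np.array([[lam, -1.0], [-1.0, 0.0]])
v = np.array([[0.0], [lam]])

Ainv = np.linalg.inv(A)
s = float(v.T @ Ainv @ v)          # equals -lam^3
T = Ainv @ v / s                   # expected: (1/lam^2, 1/lam)^T

assert np.isclose(s, -lam**3)
assert np.allclose(T.ravel(), [1 / lam**2, 1 / lam])
assert np.isclose(-1.0 / s, 1 / lam**3)            # top-left entry
assert np.allclose(Ainv - T @ v.T @ Ainv,          # bottom-right block
                   [[1 / lam, 0.0], [0.0, 0.0]])
```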