I'll use primes to denote "transpose" to make typing easier.
Assume $v$ is a unit vector. Pick $w_1, \ldots, w_{n-1}$ so that $\{w_1, w_2, \ldots, w_{n-1}, v\}$ is an orthonormal basis, and let $W$ be the matrix containing these vectors as its columns. Then
$$
(W' M W)' = W' M' (W')' = W' M' W = W' M W
$$
so $W' M W$ is symmetric. Also,
$$W'PW = W' (I - v v') W = W'IW - W'v v'W = I - (W'v)(W'v)'
$$
so $W' P W$ is also a projection matrix, projecting orthogonally to the unit vector $W'v$.
Now since $v$ is the last column of $W$, we have $W' v = e_n$, which means that $W'PW$ is the $n \times n$ identity, except that the bottom-right entry is a zero.
Conclusion of this initial discussion: the problem for $P$ and $M$ can be converted, by a change of basis, to a similar problem in which $v$ is just $e_n$ and $M$ is some other symmetric matrix.
So: we'll assume $v = e_n$ from here on, so that $P$ is simply that identity matrix with the bottom right entry set to $0$.
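If you want to sanity-check this reduction numerically, here's a minimal sketch (assuming numpy; building $W$ via QR is my own choice, not forced by the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Pick a random unit vector v, complete it to an orthonormal basis with
# v (up to sign) as the LAST column of W, and check that W' P W is the
# identity with its bottom-right entry zeroed.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)

# QR of a matrix whose first column is v yields an orthonormal basis
# whose first column is +/- v; reorder so that column comes last.
A = np.column_stack([v, rng.standard_normal((n, n - 1))])
Q, _ = np.linalg.qr(A)
W = np.column_stack([Q[:, 1:], Q[:, 0]])

P = np.eye(n) - np.outer(v, v)
target = np.eye(n)
target[-1, -1] = 0.0
print(np.allclose(W.T @ P @ W, target))  # True: in the W basis, v becomes e_n
```

(The sign ambiguity in $Q$'s first column doesn't matter, since $(W'v)(W'v)'$ is the same either way.)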
Now let's write the matrix $M$ in the form
$$
\begin{bmatrix}
& & & & | \\
& & N & & a \\
& & & & | \\
& - & a'& - & c
\end{bmatrix}
$$
where this is meant to indicate that the matrix $N$ is $(n-1) \times (n-1)$, the right-hand column of $M$ consists of the $(n-1)$-vector $a$ with the last entry $c$, and the bottom row is just the transpose of this.
Since $N$ is positive-definite symmetric, it's diagonalizable, with orthonormal eigenvectors $u_1, \ldots, u_{n-1} \in \Bbb R^{n-1}$ and positive eigenvalues $\lambda_1, \ldots, \lambda_{n-1}$. Letting
$$
i : \Bbb R^{n-1} \to \Bbb R^n :
\begin{bmatrix}x_1\\ \vdots \\x_{n-1} \end{bmatrix} \mapsto
\begin{bmatrix}x_1\\ \vdots \\x_{n-1} \\0\end{bmatrix}
$$
we get vectors $t_j = i(u_j)$ for $j = 1, \ldots, n-1$.
What is $Mt_j$? It's just $Nu_j$ atop $a \cdot u_j$, i.e., it's $\lambda_j u_j$ extended by something in the last entry. Hold that thought.
Now let $U$ be a matrix whose columns are $t_1, \ldots, t_{n-1}, e_n$. $U$ is evidently orthogonal. Let's compute
$$
U' PM U
$$
by starting from $MU$ (whose $j$th column is $Nu_j$ atop $a \cdot u_j$, except for the $n$th column, which is $Me_n$, i.e., $a$ atop $c$). Multiplying by $P$ sets the last entry of each column to $0$, so we end up with the matrix
$$
PMU = \begin{bmatrix}
| & & | & | \\
\lambda_1 u_1 & \ldots & \lambda_{n-1} u_{n-1} & a \\
| & & | & | \\
0 & \ldots & 0 & 0
\end{bmatrix}
$$
Multiplying that by $U'$ gives a diagonal upper-left block, $0$s along the bottom, and some dot products in the right-hand column, and hence the general form
$$
U'PMU = \begin{bmatrix}
\lambda_1 & & & u_1 \cdot a \\
& \ddots & & \vdots \\
& & \lambda_{n-1} & u_{n-1} \cdot a \\
0 & \ldots & 0 & 0
\end{bmatrix}
$$
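Here's a quick numerical check of that arrow-shaped form (again assuming numpy, with $v = e_n$; the random positive-definite $M$ is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Random symmetric positive-definite M, split into blocks N, a (and c).
B = rng.standard_normal((n, n))
M = B @ B.T + n * np.eye(n)
N, a = M[:-1, :-1], M[:-1, -1]

lam, Uu = np.linalg.eigh(N)   # columns of Uu are the orthonormal u_j
U = np.zeros((n, n))
U[:-1, :-1] = Uu              # t_j = i(u_j) as the first n-1 columns
U[-1, -1] = 1.0               # e_n as the last column

P = np.eye(n)
P[-1, -1] = 0.0
X = U.T @ P @ M @ U

print(np.allclose(X[:-1, :-1], np.diag(lam)))  # upper-left block is diag(lambda_j)
print(np.allclose(X[:-1, -1], Uu.T @ a))       # last column holds the u_j . a
print(np.allclose(X[-1, :], 0.0))              # bottom row is zero
```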
The characteristic polynomial of that matrix is
$$
c(t) = (\lambda_1 - t)\cdots (\lambda_{n-1} - t)(-t)
$$
with associated eigenvectors $e_1, \dots, e_{n-1}, q$, where $q$ is the eigenvector for the eigenvalue $0$. Solving $(U'PMU)q = 0$ row by row, one can take
$$
q = \begin{bmatrix} (u_1 \cdot a)/\lambda_1 \\ \vdots \\ (u_{n-1} \cdot a)/\lambda_{n-1} \\ -1 \end{bmatrix},
$$
which in particular has a nonzero $n$th component.
Whew! Now we know that in the $U$ basis, $PM$ has a full complement of eigenvectors, and hence $PM$ is diagonalizable (in the sense that we can form a matrix $E$ whose columns are the eigenvectors and write
$$
E^{-1} PM E
$$
to get a diagonal matrix).
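The diagonalizability claim is also easy to check numerically (a sketch assuming numpy; I let `eig` produce the eigenvector matrix $E$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# With v = e_n, the product P M is just M with its last row zeroed out.
B = rng.standard_normal((n, n))
M = B @ B.T + n * np.eye(n)   # random symmetric positive-definite M
P = np.eye(n)
P[-1, -1] = 0.0

evals, E = np.linalg.eig(P @ M)
D = np.linalg.inv(E) @ (P @ M) @ E

print(np.allclose(D, np.diag(evals)))  # E^{-1} PM E is diagonal
print(abs(evals).min() < 1e-8)         # one eigenvalue is (numerically) 0
```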
On the other hand, unless all the $a \cdot u_j$ entries are zero, the matrix $PM$ is not orthogonally diagonalizable, i.e., there's no orthogonal matrix $F$ with
$$
F' PM F
$$
diagonal, for if there were, the columns of $F$ would have to be eigenvectors of $PM$, which would have to be orthogonal, which would require that all the $a \cdot u_j$ entries be zeroes.
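One way to see (or test) this failure: a real matrix is orthogonally diagonalizable exactly when it is symmetric, and for generic $M$ the product $PM$ isn't even normal. A numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4

B = rng.standard_normal((n, n))
M = B @ B.T + n * np.eye(n)   # random symmetric positive-definite M
P = np.eye(n)
P[-1, -1] = 0.0

A = P @ M                     # M with its last row zeroed
# For generic M the vector a is not orthogonal to the u_j, and then
# A A' differs from A' A, so no orthogonal F can diagonalize A.
print(np.allclose(A @ A.T, A.T @ A))  # False for this generic example
```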
Note that since the $\lambda$s are all positive and $0$ is nonnegative, $PM$ actually does have a square root as well.
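That square root can be computed from the eigendecomposition, sketched here with numpy (the clipping guards against a roundoff-level negative in the zero eigenvalue; that guard is my addition, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

B = rng.standard_normal((n, n))
M = B @ B.T + n * np.eye(n)   # random symmetric positive-definite M
P = np.eye(n)
P[-1, -1] = 0.0

evals, E = np.linalg.eig(P @ M)            # eigenvalues: lambda_1..lambda_{n-1}, 0
root = np.sqrt(np.clip(evals, 0.0, None))  # square roots of the nonnegative eigenvalues
R = E @ np.diag(root) @ np.linalg.inv(E)

print(np.allclose(R @ R, P @ M))  # R squares back to PM
```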