As requested, here is the answer I hinted at in my comment:
Theorem 1. Let $n\in\mathbb{N}$. Let $\mathbb{K}$ be a commutative ring.
Let $a_{1},a_{2},\ldots,a_{n}$ be $n$ elements of $\mathbb{K}$. Let
$x\in\mathbb{K}$. Let $A\in\mathbb{K}^{n\times n}$ be the $n\times n$-matrix
$\left(
\begin{array}
[c]{ccccc}
a_{1} & x & x & \cdots & x\\
x & a_{2} & x & \cdots & x\\
x & x & a_{3} & \cdots & x\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
x & x & x & \cdots & a_{n}
\end{array}
\right) $ (this is the $n\times n$-matrix whose diagonal entries are
$a_{1},a_{2},\ldots,a_{n}$, while all its other entries are $x$). For each
$i\in\left\{ 1,2,\ldots,n\right\} $, set $b_{i}=\prod\limits_{\substack{k\in
\left\{ 1,2,\ldots,n\right\} ;\\k\neq i}}\left( a_{k}-x\right) $. Then,
$\det A=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +x\sum\limits_{i=1}^{n}b_{i}$.
To prove this, we recall the following fact (known as the matrix determinant
lemma,
although it would not surprise me if it has several contenders for its rather
generic name):
Lemma 2. Let $n\in\mathbb{N}$. Let $\mathbb{K}$ be a commutative ring. Let
$\mathbf{A}\in\mathbb{K}^{n\times n}$ be an $n\times n$-matrix. Let
$\mathbf{u}\in\mathbb{K}^{n\times1}$ and $\mathbf{v}\in\mathbb{K}^{n\times1}$
be two column vectors. Then,
$\det\left( \mathbf{A}+\mathbf{uv}^{T}\right) =\det\mathbf{A}+\mathbf{v}
^{T}\left( \operatorname*{adj}\mathbf{A}\right) \mathbf{u}$.
(Here and in the following, $\operatorname*{adj}\mathbf{A}$ denotes the
adjugate of $\mathbf{A}$.)
See Matrix determinant lemma with adjugate matrix for a proof of Lemma 2.
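For the skeptical reader, Lemma 2 too can be sanity-checked numerically. The following plain-Python sketch computes the adjugate directly from cofactors and compares both sides of the lemma on an arbitrary $3\times3$ integer example of my choosing:

```python
def det(M):
    """Determinant by Laplace expansion along the first row (fine for small n)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(len(M)))

def adj(M):
    """Adjugate: the transpose of the cofactor matrix of M."""
    n = len(M)
    def minor(r, c):
        return [row[:c] + row[c+1:] for k, row in enumerate(M) if k != r]
    # entry (i, j) of adj(M) is the (j, i)-cofactor of M
    return [[(-1) ** (i + j) * det(minor(j, i)) for j in range(n)]
            for i in range(n)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]   # arbitrary test matrix
u = [1, 2, 3]
v = [4, 5, 6]
# left-hand side: det(A + u v^T)
lhs = det([[A[i][j] + u[i] * v[j] for j in range(3)] for i in range(3)])
# right-hand side: det A + v^T (adj A) u
rhs = det(A) + sum(v[i] * adj(A)[i][j] * u[j]
                   for i in range(3) for j in range(3))
print(lhs, rhs)  # prints 96 96
```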
We shall also use the following simple fact:
Lemma 3. Let $n\in\mathbb{N}$. Let $\mathbb{K}$ be a commutative ring. Let
$a_{1},a_{2},\ldots,a_{n}$ be $n$ elements of $\mathbb{K}$. For each
$i\in\left\{ 1,2,\ldots,n\right\} $, set $b_{i}=\prod\limits_{\substack{k\in
\left\{ 1,2,\ldots,n\right\} ;\\k\neq i}}a_{k}$. Then,
$\operatorname*{adj}\left( \left( a_{i}\delta_{i,j}\right) _{1\leq i\leq
n,\ 1\leq j\leq n}\right) =\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq
n,\ 1\leq j\leq n}$.
(Here, as usual, $\delta_{i,j}$ is the Kronecker delta of $i$ and $j$.)
Proof of Lemma 3. Lemma 3 simply says that the adjugate of the diagonal
matrix with diagonal entries $a_{1},a_{2},\ldots,a_{n}$ is the diagonal matrix
with diagonal entries $b_{1},b_{2},\ldots,b_{n}$. This is easy to check: the
$\left( i,i\right) $-th cofactor of our diagonal matrix is $b_{i}$ (since the
corresponding $\left( n-1\right) \times\left( n-1\right) $-submatrix is
itself diagonal, with diagonal entries $a_{k}$ for $k\neq i$), while every
off-diagonal cofactor is $0$ (since the corresponding submatrix has a zero row).
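Once again, a quick numerical confirmation of Lemma 3 (a plain-Python sketch, with the adjugate computed from cofactors; the values $a_{1},a_{2},a_{3},a_{4}=2,3,5,7$ are arbitrary):

```python
def det(M):
    """Determinant by Laplace expansion along the first row (fine for small n)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(len(M)))

def adj(M):
    """Adjugate: the transpose of the cofactor matrix of M."""
    n = len(M)
    def minor(r, c):
        return [row[:c] + row[c+1:] for k, row in enumerate(M) if k != r]
    return [[(-1) ** (i + j) * det(minor(j, i)) for j in range(n)]
            for i in range(n)]

a = [2, 3, 5, 7]   # arbitrary test values
n = len(a)
D = [[a[i] if i == j else 0 for j in range(n)] for i in range(n)]
b = [1] * n
for i in range(n):
    for k in range(n):
        if k != i:
            b[i] *= a[k]              # b_i = product of all a_k with k != i
expected = [[b[i] if i == j else 0 for j in range(n)] for i in range(n)]
print(adj(D) == expected)  # prints True
```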
Now, we can prove Theorem 1:
Proof of Theorem 1. Let $\mathbf{A}$ be the $n\times n$-matrix $\left(
\left( a_{i}-x\right) \delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}
$. (This $\mathbf{A}$ is the diagonal matrix with diagonal entries
$a_{1}-x,a_{2}-x,\ldots,a_{n}-x$). Thus, $\det\mathbf{A}=\prod\limits_{i=1}
^{n}\left( a_{i}-x\right) $ (since the determinant of a diagonal matrix is
the product of its diagonal entries). But Lemma 3 (applied to $a_{1}
-x,a_{2}-x,\ldots,a_{n}-x$ instead of $a_{1},a_{2},\ldots,a_{n}$) yields that
$\operatorname*{adj}\left( \left( \left( a_{i}-x\right) \delta
_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\left( b_{i}
\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}$.
Since $\left( \left( a_{i}-x\right) \delta_{i,j}\right) _{1\leq i\leq
n,\ 1\leq j\leq n}=\mathbf{A}$, this rewrites as
$\operatorname*{adj}\mathbf{A}=\left( b_{i}\delta_{i,j}\right)
_{1\leq i\leq n,\ 1\leq j\leq n}$.
Let $\mathbf{v}\in\mathbb{K}^{n\times1}$ be the column vector $\left(
1,1,\ldots,1\right) ^{T}$. Then, $\mathbf{vv}^{T}=\left( 1\right) _{1\leq
i\leq n,\ 1\leq j\leq n}$, so that $x\mathbf{vv}^{T}=x\left( 1\right)
_{1\leq i\leq n,\ 1\leq j\leq n}=\left( x\right) _{1\leq i\leq n,\ 1\leq
j\leq n}$. Hence, it is easy to see that $A=\mathbf{A}+x\mathbf{vv}^{T}$.
Recall that $\mathbf{v}=\left( 1,1,\ldots,1\right) ^{T}$. Thus, for every
$n\times n$-matrix $B$, we have
$\mathbf{v}^{T}B\mathbf{v}=\left( \text{the sum of all entries of }B\right)
$.
Applying this to $B=\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq
j\leq n}$, we obtain
$\mathbf{v}^{T}\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq
n}\mathbf{v}$
$=\left( \text{the sum of all entries of }\left( b_{i}\delta_{i,j}\right)
_{1\leq i\leq n,\ 1\leq j\leq n}\right) $
$=\sum\limits_{i=1}^{n}\sum\limits_{j=1}^{n}b_{i}\delta_{i,j}=\sum\limits_{i=1}^{n}b_{i}
\underbrace{\sum\limits_{j=1}^{n}\delta_{i,j}}_{=1}=\sum\limits_{i=1}^{n}b_{i}$.
But from $A=\mathbf{A}+x\mathbf{vv}^{T}$, we obtain
$\det A=\det\left( \mathbf{A}+x\mathbf{vv}^{T}\right) $
$=\underbrace{\det\mathbf{A}}_{=\prod\limits_{i=1}^{n}\left( a_{i}-x\right)
}+\mathbf{v}^{T}\underbrace{\left( \operatorname*{adj}\mathbf{A}\right)
}_{=\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}
}x\mathbf{v}$
(by Lemma 2, applied to $\mathbf{u}=x\mathbf{v}$)
$=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +\mathbf{v}^{T}\left( b_{i}
\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}x\mathbf{v}$
$=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +x\underbrace{\mathbf{v}^{T}\left(
b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\mathbf{v}}
_{=\sum\limits_{i=1}^{n}b_{i}}$
$=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +x\sum\limits_{i=1}^{n}b_{i}$.
This proves Theorem 1.