
I'm having trouble computing the following determinant:

$$\left| \begin{array}{cccc} a_1 & x & \cdots & x \\ x & a_2 & \cdots & x \\ \vdots & \vdots & \ddots & \vdots \\ x & x & \cdots & a_n \\ \end{array} \right|$$

I have tried coming up with a recurrence, but there is always one lingering row and column. I also tried finding values of $x$ for which the determinant is $0$, hoping to get a nice polynomial expression, with no luck.

By subtracting the first row from every other row I get a somewhat nicer result, but there is still that one row/column I can't handle. If all other columns are added to the first one and the first row is then subtracted, the first column no longer depends on $x$, but I haven't found that much more useful.

  • Possible duplicate of [How to calculate the following determinants (all ones, minus $I$)](http://math.stackexchange.com/questions/84206/how-to-calculate-the-following-determinants-all-ones-minus-i) – 2017-01-23
  • @amd The diagonal entries are different here. – 2017-01-23
  • Even though the question is different, the answer is similar; however, I cannot pull out a common factor after adding everything to the first row, since the diagonal entries differ (I have $a_i+(n-1)x$ instead of $a+(n-1)x$). – 2017-01-23
  • Your matrix is $D + xk^T k$, where $D$ is the diagonal matrix $\operatorname{diag}\left(a_1-x,a_2-x,\ldots,a_n-x\right)$, and $k$ is the row vector $\left(1,1,\ldots,1\right)$. Now, use the fact so inventively and unambiguously christened the "matrix determinant lemma" ( https://en.wikipedia.org/wiki/Matrix_determinant_lemma#Statement , see the version without the invertibility requirement). – 2017-01-23
  • @darijgrinberg You should give your comment as an answer. – 2017-01-23

1 Answer


As requested, here is the answer I hinted at in my comment:

Theorem 1. Let $n\in\mathbb{N}$. Let $\mathbb{K}$ be a commutative ring. Let $a_{1},a_{2},\ldots,a_{n}$ be $n$ elements of $\mathbb{K}$. Let $x\in\mathbb{K}$. Let $A\in\mathbb{K}^{n\times n}$ be the $n\times n$-matrix $\left( \begin{array} [c]{ccccc} a_{1} & x & x & \cdots & x\\ x & a_{2} & x & \cdots & x\\ x & x & a_{3} & \cdots & x\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ x & x & x & \cdots & a_{n} \end{array} \right) $ (this is the $n\times n$-matrix whose diagonal entries are $a_{1},a_{2},\ldots,a_{n}$, while all its other entries are $x$). For each $i\in\left\{ 1,2,\ldots,n\right\} $, set $b_{i}=\prod\limits_{\substack{k\in \left\{ 1,2,\ldots,n\right\} ;\\k\neq i}}\left( a_{k}-x\right) $. Then,

$\det A=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +x\sum\limits_{i=1}^{n}b_{i}$.
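Before proving this, the formula can be sanity-checked numerically. Here is a minimal sketch with NumPy; the random choice of $a_i$ and $x$ (and all variable names) are my own test setup, not part of the theorem:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
a = rng.standard_normal(n)   # the diagonal entries a_1, ..., a_n
x = rng.standard_normal()    # the common off-diagonal entry

# Build A: a_1, ..., a_n on the diagonal, x everywhere else.
A = np.full((n, n), x)
np.fill_diagonal(A, a)

# b_i = product over k != i of (a_k - x)
b = np.array([np.prod(np.delete(a - x, i)) for i in range(n)])

lhs = np.linalg.det(A)
rhs = np.prod(a - x) + x * b.sum()
assert np.isclose(lhs, rhs)
```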

To prove this, we recall the following fact (known as the matrix determinant lemma, although it would not surprise me if it has several contenders for its rather generic name):

Lemma 2. Let $n\in\mathbb{N}$. Let $\mathbb{K}$ be a commutative ring. Let $\mathbf{A}\in\mathbb{K}^{n\times n}$ be an $n\times n$-matrix. Let $\mathbf{u}\in\mathbb{K}^{n\times1}$ and $\mathbf{v}\in\mathbb{K}^{n\times1}$ be two column vectors. Then,

$\det\left( \mathbf{A}+\mathbf{uv}^{T}\right) =\det\mathbf{A}+\mathbf{v} ^{T}\left( \operatorname*{adj}\mathbf{A}\right) \mathbf{u}$.

(Here and in the following, $\operatorname*{adj}\mathbf{A}$ denotes the adjugate of $\mathbf{A}$.)

See Matrix determinant lemma with adjugate matrix for a proof of Lemma 2.
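Lemma 2 is also easy to check numerically. The sketch below uses the identity $\operatorname{adj}\mathbf{A} = (\det\mathbf{A})\,\mathbf{A}^{-1}$ to obtain the adjugate, which requires $\mathbf{A}$ to be invertible; the lemma itself needs no such hypothesis, but a random real matrix is almost surely invertible, so this suffices as a spot check:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
u = rng.standard_normal((n, 1))  # column vector u
v = rng.standard_normal((n, 1))  # column vector v

det_A = np.linalg.det(A)
adj_A = det_A * np.linalg.inv(A)  # adj A = det(A) * A^{-1} for invertible A

lhs = np.linalg.det(A + u @ v.T)
rhs = det_A + (v.T @ adj_A @ u).item()
assert np.isclose(lhs, rhs)
```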

We shall also use the following simple fact:

Lemma 3. Let $n\in\mathbb{N}$. Let $\mathbb{K}$ be a commutative ring. Let $a_{1},a_{2},\ldots,a_{n}$ be $n$ elements of $\mathbb{K}$. For each $i\in\left\{ 1,2,\ldots,n\right\} $, set $b_{i}=\prod\limits_{\substack{k\in \left\{ 1,2,\ldots,n\right\} ;\\k\neq i}}a_{k}$. Then,

$\operatorname*{adj}\left( \left( a_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}$.

(Here, as usual, $\delta_{i,j}$ is the Kronecker delta of $i$ and $j$.)

Proof of Lemma 3. Lemma 3 simply says that the adjugate of the diagonal matrix with diagonal entries $a_{1},a_{2},\ldots,a_{n}$ is the diagonal matrix with diagonal entries $b_{1},b_{2},\ldots,b_{n}$. This is easy to check (since each $\left( n-1\right) \times\left( n-1\right) $-submatrix of a diagonal matrix either is a diagonal matrix itself, or has a zero row).
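For a symbolic confirmation of Lemma 3, SymPy's `Matrix.adjugate` can be compared against the claimed diagonal matrix of the $b_i$; this sketch checks the case $n = 4$:

```python
import sympy as sp

n = 4
a = sp.symbols(f'a1:{n + 1}')  # a1, a2, a3, a4
D = sp.diag(*a)                # diagonal matrix with entries a_1, ..., a_n

# b_i = product over k != i of a_k
b = [sp.Mul(*[a[k] for k in range(n) if k != i]) for i in range(n)]

# adj(diag(a_1, ..., a_n)) should equal diag(b_1, ..., b_n)
assert sp.simplify(D.adjugate() - sp.diag(*b)) == sp.zeros(n, n)
```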

Now, we can prove Theorem 1:

Proof of Theorem 1. Let $\mathbf{A}$ be the $n\times n$-matrix $\left( \left( a_{i}-x\right) \delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n} $. (This $\mathbf{A}$ is the diagonal matrix with diagonal entries $a_{1}-x,a_{2}-x,\ldots,a_{n}-x$). Thus, $\det\mathbf{A}=\prod\limits_{i=1} ^{n}\left( a_{i}-x\right) $ (since the determinant of a diagonal matrix is the product of its diagonal entries). But Lemma 3 (applied to $a_{1} -x,a_{2}-x,\ldots,a_{n}-x$ instead of $a_{1},a_{2},\ldots,a_{n}$) yields that

$\operatorname*{adj}\left( \left( \left( a_{i}-x\right) \delta _{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\left( b_{i} \delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}$.

Since $\left( \left( a_{i}-x\right) \delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}=\mathbf{A}$, this rewrites as

$\operatorname*{adj}\mathbf{A}=\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}$.

Let $\mathbf{v}\in\mathbb{K}^{n\times1}$ be the column vector $\left( 1,1,\ldots,1\right) ^{T}$. Then, $\mathbf{vv}^{T}=\left( 1\right) _{1\leq i\leq n,\ 1\leq j\leq n}$, so that $x\mathbf{vv}^{T}=x\left( 1\right) _{1\leq i\leq n,\ 1\leq j\leq n}=\left( x\right) _{1\leq i\leq n,\ 1\leq j\leq n}$. Hence, it is easy to see that $A=\mathbf{A}+x\mathbf{vv}^{T}$.

Recall that $\mathbf{v}=\left( 1,1,\ldots,1\right) ^{T}$. Thus, for every $n\times n$-matrix $B$, we have

$\mathbf{v}^{T}B\mathbf{v}=\left( \text{the sum of all entries of }B\right) $.

Applying this to $B=\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}$, we obtain

$\mathbf{v}^{T}\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\mathbf{v}$

$=\left( \text{the sum of all entries of }\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) $

$=\sum\limits_{i=1}^{n}\sum\limits_{j=1}^{n}b_{i}\delta_{i,j}=\sum\limits_{i=1}^{n}b_{i} \underbrace{\sum\limits_{j=1}^{n}\delta_{i,j}}_{=1}=\sum\limits_{i=1}^{n}b_{i}$.

But from $A=\mathbf{A}+x\mathbf{vv}^{T}$, we obtain

$\det A=\det\left( \mathbf{A}+x\mathbf{vv}^{T}\right) $

$=\underbrace{\det\mathbf{A}}_{=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) }+\mathbf{v}^{T}\underbrace{\left( \operatorname*{adj}\mathbf{A}\right) }_{=\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n} }x\mathbf{v}$

(by Lemma 2, applied to $\mathbf{u}=x\mathbf{v}$)

$=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +\mathbf{v}^{T}\left( b_{i} \delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}x\mathbf{v}$

$=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +x\underbrace{\mathbf{v}^{T}\left( b_{i}\delta_{i,j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\mathbf{v}} _{=\sum\limits_{i=1}^{n}b_{i}}$

$=\prod\limits_{i=1}^{n}\left( a_{i}-x\right) +x\sum\limits_{i=1}^{n}b_{i}$.

This proves Theorem 1.
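As a final check, Theorem 1 can be verified fully symbolically for small $n$ by expanding both sides in SymPy (the loop bound is an arbitrary cutoff of mine):

```python
import sympy as sp

x = sp.symbols('x')
for n in range(1, 5):
    a = sp.symbols(f'a1:{n + 1}')
    # A has a_i on the diagonal and x everywhere else
    A = sp.Matrix(n, n, lambda i, j: a[i] if i == j else x)
    # b_i = product over k != i of (a_k - x)
    b = [sp.Mul(*[(a[k] - x) for k in range(n) if k != i]) for i in range(n)]
    formula = sp.Mul(*[(a[i] - x) for i in range(n)]) + x * sum(b)
    assert sp.expand(A.det() - formula) == 0
```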

  • @becko: Your answer uses division, which in general requires a Zariski-density-style argument; I made the conscious choice to avoid that in my post. Other than this, your answer is indeed simpler. – 2018-09-26
  • This answer looks overly complicated. The OP's question can be answered more directly; please see https://math.stackexchange.com/a/2931799/10063. Also (as in your comment), the result can be computed simply by applying the Sherman–Morrison formula. – 2018-09-26
  • @becko: The Sherman–Morrison formula uses division, too :) – 2018-09-26
  • I meant for real matrices. I think your proof is much more ambitious. – 2018-09-26