
Given a vector $\mathbf{x} \in \mathbb{R}^N$, let's define:

$$\text{diag}(\mathbf{x}) = \begin{pmatrix} x_1 & 0 & \ldots & 0 \\ 0 & x_2 & \ldots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \ldots & x_N \end{pmatrix}.$$

Moreover, let

$$\mathbf{1}= \begin{pmatrix} 1 & 1 & \ldots & 1 \\ 1 & 1 & \ldots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \ldots & 1 \end{pmatrix}.$$

Here is my question:

When is the matrix $\mathbf{M} = \text{diag}(\mathbf{x}) + \mathbf{1}$ invertible?

I was able to find some results in the special case $x_1 = x_2 = \ldots = x_N = x$. Indeed, the matrix $\mathbf{M}$ is singular when:

  1. $x=0$. This is trivial, since then $\mathbf{M} = \mathbf{1}$, which has rank one.
  2. $x=-N$. In this case, if you sum up all the rows (or columns) of $\mathbf{M}$, you get the zero vector.
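As a quick sanity check of these two singular values (a throwaway script; `det` and `M_of` are ad-hoc helper names I made up, and the brute-force Leibniz determinant is only meant for small $N$):

```python
from itertools import permutations

def det(M):
    """Exact determinant via the Leibniz permutation expansion
    (only sensible for small matrices)."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        # sign of the permutation = (-1)^(number of inversions)
        inv = sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
        term = (-1) ** inv
        for i in range(n):
            term *= M[i][p[i]]
        total += term
    return total

def M_of(xs):
    """The matrix diag(x) + 1 (all-ones), as a list of lists."""
    n = len(xs)
    return [[xs[i] + 1 if i == j else 1 for j in range(n)] for i in range(n)]

N = 5
print(det(M_of([0] * N)))   # x = 0  -> 0 (singular)
print(det(M_of([-N] * N)))  # x = -N -> 0 (singular)
print(det(M_of([2] * N)))   # x = 2  -> 112 (invertible)
```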

What can I say in the general case, when $\mathbf{x}$ is a generic vector?

  • Weyl's inequalities give you a nice sufficient condition for invertibility.
  • I believe the matrix is invertible for $x_i = -1$, $N \neq 1$.
  • If none of the $x_i$ are zero, you can apply the analysis of [this previous Question](http://math.stackexchange.com/questions/219731/determinant-of-rank-one-perturbations-of-invertible-matrices).
  • @Omnomnomnom thanks a lot. Anyway, reading [Wikipedia's entry on Weyl's inequality](https://en.wikipedia.org/wiki/Weyl's_inequality#Weyl.27s_inequality_in_matrix_theory), I don't understand the second part of the inequality $$j + k - n \geq i \geq r + s - 1, \ldots, n.$$ Is it like saying $$j + k - n \geq i \geq r + s - t,$$ where $t = 1, \ldots, n$, right?

2 Answers


We can easily compute the determinant of the sum $\operatorname{diag}(\mathbf{x}) + \mathbf{1}$ and check invertibility that way.

If all the diagonal entries $x_i$ are nonzero, we can apply the matrix determinant lemma for a rank one update to an invertible matrix:

$$ \det(A+uv^T) = (1 + v^T A^{-1} u) \det(A) $$

When $A$ is the matrix $\operatorname{diag}(\mathbf{x})$ and $u,v$ are vectors of all ones, this says the matrix sum is invertible unless the sum of the reciprocals $x_i^{-1}$ is $-1$.

If one of the diagonal entries is zero, say $x_1$ without loss of generality, then elementary row operations quickly show that the determinant of $\operatorname{diag}(\mathbf{x}) + \mathbf{1}$ is $\prod_{k=2}^n x_k$.
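Both cases can be checked exactly with a throwaway script (the helper names `det` and `M_of` are mine; the determinant is brute-forced via the Leibniz expansion over exact rationals, so keep $n$ small):

```python
from fractions import Fraction
from itertools import permutations

def det(M):
    """Exact determinant via the Leibniz permutation expansion."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        inv = sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
        term = (-1) ** inv
        for i in range(n):
            term *= M[i][p[i]]
        total += term
    return total

def M_of(xs):
    """The matrix diag(x) + ones, with exact rational entries."""
    n = len(xs)
    return [[xs[i] + 1 if i == j else Fraction(1) for j in range(n)]
            for i in range(n)]

# Nonzero entries: the lemma predicts det = (1 + sum 1/x_i) * prod x_i.
xs = [Fraction(v) for v in (2, -3, 5, 7)]
lemma = (1 + sum(1 / x for x in xs)) * Fraction(2 * -3 * 5 * 7)
print(det(M_of(xs)) == lemma)   # True

# Reciprocals summing to -1 (here -1/2 - 1/2 + 1/3 - 1/3) give a singular matrix.
print(det(M_of([Fraction(v) for v in (-2, -2, 3, -3)])))  # 0

# One zero entry: det = product of the remaining entries (2 * 3 * 4).
print(det(M_of([Fraction(v) for v in (0, 2, 3, 4)])))     # 24
```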

  • Notice that if there are two $x_i$'s equal to $0$ then there are two equal columns and the determinant is zero. I believe you meant $\det(\operatorname{diag}(\mathbf{x})+\mathbf{1}) = \prod_{k=2}^n (x_k-1)$ in the final part.
  • You don't need $x$ to be entrywise nonzero if you use the identity $\det(A+uv^T)=\det(A)+v^T\operatorname{adj}(A)u$ instead.
  • @Daniel: Just so for having two $x_i=0$. But we add one and then subtract one from the "diagonal" entries $x_i$, so I think it's right.
  • @hardmath thanks a lot, anyway for $N=4$ and $x_1 = 0$, the determinant is $x_2 x_3 x_4$...
  • @hardmath You are right.
  • @hardmath sorry, I wrote before your correction.
  • If it were up to me, I'd call the matrix of all ones $\mathbf{J}$. @the_candyman: My bad. I'll clear it up further.
  • @hardmath, then, if every $x_i \neq 0$, the determinant of $M$ is $$\left(1 + \sum_{i} x_i^{-1}\right)\prod_i x_i?$$
  • @hardmath, while if there is only one $x_j = 0$, then it is $\prod_{i \neq j} x_i$, and for more than one $x_i = 0$, we have $0$?
  • @the_candyman: Right.
  • @hardmath. It sounds great! Thanks a lot!

I think your matrix is nonsingular iff $$1 + \frac{1}{x_1} + \frac{1}{x_2} + \ldots + \frac{1}{x_n} \ne 0.$$

I will look at the case $n = 4$ and consider the matrix $A = D + jj^\top$, where $D$ is the diagonal matrix with entries $d_1, d_2, d_3, d_4$ and $j = (1,1,1,1)^\top$.

Suppose $\lambda, x$ is an eigenvalue-eigenvector pair. Then we have $$d_1x_1 + x_1 + x_2 + x_3 + x_4 = \lambda x_1, \quad \ldots, \quad x_1 + x_2 + x_3 + x_4 + d_4x_4 = \lambda x_4.$$ Solving for the components (assuming $\lambda \neq d_i$), we find that $$x_1 = \frac{1}{\lambda - d_1}(x_1+x_2+x_3+x_4), \quad \ldots, \quad x_4 = \frac{1}{\lambda - d_4}(x_1+x_2+x_3+x_4). \tag 1$$ Adding the four equations in (1) and dividing by the common sum $x_1+x_2+x_3+x_4$ (assuming it is nonzero), you find that the characteristic equation of $A$ is $$1 = \frac{1}{\lambda - d_1} + \frac{1}{\lambda - d_2} + \frac{1}{\lambda - d_3} + \frac{1}{\lambda - d_4}.$$

Therefore, taking $\lambda = 0$, the matrix $D + jj^\top$ is singular iff $$\frac{1}{d_1} + \frac{1}{d_2} + \frac{1}{d_3} + \frac{1}{d_4} + 1 = 0.$$
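The identity behind this characteristic equation, $\det(A - \lambda I) = \prod_i (d_i - \lambda)\bigl(1 + \sum_i \frac{1}{d_i - \lambda}\bigr)$, can be cross-checked exactly for a sample $D$ (a throwaway script with ad-hoc names; the `det` helper brute-forces the Leibniz expansion over exact rationals):

```python
from fractions import Fraction
from itertools import permutations

def det(M):
    """Exact determinant via the Leibniz permutation expansion."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        inv = sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
        term = (-1) ** inv
        for i in range(n):
            term *= M[i][p[i]]
        total += term
    return total

d = [Fraction(v) for v in (1, 2, 3, 5)]

for lam in (Fraction(0), Fraction(7, 2), Fraction(-4)):  # lam != every d_i
    # A - lam*I, where A = D + j j^T
    A = [[d[i] + 1 - lam if i == j else Fraction(1) for j in range(4)]
         for i in range(4)]
    prod = Fraction(1)
    for di in d:
        prod *= di - lam
    identity = prod * (1 + sum(1 / (di - lam) for di in d))
    print(det(A) == identity)  # True for every sample lam

# In particular, lam = 0 gives det(A) = (prod d_i) * (1 + sum 1/d_i) != 0,
# so D + j j^T is invertible for this particular d.
```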

  • Thanks a lot for this! It is helping me to better understand the problem!