For positive semi-definite matrices $A$ and $B$ with real entries, let $X = I - (2\operatorname{Diag}(A) - B)^{-1}(A - B)$.
The spectral radius satisfies $\rho(X) \leq \|X\|$ for any induced matrix norm.
As $(2\operatorname{Diag}(A)-B)$ becomes a better approximation of $(A-B)$, the product $(2\operatorname{Diag}(A)-B)^{-1}(A-B)$ approaches $I$, so $\rho(X)$ approaches zero.
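For concreteness, here is a minimal NumPy sketch (my own toy illustration: random PSD stand-ins for $A$ and $B$ rather than the matrices from my algorithm, and taking $\operatorname{Diag}(A)$ to mean the diagonal part of $A$) that forms $X$ and compares $\rho(X)$ with the spectral norm $\|X\|_2$:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6

    # Random symmetric PSD matrices as stand-ins for A and B.
    GA = rng.standard_normal((n, n))
    GB = rng.standard_normal((n, n))
    A = GA @ GA.T
    B = GB @ GB.T

    M = 2 * np.diag(np.diag(A)) - B            # 2 Diag(A) - B
    X = np.eye(n) - np.linalg.solve(M, A - B)  # X = I - M^{-1}(A - B)

    rho = max(abs(np.linalg.eigvals(X)))       # spectral radius of X
    norm2 = np.linalg.norm(X, 2)               # spectral norm ||X||_2
    print(rho, norm2, rho <= norm2 + 1e-12)    # the bound rho(X) <= ||X||_2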
Question: Under what conditions on $A$ and $B$ is $(2\operatorname{Diag}(A)-B)$ diagonally dominant?
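Numerically I can test individual instances with a small helper like the one below (again just an illustration, not part of the algorithm), but I am after general conditions:

    import numpy as np

    def is_diag_dominant(M, strict=False):
        """Row-wise diagonal dominance: |m_ii| >= sum_{j != i} |m_ij| for
        every row (strict inequality if strict=True)."""
        d = np.abs(np.diag(M))
        off = np.abs(M).sum(axis=1) - d
        return bool(np.all(d > off)) if strict else bool(np.all(d >= off))

For example, `is_diag_dominant(2 * np.diag(np.diag(A)) - B)` checks the matrix $M$ from the sketch above.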
Background of the problem:
I was working on computing the root-convergence rate of an iterative optimization sequence and ended up characterizing it in terms of $\rho(X)$. I am looking for starting directions on how to compute or bound $\rho(X)$ in order to say something about the convergence of the algorithm. Any starter directions or references on how I can go about this would be appreciated.