3

Consider a Markov process on a finite state space $S$, whose dynamics are determined by a certain infinitesimal generator $Q$ (which in this case is simply a matrix) and an initial distribution $m$.

1) Is there anything general that can be said about the spectrum of the matrix $-Q$?

2) Are there any bounds on the eigenvalues of $-Q$? And on their ratios?

I am interested in linear algebra results as well as more probabilistic ones, perhaps using information about the structure of the state space or about the process itself. If it can be of any help: for the problem I am addressing, I usually have quite a big state space, and the $Q$-matrix is quite sparse, but unfortunately without any symmetry or nice structure.

For discrete-time Markov chains there is a huge literature linking properties of the spectrum of the transition matrix $P$ to various mixing times, geometric properties of the state space, etc. Of course one can move from the continuous-time setting to the discrete one via uniformization and thus "translate" most of these results, but I was wondering the following:

3) Are there any bounds on the eigenvalues developed specifically for the continuous-time setting? And, generally speaking, does anyone know any references for such continuous-time theory?

  • 1
For each intensity matrix $Q$ there is a corresponding stochastic matrix $P$ of the underlying discrete-time Markov chain. All the infinite-time-horizon properties of the continuous-time Markov chain can be verified on the underlying discrete-time one, since the intensities do not matter there, only the structure. I'll try to find the formula for obtaining $P$ from $Q$, so you can study $Q$ by studying classical results for $P$. – 2012-02-08
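For reference, one standard way to obtain such a $P$ from $Q$ (assuming every state $i$ has $q_{ii} \neq 0$) is the embedded jump chain:

$$P_{ij} = \frac{q_{ij}}{-q_{ii}} \quad (i \neq j), \qquad P_{ii} = 0.$$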

3 Answers

4

One way to approach the problem is to scale $Q$ (multiply by $1/a$ for some $a > \max_i |q_{ii}|$) so that all entries have absolute value less than $1$. Adding $I$ to this matrix gives a stochastic matrix. Applying the Perron–Frobenius theorem to it, one gets that, when the process is irreducible, $0$ is an eigenvalue of $Q$ of geometric multiplicity $1$ and all other eigenvalues of $Q$ have negative real part. Consequently, the left eigenvector of $Q$ with eigenvalue $0$ (suitably normalized) is the unique stationary distribution.
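A minimal numerical sketch of this scaling argument (the example generator, the choice of $a$, and the tolerances below are illustrative assumptions, not part of the answer):

```python
import numpy as np

# Uniformization sketch: pick a >= max_i |q_ii|, form the stochastic matrix
# P = I + Q/a, and compare the spectra of P and Q.
Q = np.array([[-2.0,  1.0,  1.0],
              [ 0.5, -1.0,  0.5],
              [ 1.0,  2.0, -3.0]])          # example generator: rows sum to 0

a = np.abs(np.diag(Q)).max()                # any a >= max_i |q_ii| works
P = np.eye(Q.shape[0]) + Q / a              # nonnegative entries, rows sum to 1

eig_Q = np.linalg.eigvals(Q)
eig_P = np.linalg.eigvals(P)

print(np.allclose(P.sum(axis=1), 1.0))      # True: P is stochastic
print(np.max(np.abs(eig_P)) <= 1 + 1e-12)   # Perron-Frobenius: spectral radius 1
print(np.isclose(eig_Q, 0).any())           # 0 is an eigenvalue of Q
print((eig_Q.real <= 1e-12).all())          # all eigenvalues of Q have Re <= 0
```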

2

The Gershgorin circle theorem can be used to construct a set of closed discs in the complex plane that are guaranteed to contain all the eigenvalues of the matrix.
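For a generator matrix specifically, the $i$-th row disc is centered at $q_{ii} \le 0$ with radius $\sum_{j \neq i} q_{ij} = -q_{ii}$, so every disc touches the origin and lies in the closed left half-plane; this already gives $\operatorname{Re}(\lambda) \le 0$ and $|\lambda| \le 2\max_i |q_{ii}|$ for every eigenvalue $\lambda$ of $Q$. A small sketch (the example $Q$ below is an illustrative assumption):

```python
import numpy as np

# Gershgorin discs (row version) for a generator matrix: the i-th disc is
# centered at q_ii <= 0 with radius -q_ii, so it touches the origin and lies
# in the closed left half-plane.
Q = np.array([[-1.5,  1.0,  0.5],
              [ 2.0, -2.0,  0.0],
              [ 0.5,  0.5, -1.0]])           # example generator: rows sum to 0

centers = np.diag(Q)                          # q_ii
radii = Q.sum(axis=1) - np.diag(Q)            # sum of off-diagonal row entries = -q_ii
print(list(zip(centers, radii)))              # each pair (c, r) satisfies c = -r

eigs = np.linalg.eigvals(Q)
print((eigs.real <= 1e-12).all())                                  # Re(lambda) <= 0
print((np.abs(eigs) <= 2 * np.abs(centers).max() + 1e-12).all())   # |lambda| <= 2 max|q_ii|
```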

This theorem is not specific to generator matrices, so there may be some other related result that takes advantage of these matrices' specific structure... I'm looking into this question myself. I'll post an update if I find something better.

0

Here are some results from linear algebra. This is not a complete answer to your question, but it at least covers part of it.

Kolmogorov's forward and backward equations read
$$\frac{d}{dt}P(t) = P(t)Q = QP(t), \qquad P(0) = I,$$
where $P(t)$ is the transition probability matrix of your Markov process. The solution to this equation is
$$P(t) = e^{tQ}.$$
If $Q$ is diagonalizable, say $Q = V\tilde{\Lambda}V^{-1}$ with $\tilde{\Lambda}$ diagonal, then
$$P(t) = e^{tQ} = e^{tV\tilde{\Lambda}V^{-1}} = V e^{t\tilde{\Lambda}} V^{-1},$$
so the eigenvalues of $P(t)$ are $e^{\lambda_i t}$, where the $\lambda_i$ are the eigenvalues of $Q$. Hence, if you know something about the eigenvalues of $P(t)$, you can translate it into information about the eigenvalues of $Q$.
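A quick numerical check of this spectral-mapping relation (the example $Q$ and the value of $t$ are illustrative assumptions; `scipy.linalg.expm` computes the matrix exponential):

```python
import numpy as np
from scipy.linalg import expm

# Check that the eigenvalues of P(t) = expm(t*Q) are exp(t * lambda_i),
# where lambda_i are the eigenvalues of Q.
Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.0, -2.0,  2.0],
              [ 3.0,  0.0, -3.0]])   # example generator: rows sum to 0
t = 0.7

lam = np.linalg.eigvals(Q)           # eigenvalues of Q
mu = np.linalg.eigvals(expm(t * Q))  # eigenvalues of P(t)

# Each eigenvalue of P(t) should match exp(t * lambda_i) for some i.
print(all(np.min(np.abs(np.exp(t * lam) - m)) < 1e-8 for m in mu))
```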