
I am fairly new to matrices, especially stochastic matrices. To get more comfortable with them I am working out some problems. One that is giving me a hard time asks me to calculate the characteristic row vector (left eigenvector) associated with eigenvalue $1$ of the following matrix:

$$\begin{bmatrix} .2 & .6 & .2\\ .5 & 0 & .5\\ .25& .5 &.25 \end{bmatrix}$$

I am not quite sure how to start this problem. I have tried row-reducing both the matrix and its transpose, but neither attempt has proved fruitful so far.

  • Stochastic matrices are usually defined so that entries in columns sum to $1$, not entries in rows. (2012-03-08)
  • The definition of stochastic matrix that I was given is a matrix with all elements non-negative and each row summing to $1$, with the property that for some power of the matrix all elements are positive. (2012-03-08)
  • OK, then in that case you are right-multiplying matrices by row vectors. It's more common to left-multiply matrices by column vectors. (2012-03-08)
  • Well, if it wanted the characteristic column vector then left-multiplying would be used. Can you point me in the right direction as to how to start this? (2012-03-08)
  • The point of having the entries in a column sum to $1$ (i.e., probability vectors for columns) is that if you are left-multiplying matrices by vectors (as is more often the case), then a stochastic matrix times a probability vector works out to still be a probability vector. If you have rows with entries adding to one, the same ideas work, but now you must right-multiply the matrix by a transposed vector. (2012-03-08)
  • @alex.jordan, my experience is that mathematicians have the columns sum to one while Econ people and perhaps statisticians have the rows sum to one. Whenever I teach Markov chains to actuarial students I have to warn them that the way I do it differs from the way they'll see in their other classes. (2012-03-08)
  • @Gerry Thanks, that's good to know. (2012-03-08)
  • Here's a nice tutorial [in PDF](http://www.scss.tcd.ie/Rozenn.Dahyot/CS1BA1/SolutionEigen.pdf) on how to compute *right* eigenvalues and eigenvectors. In your case, you're computing the *left* eigenvector corresponding to the left eigenvalue $\lambda = 1$. (2012-03-08)
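The observation in the last comment can be checked numerically: a left eigenvector of $P$ for eigenvalue $1$ is just a right eigenvector of $P^{\mathsf T}$. A quick sanity check in NumPy (this is only a numerical check, not the by-hand computation the exercise presumably wants) might look like:

```python
import numpy as np

# The row-stochastic matrix from the question (each row sums to 1).
P = np.array([[0.20, 0.60, 0.20],
              [0.50, 0.00, 0.50],
              [0.25, 0.50, 0.25]])

# A left eigenvector v with vP = v is a right eigenvector of P.T
# with eigenvalue 1, so examine the spectrum of the transpose.
vals, vecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(vals - 1.0))   # index of the eigenvalue closest to 1
v = np.real(vecs[:, k])
v = v / v.sum()                     # normalize so the entries sum to 1

print(v)       # the stationary (probability) row vector
print(v @ P)   # should reproduce v
```

Normalizing so the entries sum to $1$ gives the stationary distribution $\left(\tfrac{10}{31}, \tfrac{11}{31}, \tfrac{10}{31}\right)$, and the same answer falls out of solving the linear system $vP = v$ together with $v_1 + v_2 + v_3 = 1$ by hand.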

1 Answer