
I am facing some issues with modelling a Markov chain.

An online shop has noticed the following buying behavior among its customers. There are three products, x1, x2, and x3.

  • If a customer buys product x1, they will buy it again with probability 1.
  • If a customer buys product x2, they will buy it again with probability 1/2.
  • If a customer buys product x3, they will buy it again with probability 1/3.
  • Otherwise, the customer chooses one of the other two products uniformly at random.

As far as I can tell, the Markov chain has the states x1, x2, and x3, each with a self-loop of the given probability 1, 1/2, or 1/3. But I don't know how to determine the transition probabilities between the three states. Can someone help me?

Thank you

1 Answer


For state $x_2$ it is said that it loops with probability $1/2$ and that, if it does not loop, it goes to either of the other two states with equal probability. The remaining probability $1/2$ is therefore split evenly: it goes to $x_1$ with probability $1/4$ and to $x_3$ with probability $1/4$.

For state $x_3$ it is said that it loops with probability $1/3$ and that, if it does not loop, it goes to either of the other two states with equal probability. The remaining probability $2/3$ is therefore split evenly: it goes to $x_1$ with probability $1/3$ and to $x_2$ with probability $1/3$.

Hence the transition matrix is

$$P = \begin{pmatrix} 1 & 0 & 0 \\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \end{pmatrix}$$

where row $i$ lists the probabilities of moving from state $x_i$ to states $x_1$, $x_2$, $x_3$.
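If you want to sanity-check the matrix numerically, here is a small sketch in Python/NumPy (the variable names are my own, not from the problem): each row must sum to 1, and raising $P$ to a large power shows the long-run behavior.

```python
import numpy as np

# Transition matrix: row = current product, column = next product (x1, x2, x3)
P = np.array([
    [1,   0,   0  ],   # x1 is absorbing: bought again with probability 1
    [1/4, 1/2, 1/4],   # x2: loop 1/2, remaining 1/2 split evenly
    [1/3, 1/3, 1/3],   # x3: loop 1/3, remaining 2/3 split evenly
])

# A valid stochastic matrix needs every row to sum to 1
assert np.allclose(P.sum(axis=1), 1)

# P^n gives the n-step transition probabilities; for large n every row
# approaches [1, 0, 0], i.e. every customer eventually sticks with x1
P_limit = np.linalg.matrix_power(P, 100)
print(P_limit.round(4))
```

This also confirms the observation in the comment below: since x1 is absorbing and reachable from the other two states, the chain ends up in x1 with probability 1.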

  • So eventually, the site will have to stock only product x1. An absorbing product! (2017-02-18)