(This is a differently formulated version of the question. There were no answers, comments or votes on the first version so I thought I'd give it another shot.)
Suppose a server processes jobs that take a random amount of time to process. Let $B$ denote the random service time, and suppose that $B$ is exponentially distributed with parameter $\mu$, so that $f_{B}(t) = \mu e^{-\mu t}$ and $P(B < t) = 1 - e^{-\mu t}$.
Now we want to consider the possibility that the server fails to process a job and has to start over. In particular, we consider two cases:
(1) the breakdown occurs after a fixed amount of time;
(2) the breakdown occurs after a random amount of time, according to some probability density function.
Let $B'$ denote the time that passes until the processing fails, and let $p$ be the probability that a job fails to be processed and thus causes the server to start over. I want to know, in both cases, the distribution of the time it takes for a job to be processed.
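For instance, in case (1), where $B'$ is just a fixed constant, my guess (assuming every restart draws a fresh service time, which is an assumption on my part) would be that
$$p = P(B > B') = e^{-\mu B'},$$
but I'm not sure this is the intended relation between $p$ and $B'$.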
I have made attempts at both cases but I'm not sure I'm going about it the right way.
For the first case, suppose the server restarts $N$ times (a random number). Then I'm interested in $P(NB' + B \leq t)$, which I could compute using the law of total probability, i.e.,
\begin{eqnarray*}
P(NB' + B \leq t) &=& \sum_{k=1}^{\infty} P(NB' + B \leq t \mid N=k)\,P(\text{the service fails } k \text{ times}) \\
&=& \sum_{k=1}^{\infty} P(NB' + B \leq t \mid N=k)\,p^k \\
&=& \sum_{k=1}^{\infty} P(B \leq t - kB')\,p^k.
\end{eqnarray*}
Is this correct, or is it nonsense?
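To sanity-check whether my expression could even be right, I put together a small Monte Carlo sketch of the model as I understand it. Everything here is my own assumption: the breakdown time is a fixed constant `T`, each restart draws a fresh $\text{Exp}(\mu)$ service time, and the function names and parameters are made up for illustration.

```python
import random

def total_service_time_fixed_breakdown(mu, T, rng):
    """Case (1): the breakdown always occurs after a fixed time T.
    Each attempt draws a fresh Exp(mu) service time; if it exceeds T,
    the attempt fails after T time units and the job starts over."""
    total = 0.0
    while True:
        b = rng.expovariate(mu)
        if b <= T:               # this attempt finishes before the breakdown
            return total + b
        total += T               # failed attempt: T time units are lost

def estimate_cdf(sampler, t, n=100_000, seed=1):
    """Monte Carlo estimate of P(total processing time <= t)."""
    rng = random.Random(seed)
    return sum(sampler(rng) <= t for _ in range(n)) / n

# Example: mu = 1, fixed breakdown time T = 0.5, CDF evaluated at t = 2.
print(estimate_cdf(lambda rng: total_service_time_fixed_breakdown(1.0, 0.5, rng), 2.0))
```

My idea was to compare the empirical CDF this produces against whatever closed form the sum above gives.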
As for the second case, I'd suppose that we have i.i.d. random variables $B'_1, B'_2, \ldots, B'_N$ which denote the times until breakdown, so then I'd guess that we would want to know
$$P\Bigl(B + \sum_{i=1}^{N} B'_i \leq t\Bigr) = \sum_{k=1}^{\infty} P\Bigl(B + \sum_{i=1}^{k} B'_i \leq t \,\Big|\, N=k\Bigr)\, P(\text{the service fails } k \text{ times}).$$
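Continuing the sketch above, the only change for this case, as far as I can tell, is that each attempt also draws a fresh breakdown time $B'_i$; I take it to be exponential with rate $\nu$ purely for concreteness, since the problem only says "some probability density function".

```python
def total_service_time_random_breakdown(mu, nu, rng):
    """Case (2): each attempt draws a fresh service time B ~ Exp(mu) and a
    fresh breakdown time B'_i ~ Exp(nu); the Exp(nu) choice is my own.
    The job finishes on the first attempt whose service draw beats its
    breakdown draw; every failed attempt costs its B'_i."""
    total = 0.0
    while True:
        b = rng.expovariate(mu)     # fresh service requirement
        bp = rng.expovariate(nu)    # fresh breakdown time B'_i
        if b <= bp:
            return total + b        # success: add the actual service time
        total += bp                 # failure: the attempt lasted B'_i
```

So, for example, `estimate_cdf(lambda rng: total_service_time_random_breakdown(1.0, 2.0, rng), 2.0)` would estimate the distribution I'm after for this case.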
But I have the feeling that the expressions above aren't quite what the answer should be. Any comments on my interpretation of the question and on my suggested way of solving this problem are most appreciated.