I'll try to be as brief as possible:
I have a number of events that can happen, say e1, e2, ..., eN. N isn't particularly large. Each event has a probability of "failure". (In practice there will likely be an overall failure rate that gets converted into a weighted probability based on some properties of e1, e2, etc., but that's probably not important here.)
Upon a "failure", you have to start again from the beginning, and you incur a cost c1, c2, ..., cN for failing on e1, e2, ..., eN respectively.
The entire process has a fixed cost for completion, C.
So, say you fail on the second event and then complete the process on the next attempt: your cost would be c2 + C.
If you failed on the third event, then on the first after restarting, and then completed, your cost would be c3 + c1 + C.
And so on...
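In case it helps to see the process concretely, here's a quick Monte Carlo sketch of what I mean (the specific probabilities and costs are just made-up numbers, and I'm assuming independent failures and that you always restart from e1):

```python
import random

# Illustrative numbers only -- the real probabilities would come from the
# overall failure rate and event properties described above.
p = [0.10, 0.05, 0.20]   # failure probability of e1, e2, e3
c = [1.0, 2.5, 4.0]      # cost of failing on e1, e2, e3
C = 10.0                 # fixed cost of completing the whole process

def one_run():
    """One complete run: on a failure at event i, pay c[i] and restart
    from the first event; on getting through all events, pay C."""
    total = 0.0
    i = 0
    while i < len(p):
        if random.random() < p[i]:  # failed on event i
            total += c[i]
            i = 0                   # start from the beginning again
        else:
            i += 1                  # passed, move on to the next event
    return total + C

trials = 100_000
estimate = sum(one_run() for _ in range(trials)) / trials
print(f"simulated expected cost: {estimate:.3f}")
```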
My question is this: what is the closed-form expression for the expected cost of this process (i.e., the long-run average cost over a large number of trials)?
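To make that precise (again assuming independent failures and always restarting from e1): if p_i is the failure probability of e_i and E_i is the expected remaining cost when you're about to attempt e_i, then I believe

E_i = p_i*(c_i + E_1) + (1 - p_i)*E_{i+1}, with E_{N+1} = C,

and what I'm after is a closed form for E_1.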
(I mentioned Craps because you can sort of think of the events as die rolls, with a failure being like crapping out. Perhaps it's similar to asking for the expected number of rolls before a failure. Maybe it's not that similar, but it's the closest analogue I could think of.)
Thanks so much!