I hope this is the right place for help on this question! I expect this should be easy for this audience (and, no, this isn't homework).
I have a task that takes $X$ seconds to complete (say, moving a rock up a hill). However, sometimes the task is restarted from the beginning (say, an earthquake rolls the rock back to the bottom of the hill). I know that these restart events happen on average every $Y$ seconds, with $Y < X$. I need to find out how long it will take, in total, to reach the top of the hill.
To be perfectly clear, suppose I start the task at time $t=0$ and, every time a "restart event" happens, I immediately restart. When I complete the task I record $f$, the final value of $t$. What is the expected value of $f$?
In my attempt to solve this, I model the restarts as a Poisson process with rate $\lambda = 1/Y$ (so the mean time between restart events is $Y$). However, I'm not sure how to proceed from there.
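In case it helps make the setup concrete, here is a minimal Monte Carlo sketch of the process I have in mind (the function name and the example values $X=10$, $Y=3$ are just made up for illustration):

```python
import random

def simulate_completion_time(X, Y, trials=100_000):
    """Estimate E[f]: draw exponential gaps between restart events (mean Y)
    and keep restarting until a gap is long enough to finish the task (X)."""
    total = 0.0
    for _ in range(trials):
        t = 0.0
        while True:
            # Time until the next restart event (Poisson process, rate 1/Y).
            gap = random.expovariate(1.0 / Y)
            if gap >= X:
                # No restart interrupts this attempt: the task finishes X seconds in.
                t += X
                break
            # A restart hits after `gap` seconds; the attempt starts over.
            t += gap
        total += t
    return total / trials

# Example: a 10-second task with restarts every 3 seconds on average.
print(simulate_completion_time(X=10.0, Y=3.0))
```

This seems to give a stable numerical estimate, but I'm after the expected value in closed form.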
P.S. The restart events are random, of course, not on a fixed schedule of one every $Y$ seconds (otherwise, since $Y < X$, I would never be able to reach the top of the hill).