We know that an event is going to happen within $x$ amount of time, but the exact time at which it occurs is random within that window. (Say we ran the experiment a bunch of times and each run the event happened after $x-y$ time, with the $y$s being uniformly distributed reals from $0$ to $x$.)
It is easy to see that the probability that the event has already occurred is $t/x$, with $t$ being the time that has elapsed and $x$ being the total amount of time. It's also easy to see that the probability of it occurring in the next $n$ amount of time, given that it hasn't occurred yet, is $n/(x-t)$ (with $t$ again being the amount of time elapsed).
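As a quick sanity check on those two formulas (not essential to the question), here is a small Monte Carlo sketch of the setup; the event time is drawn uniformly from $[0, x]$, and the particular values $x = 10$, $t = 4$, $n = 0.5$ are arbitrary choices of mine:

```python
import random

# Arbitrary illustrative values (not part of the setup above).
x, t, n = 10.0, 4.0, 0.5
trials = 1_000_000

# The event time is uniform on [0, x].
samples = [random.uniform(0, x) for _ in range(trials)]

# P(event has occurred by time t) -- should be close to t/x = 0.4.
p_by_t = sum(s <= t for s in samples) / trials

# P(event occurs within the next n, given it hasn't occurred by t)
# -- should be close to n/(x - t) = 0.0833...
not_yet = [s for s in samples if s > t]
p_next_n = sum(s <= t + n for s in not_yet) / len(not_yet)

print(f"P(occurred by t):       {p_by_t:.4f}  vs  t/x     = {t / x:.4f}")
print(f"P(in next n | not yet): {p_next_n:.4f}  vs  n/(x-t) = {n / (x - t):.4f}")
```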
As $n$ gets very small, the probability of it happening in the next $n$ amount of time also clearly becomes very small. This can be expressed by realizing that:
$\lim_{n \to 0}\frac{n}{x-t}=0$
But now, suppose you take that limit and integrate it over some range of $t$, say from $0$ to $x-a$ for some $a$ with $0 < a < x$:
$\int_{0}^{x-a} \lim_{n \to 0} \frac{n}{x-t} dt$
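Writing the limit-first evaluation out explicitly (my own step, just to make the issue concrete): once the limit is taken, the integrand is $0$ for every $t$ in $[0, x-a]$, so

$\int_{0}^{x-a} \lim_{n \to 0} \frac{n}{x-t}\, dt = \int_{0}^{x-a} 0\, dt = 0$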
So resolving the limit first clearly doesn't yield anything useful. Does that mean that the integral I have constructed does not represent what I think it does?
--
I considered using arc length in the integral but stopped myself upon realizing that we are concerned with the values the function outputs, not the length of the curve.