First consider a single renewal process, with i.i.d. interarrival times whose common distribution $F$ is integrable. Asymptotically, the interval straddling the present time, that is, the largest interval around the present time containing no arrival, is distributed like the size-biased transform of $F$ (more on this below), and the present time is uniformly distributed within this interval.
To be precise, assume that $X$ is a random variable with distribution $F$. Then the size-biased transform of $F$ is the distribution of any random variable $\hat X$ such that $ E(h(\hat X))=\frac{E(Xh(X))}{E(X)}, $ for every bounded measurable function $h$. And the time elapsed since the last arrival is asymptotically distributed like $U\hat X$, where $U$ is uniform on $(0,1)$ and independent of $\hat X$.
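Not part of the original argument, but here is a minimal simulation sketch one can use to see these two asymptotic claims numerically; the Gamma(2, 1) interarrival distribution is an arbitrary choice made only for illustration.

```
import numpy as np

rng = np.random.default_rng(0)
T, n_runs = 1_000.0, 20_000          # "present time" T, taken large
shape, scale = 2.0, 1.0              # interarrival law: Gamma(2, 1), arbitrary choice
spanning, ages = np.empty(n_runs), np.empty(n_runs)

for i in range(n_runs):
    # Enough interarrival times to be (essentially) sure of passing time T.
    x = rng.gamma(shape, scale, size=2 * int(T / (shape * scale)) + 100)
    s = np.cumsum(x)                              # arrival times
    k = np.searchsorted(s, T, side="right")       # index of first arrival after T
    spanning[i] = x[k]                            # interval straddling T
    ages[i] = T - (s[k - 1] if k > 0 else 0.0)    # time elapsed since last arrival

EX, EX2 = shape * scale, shape * (shape + 1) * scale ** 2
print("mean spanning interval:", spanning.mean(), "vs E(X^2)/E(X) =", EX2 / EX)
print("mean age / interval   :", (ages / spanning).mean(), "vs 1/2 (age = U * interval)")
```

The first printed pair checks that the straddling interval has the size-biased mean $E(X^2)/E(X)$, the second that the present time sits at a uniform fraction of that interval.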
Now, consider two independent renewal processes, with respective integrable interarrival distributions $F$ and $G$. The asymptotic probability that the last event before the present time corresponds to the $F$ renewal process is $ p_F=P(U\hat X\le V\hat Y), $ where $U$, $\hat X$, $V$ and $\hat Y$ are independent, $U$ and $V$ are uniform on $(0,1)$, the distribution of $\hat X$ is the size-biased transform of $F$ and the distribution of $\hat Y$ is the size-biased transform of $G$. Thus, $ p_F=\frac{E(XY;UX\le VY)}{E(X)E(Y)}, $ where $X$, $Y$, $U$ and $V$ are independent, $U$ and $V$ are uniform on $(0,1)$, the distribution of $X$ is $F$ and the distribution of $Y$ is $G$.
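A quick numerical cross-check of this identity (again not part of the original argument, with arbitrarily chosen Gamma distributions standing in for $F$ and $G$): estimate the right-hand side by plain Monte Carlo, and compare with a direct simulation of which of the two renewal processes produced the most recent arrival before a large time $T$.

```
import numpy as np

rng = np.random.default_rng(1)
# Stand-ins for F and G, chosen arbitrarily: Gamma(2, 1) and Gamma(3, 0.5).
sample_F = lambda size: rng.gamma(2.0, 1.0, size)
sample_G = lambda size: rng.gamma(3.0, 0.5, size)
EX, EY = 2.0 * 1.0, 3.0 * 0.5

# Monte Carlo estimate of E(XY; UX <= VY) / (E(X) E(Y)).
n = 1_000_000
X, Y, U, V = sample_F(n), sample_G(n), rng.random(n), rng.random(n)
p_formula = np.mean(X * Y * (U * X <= V * Y)) / (EX * EY)

# Direct check: which process has the smaller age (time since its last arrival) at T?
def age_at(T, sampler):
    s = np.cumsum(sampler(2 * int(T) + 100))          # arrival times, safely past T
    k = np.searchsorted(s, T, side="right")
    return T - (s[k - 1] if k > 0 else 0.0)

T, n_runs = 500.0, 20_000
wins = sum(age_at(T, sample_F) < age_at(T, sample_G) for _ in range(n_runs))
print("formula:", p_formula, "  direct simulation:", wins / n_runs)
```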
Our next task is to get rid of $U$ and $V$. To condition on $(X,Y)$, one needs to compute $ P(Ux\le Vy) $ for fixed positive $x$ and $y$; an elementary computation gives $ P(Ux\le Vy)=1-\frac{x}{2y} $ if $x\le y$, and $ P(Ux\le Vy)=\frac{y}{2x} $ if $y\le x$. Unless I made a mistake, this yields
$ p_F=\frac{E(Y^2;Y\le X)+E(2XY;X\le Y)-E(X^2;X\le Y)}{2E(X)E(Y)}. $
Note that this is only one among several algebraically equivalent formulas for $p_F$.
All the expectations in the formula for $p_F$ are integrals involving $F$, $G$ and the respective densities $f$ and $g$. For example, $ E(Y^2;Y\le X)=\int_0^{+\infty}y^2\,(1-F(y))\,g(y)\,\mathrm{d}y. $ (A small numerical-quadrature sketch of these integrals is given after the checks below.)
Post hoc checks: Here are some properties of the result above, which do hold, and which ought to hold for the formula to make sense.
(1) One has $p_F+p_G=1$, where $p_G$ is the result one gets by interchanging the roles of $X$ and $Y$. Moreover $p_F$ is obviously positive, and so is $p_G$ for the same reason, hence $p_F$ lies in $(0,1)$.
(2) If $F=G$, everything cancels out in the numerator except the $E(2XY;X\le Y)$ term, hence $p_F=\frac12$.
(3) If $X$ is exponential with parameter $a$ and $Y$ is exponential with parameter $b$, $p_F=a/(a+b)$.
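For what it's worth, the formula and checks (1) and (2) can also be verified numerically, by evaluating each expectation as an integral via quadrature. This sketch is my own addition; the Gamma distributions are arbitrary test cases.

```
import numpy as np
from scipy import integrate, stats

def p_F(F, G):
    """Evaluate the formula for p_F by numerical quadrature.
    F and G are frozen scipy.stats distributions on (0, inf) with densities."""
    EX, EY = F.mean(), G.mean()
    # E(Y^2; Y <= X) = int y^2 P(X >= y) g(y) dy, and symmetrically for X.
    E_Y2 = integrate.quad(lambda y: y**2 * F.sf(y) * G.pdf(y), 0, np.inf)[0]
    E_X2 = integrate.quad(lambda x: x**2 * G.sf(x) * F.pdf(x), 0, np.inf)[0]
    # E(XY; X <= Y) = int_0^inf ( int_0^y x f(x) dx ) y g(y) dy, nested quadrature.
    E_XY = integrate.quad(
        lambda y: y * G.pdf(y) * integrate.quad(lambda x: x * F.pdf(x), 0, y)[0],
        0, np.inf)[0]
    return (E_Y2 + 2 * E_XY - E_X2) / (2 * EX * EY)

F = stats.gamma(a=2.0, scale=1.0)             # arbitrary test distributions
G = stats.gamma(a=3.0, scale=0.5)
print("p_F + p_G =", p_F(F, G) + p_F(G, F))   # check (1): should be 1
print("F = G case:", p_F(F, F))               # check (2): should be 1/2
```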
Edit: Regarding check (3) above, for exponential distributions with respective parameters $a$ and $b$, one can compute everything in the formula giving $p_F$ as a function of $a$ and $b$, starting with $E(X)=1/a$, $E(Y)=1/b$, $f(x)=a\mathrm{e}^{-ax}$, $F(x)=1-\mathrm{e}^{-ax}$, $g(y)=b\mathrm{e}^{-by}$, $G(y)=1-\mathrm{e}^{-by}$.
Furthermore, $E(Y^2;Y\le X)=\int_0^{+\infty}y^2\,\mathrm{e}^{-ay}\,b\mathrm{e}^{-by}\,\mathrm{d}y$, hence $ E(Y^2;Y\le X)=\frac{2b}{(a+b)^3}; $ by symmetry, $E(X^2;X\le Y)=\frac{2a}{(a+b)^3}$; and finally, $ E(XY;X\le Y)=\int_0^{+\infty}\left(\int_0^{y}x\,a\mathrm{e}^{-ax}\,\mathrm{d}x\right)y\,b\mathrm{e}^{-by}\,\mathrm{d}y, $ hence $E(XY;X\le Y)=\frac{1}{ab}-\frac{2b}{(a+b)^3}-\frac{b}{a(a+b)^2}$. Simplifying everything yields the value $p_F=a/(a+b)$ given above.
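As a sanity check on this algebra (mine, not part of the original computation), one can plug these closed forms into the formula for arbitrary numerical values of $a$ and $b$:

```
# Numerical sanity check of the exponential case; the values of a and b are arbitrary.
a, b = 1.7, 0.4
E_Y2 = 2 * b / (a + b) ** 3                                         # E(Y^2; Y <= X)
E_X2 = 2 * a / (a + b) ** 3                                         # E(X^2; X <= Y)
E_XY = 1 / (a * b) - 2 * b / (a + b) ** 3 - b / (a * (a + b) ** 2)  # E(XY; X <= Y)
p_F = (E_Y2 + 2 * E_XY - E_X2) / (2 * (1 / a) * (1 / b))
print(p_F, "should equal", a / (a + b))
```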
Or, one can remember that for exponential interarrival times the arrival times form Poisson processes, that the superposition of two independent Poisson processes with intensities $a$ and $b$ is itself a Poisson process with intensity $a+b$, and finally that each point in the resulting point process is either an $a$-point or a $b$-point, independently of everything else and with respective probabilities $a/(a+b)$ and $b/(a+b)$. Putting all this together, one sees directly why the last event is an $a$-event with probability $p_F=a/(a+b)$.
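To illustrate the Poisson argument, here is one more short simulation sketch (not from the original answer); it uses the standard fact that, conditionally on the number of points, the points of a Poisson process on $[0,T]$ are i.i.d. uniform.

```
import numpy as np

rng = np.random.default_rng(2)
a, b, T, n_runs = 1.7, 0.4, 30.0, 50_000   # arbitrary rates and horizon

def last_arrival(rate, T):
    # Poisson process on [0, T]: Poisson(rate*T) points, i.i.d. uniform on [0, T].
    n = rng.poisson(rate * T)
    return rng.uniform(0.0, T, n).max() if n > 0 else 0.0   # 0.0 stands for "no arrival yet"

wins = sum(last_arrival(a, T) > last_arrival(b, T) for _ in range(n_runs))
print("P(last event is an a-point):", wins / n_runs, "vs a/(a+b) =", a / (a + b))
```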