Here is an outline of a proof of the almost sure convergence of $t^{\frac{1}{n(t)}}$ to $e$ (i.e. why the "handwavy appeal to the law of large numbers here" can be invoked). This is very far from trivial; to get almost sure convergence you need to control how the process evolves and how likely exceptional values are; a control on its average is not enough. The proof is not complete. I think that fixing the gaps in it would be very technical, without any gained insight. You may try to complete it if you are interested (but it is rather involved - I don't know if you know the methods used here, as you say that "[you] don't have too extensive a background in math").
To simplify matters, I'll assume that we work with the uniform distribution on $[0,1]$, and that we look at the successive minima rather than the maxima. This changes nothing; as you had guessed, what works for the uniform distribution works for any non-atomic measure on $\mathbb{R}$ (just send it to the uniform measure via the distribution function).
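As a quick sanity check of this reduction (not part of the proof), one can verify numerically that the distribution function sends a non-atomic sample to a uniform one and, being increasing, preserves the times of the successive minima. A minimal sketch; the choice of the exponential distribution, the sizes and the seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# A non-uniform, non-atomic sample: Exp(1), with distribution function F(x) = 1 - exp(-x).
x = rng.exponential(size=10_000)
u = 1.0 - np.exp(-x)   # probability integral transform

# The transformed sample is uniform on [0, 1]: mean ~ 1/2, variance ~ 1/12.
print(u.mean(), u.var())

# F is increasing, so the successive minima of (x_n) and of (u_n) occur at the same times.
rec_x = np.flatnonzero(np.minimum.accumulate(x) == x)
rec_u = np.flatnonzero(np.minimum.accumulate(u) == u)
print(np.array_equal(rec_x, rec_u))   # True
```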
Let $(X_n)$ be a sequence of independent random variables uniformly distributed in $[0,1]$. Let us define recursively two sequences of random variables $(Y_n)$ and $(\tau_n)$, with $Y_0 = 1$, $\tau_0 = 0$, and:
- $\tau_{n+1} = \inf \{ m > \tau_n : X_m < Y_n \}$,
- $Y_{n+1} = X_{\tau_{n+1}}$.
In plain English, $(Y_n)$ is the sequence of successive minima and $(\tau_n)$ is the sequence of times at which these minima occur. Given a current minimum $Y_n$, the next minimum $Y_{n+1}$ occurs when a point $X_m$ falls into the interval $[0, Y_n)$, which happens with probability $Y_n$ at each step. Hence (both facts are easy to confirm by simulation; see the sketch after this list):
- conditionally on $Y_n$, the waiting time $\tau_{n+1} - \tau_n$ has a geometric distribution with parameter $Y_n$ (time is discrete; for small $Y_n$, this is well approximated by an exponential distribution with rate $Y_n$),
- $Y_{n+1}$ is uniformly distributed in $[0, Y_n]$.
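Here is a minimal simulation sketch of these two facts: conditionally on a current minimum $y$, it records the waiting time until a uniform point falls below $y$, together with the value of that point. The value $y = 0.3$, the seed and the number of trials are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
y, trials = 0.3, 100_000   # current minimum Y_n (arbitrary value) and sample size

gaps, new_minima = [], []
for _ in range(trials):
    t = 0
    while True:
        t += 1
        x = rng.random()
        if x < y:          # a point falls below the current minimum: new record
            gaps.append(t)
            new_minima.append(x)
            break

gaps, new_minima = np.array(gaps), np.array(new_minima)

print(gaps.mean(), 1 / y)               # Geometric(y) has mean 1/y
print((gaps > 5).mean(), (1 - y) ** 5)  # ... and tail P(gap > k) = (1 - y)^k
print(new_minima.mean(), y / 2)         # uniform on [0, y] has mean y/2
```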
Now, let me define $Z_n = \ln (Y_n)$. Since $Y_{n+1}$ is uniformly distributed in $[0, Y_n]$, we can write $Y_{n+1} = U_{n+1} Y_n$ with $U_{n+1}$ uniform in $[0,1]$ and independent of the past, so that $Z_{n+1} = Z_n + \ln (U_{n+1})$. A short computation ($\int_0^1 \ln (u) \, \mathrm{d}u = -1$ and $\int_0^1 \ln^2 (u) \, \mathrm{d}u = 2$) shows that:
- $\mathbb{E} (Z_{n+1} - Z_n | Z_n, \cdots, Z_0) = -1$;
- $\text{Var} (Z_{n+1} - Z_n | Z_n, \cdots, Z_0) = 1$.
This implies that the sequence $(Z_n)$ behaves like a random walk with constant drift $-1$. In particular, for all $\varepsilon > 0$, almost surely, for large enough $n$,
$-(1+\varepsilon) n \leq Z_n \leq -(1-\varepsilon) n,$
or, in other words,
$e^{-(1+\varepsilon) n} \leq Y_n \leq e^{-(1-\varepsilon) n}.$
This is not a trivial result, but there is an abundant literature on the subject. Heuristically, $Y_{n+1}/Y_n$ is roughly $e^{-1}$, so $\tau_{n+2} - \tau_{n+1} \sim e (\tau_{n+1} - \tau_n)$. When you say that "Experiments seem to indicate the amount of time it takes to find the next number multiples by about $3$ for each number found", you are quite close to the truth. It multiplies by about $e$.
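Both the almost sure bounds on $Y_n$ and this gap-ratio heuristic are easy to watch in a simulation. Observing the waiting times directly would need about $e^N$ draws, so the sketch below samples them from the geometric distribution identified earlier; $N$, the seed, and the use of a geometric mean for the ratios are my own arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 25
y, tau = 1.0, [0]

for _ in range(N):
    tau.append(tau[-1] + rng.geometric(y))  # tau_{n+1} - tau_n ~ Geometric(Y_n)
    y *= rng.random()                       # Y_{n+1} is uniform on [0, Y_n]

print(np.log(y) / N)   # Z_N / N, should be close to -1

gaps = np.diff(np.array(tau, dtype=float))
ratios = gaps[1:] / gaps[:-1]
print(np.exp(np.mean(np.log(ratios))))  # geometric mean of gap ratios, should be close to e
```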
Anyway, for large enough $n$, we get $\mathbb{P} (\tau_{n+1} - \tau_n \leq e^{(1-2\varepsilon)n}) \leq 1-e^{-e^{-(1-\varepsilon)n} e^{(1-2\varepsilon)n}} \leq e^{- \varepsilon n}$ (the last step uses $1 - e^{-x} \leq x$) and $\mathbb{P} (\tau_{n+1} - \tau_n \geq e^{(1+2\varepsilon)n}) \leq e^{-e^{-(1+\varepsilon)n} e^{(1+2\varepsilon)n}} = e^{-e^{\varepsilon n}}$. The sequences $(e^{- \varepsilon n})$ and $(e^{-e^{\varepsilon n}})$ are both summable, so by the Borel–Cantelli lemma, almost surely, for all large enough $n$,
$e^{(1-2\varepsilon)n} \leq \tau_{n+1} - \tau_n \leq e^{(1+2\varepsilon)n}.$
Now, we sum this inequality for $0 \leq n < N$. I'll skip a few technical details (I take slightly larger margins in the exponents so as to get rid of any annoying constant); you get that, almost surely, for all large enough $N$,
$e^{(1-3\varepsilon)N} \leq \tau_N \leq e^{(1+3\varepsilon)N}.$
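This growth rate can also be checked numerically. In the same spirit as the previous sketch, the following (with arbitrary $N$, seed and number of runs) averages $\ln (\tau_N)/N$ over independent copies of the chain and should return values close to $1$:

```python
import numpy as np

rng = np.random.default_rng(3)
N, runs = 25, 1_000

rates = []
for _ in range(runs):
    y, t = 1.0, 0
    for _ in range(N):
        t += rng.geometric(y)   # accumulate the waiting times tau_{n+1} - tau_n
        y *= rng.random()       # move to the next record level Y_{n+1}
    rates.append(np.log(t) / N)

print(np.mean(rates), np.std(rates))   # mean close to 1, with shrinking spread as N grows
```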
Moreover, recall that $n(t)$ is the number of minima observed up to time $t$, so that $n(\tau_N) = N$. Hence $\tau_N \leq C$ is equivalent to $n(C) \geq N$, and $\tau_N \geq C$ is equivalent to $n(C) \leq N$. I'll skip another round of technicalities but, using the fact that the function $n$ is non-decreasing, you get that, almost surely, for large enough $t$,
$\frac{\ln (t)}{1+4\varepsilon} \leq n(t) \leq \frac{\ln (t)}{1-4\varepsilon}.$
This is equivalent to the fact that, almost surely,
$\lim_{t \to + \infty} \frac{n(t)}{\ln (t)} = 1,$
thus, almost surely, $\lim_{t \to + \infty} t^{\frac{1}{n(t)}} = \lim_{t \to + \infty} e^{\frac{\ln (t)}{n(t)}} = e$.
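To close the loop, here is a minimal direct check of the conclusion. Since the convergence is only logarithmic in $t$, a single run fluctuates a lot; averaging over independent runs (sizes and seed below are arbitrary) makes the limit visible:

```python
import numpy as np

rng = np.random.default_rng(4)
t, runs = 10**6, 100

ratios, powers = [], []
for _ in range(runs):
    x = rng.random(t)
    m = np.minimum.accumulate(x)                  # running minimum of the sample
    n_t = 1 + np.count_nonzero(m[1:] < m[:-1])    # n(t): number of records up to time t
    ratios.append(n_t / np.log(t))
    powers.append(t ** (1.0 / n_t))

print(np.mean(ratios))   # close to 1
print(np.mean(powers))   # close to e, up to fluctuations of logarithmic order
```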