I want to show that $\lim_{x\rightarrow\infty} f(x) = \lim_{t\rightarrow 0^+} f(1/t)$; that is, I need to show that the LHS has the limit $L$ if and only if the RHS has the limit $L$.
So let's assume that $\lim_{x\rightarrow\infty} f(x) = L$, which is equivalent to saying that $$ \forall \epsilon > 0\ \exists M > 0\ \forall x, x > M\colon |f(x) - L| < \epsilon. \qquad (1) $$ What I'm trying to derive from the above is $$ \forall \epsilon > 0\ \exists \delta > 0\ \forall t, t \in (0,\delta)\colon |f(1/t) - L| < \epsilon. \qquad (2) $$

Now I substitute $1/t$ for $x$ in (1), yielding $$ \forall \epsilon > 0\ \exists M > 0\ \forall t, 1/t > M\colon |f(1/t) - L| < \epsilon, $$ and thus, since $M > 0$, $$ \forall \epsilon > 0\ \exists M > 0\ \forall t, 0 < t < 1/M\colon |f(1/t) - L| < \epsilon. $$

If I replace $M$ by $1/\delta$, then I seem to be getting $$ \forall \epsilon > 0\ \exists (1/\delta) > 0\ \forall t, t \in (0,\delta)\colon |f(1/t) - L| < \epsilon. $$ Is this equivalent to (2)? If so, why?
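To make the step marked "and thus" explicit (I record it here since it is the only nontrivial manipulation in the substitution): since $M > 0$, for any $t \neq 0$ we have $$ \frac{1}{t} > M \iff 0 < t < \frac{1}{M}, $$ because $1/t > M > 0$ already forces $t > 0$, and for positive $t$ the inequality $1/t > M$ inverts to $t < 1/M$. In other words, $x \mapsto 1/x$ maps $(M, \infty)$ bijectively onto $(0, 1/M)$.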