I am trying to show that if $f(x) \geq 0$ for every $x \in (-\infty, a)$ and $\lim_{x \rightarrow a^-} f(x)$ exists, then $\lim_{x \rightarrow a^-} f(x) \geq 0$. Even though it is intuitively obvious, the proof I have come up with is so short that I'm concerned I'm missing something:
Suppose that $\lim_{x \rightarrow a^-} f(x) = L < 0$. This means that for every $\epsilon > 0$ there exists a $\delta > 0$ such that for all $x$ with $0 < a - x < \delta$ we have $|f(x) - L| < \epsilon$. Choose $\epsilon = -L$, which is positive since $L < 0$, and pick any $x$ with $a - \delta < x < a$ (such an $x$ exists). Since $L < 0$ and $f(x) \geq 0$, we have $f(x) - L > 0$, so $0 < f(x) - L < \epsilon = -L$. Then
$0 < f(x) - L < -L \implies f(x) < 0,$
which contradicts $f(x) \geq 0$. Hence $L \geq 0$.
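As a sanity check, I also tried to formalize the statement in Lean 4 with Mathlib. This is only a sketch: the lemma names `ge_of_tendsto` and `self_mem_nhdsWithin` are my best guesses at the current Mathlib API and may differ between versions.

```lean
import Mathlib

-- If f is nonnegative to the left of a and f(x) → L as x → a⁻, then L ≥ 0.
-- `nhdsWithin a (Set.Iio a)` (written `𝓝[<] a`) is the filter of
-- left-neighbourhoods of a, i.e. x → a with x < a.
example (f : ℝ → ℝ) (a L : ℝ)
    (hf : ∀ x < a, 0 ≤ f x)
    (hL : Filter.Tendsto f (nhdsWithin a (Set.Iio a)) (nhds L)) :
    0 ≤ L := by
  -- A limit of an eventually-nonnegative function along a nontrivial
  -- filter is nonnegative (lemma name assumed from Mathlib).
  refine ge_of_tendsto hL ?_
  -- It suffices that 0 ≤ f x eventually in the left-neighbourhood filter,
  -- and Set.Iio a itself belongs to that filter.
  filter_upwards [self_mem_nhdsWithin] with x hx
  exact hf x hx
```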
Does this look right?