
Specifically, if $\lim\limits_{x \to a} g(x) = b$, and if $f$ is continuous at $b$, then $\lim\limits_{x \to a} (f \circ g)(x) = f(b)$.

  • This looks like homework; please read http://meta.math.stackexchange.com/questions/1803/how-to-ask-a-homework-question (2012-04-11)
  • Yes, it is. Sorry about that. New to the site, thanks for the heads up. (2012-04-12)

1 Answer


We need to show that, given $\epsilon > 0$, there exists a $\delta > 0$ so that $0 < |x-a| < \delta \Rightarrow |f(g(x))-f(b)|<\epsilon$. So first find $\delta' > 0$ so that $|z-b| < \delta' \Rightarrow |f(z)-f(b)| < \epsilon$

(we can find such a $\delta'$ because $f$ is continuous at $b$; continuity, rather than a mere limit, is what allows the case $z = b$ here).

Then find $\delta > 0$ so that $0 < |x-a| < \delta \Rightarrow |g(x)-b| < \delta'$

(we can find such a $\delta$ because $\lim\limits_{x \to a} g(x) = b$; note that the hypothesis controls $|g(x)-b|$ directly, so $g$ need not be continuous, or even defined, at $a$).

Now, collecting this together: if $0 < |x-a| < \delta$, then $|g(x)-b| < \delta'$. But then (taking $z=g(x)$) this means that $|f(g(x))-f(b)| < \epsilon$, as desired.
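For concreteness, here is the argument instantiated for one specific pair of functions (an illustration added here, not part of the original answer; the particular $f$, $g$, and constants are chosen just for the example). Take $a = 0$, $g(x) = \frac{\sin x}{x}$, and $f(z) = z^2$, so that $b = \lim\limits_{x \to 0} g(x) = 1$ and $f$ is continuous at $1$. Note that $g$ is not even defined at $0$, which shows why the limit hypothesis, rather than continuity of $g$ at $a$, is the right one. Given $\epsilon > 0$: since $f(z) = z^2$ is continuous at $1$, take $\delta' = \min(1, \epsilon/3)$; then $|z-1| < \delta'$ gives $|z+1| \le |z-1| + 2 < 3$, so $|z^2-1| = |z-1|\,|z+1| < 3\delta' \le \epsilon$. Next, since $\lim\limits_{x \to 0} \frac{\sin x}{x} = 1$, choose $\delta > 0$ so that $0 < |x| < \delta \Rightarrow \left|\frac{\sin x}{x} - 1\right| < \delta'$. Chaining the two steps gives
$$0 < |x| < \delta \;\Rightarrow\; \left|\left(\frac{\sin x}{x}\right)^2 - 1\right| < \epsilon, \qquad \text{i.e.}\qquad \lim\limits_{x \to 0} \left(\frac{\sin x}{x}\right)^2 = 1.$$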

  • Thank you John. So clear and concise. Very helpful. (2012-04-11)