
So this thought popped into my head while attempting to solve a grad-level real analysis assignment. Though this question is probably basic, I am finding it hard to justify a claim I am making (I am not a math major).

Firstly, the question I am trying to tackle basically states the following: the functions $\{f_n : n = 1, 2, \ldots\}$ are real valued and twice differentiable, they converge uniformly to some function $f$, and the second derivatives $f_n''$ are uniformly bounded. I essentially have to show that the first derivatives $f_n'$ converge uniformly to $f'$. I suppose this is a standard problem (the precise statement also involves a Lipschitz condition; see the clarification below).

I was thinking along the lines of a proof by contradiction. At first, I assumed $f_n'$ doesn't converge to $f'$. So, roughly speaking, there will be some $x^*$ where the difference between $f_n'(x^*)$ and $f_m'(x^*)$ is greater than some $\epsilon > 0$ for all $n, m > n_0$, for some $n_0 \in \mathbb{N}$. Then, I wished to consider a $\delta$-neighborhood of $x^*$. For all $x$ in this neighborhood, if we assume (without loss of generality) that $f_n'$ converges to $f'$ pointwise, then I intuitively think that the second derivative should become arbitrarily large near that point: for all $x$ in this neighborhood, $|f_n'(x) - f_m'(x)| < \epsilon$ (for sufficiently large $n, m$), while at $x^*$ it is greater than the same $\epsilon$ (where I pick $\epsilon$ as in the definition of $x^*$).

I am pretty sure that to make the above argument rigorous, I need to use the fact that the second derivatives are uniformly bounded. I also have a hunch that, because of this uniform bound, for any chord (that is, a line segment joining two points on the graph of the function; just my "term" for this) I can pick one of the two points to be $x^*$ and the other to be in the $\delta$-neighborhood, and then somehow use this property to write out the argument above in mathematical terms.

I want to know whether my argument is sound, whether I can indeed obtain the above property from uniform boundedness, and whether this technique can be turned into a rigorous proof.

A clarification of the original question: it asks me to show that $f_n' \rightarrow f'$ uniformly and that $\exists\, C > 0$ such that $|f'(x) - f'(y)| \leq C|x - y|$ for all $x, y \in [a, b]$.

  • You've assumed $f_n'$ does not converge to $f'$, but for contradiction, you need to assume they don't converge uniformly. I think you can prove this in the positive direction. Hint: the fact that $f_n''$ are uniformly bounded means that the $f_n'$s are uniformly Lipschitz. That is, $\exists M : \forall x,y,n, |f_n'(x) - f_n'(y)| < M|x - y|$. (2012-09-08)
  • Hi, the fact that you have used is exactly what I need to prove! I can't use the very thing I am trying to prove! Moreover, I realized that what I assumed is not exactly the negation, but nor is the negation what you stated. Specifically, a sequence of functions can converge pointwise and still not be uniformly convergent. And if a sequence of functions doesn't converge pointwise (what I assumed), it will NOT converge uniformly. Assuming they don't converge uniformly seems to implicitly assume at least convergence, which itself is something I need to prove... I may be doing something wrong, however. (2012-09-08)
  • On what domain are the functions $f_n, f$ defined? (2012-09-08)
  • Think about your logic: you are trying to say "assume they do not converge pointwise. Then ... contradiction." This would only prove that they do converge pointwise; it would not prove uniform convergence. (2012-09-08)
  • The domain of $f_n$ and $f$ is $[a, b] \subseteq \mathbb{R}$. I hadn't noticed this myself! Can this help? I agree with you, Nate. This was going to be the first step of my proof. I hoped to show uniform convergence in a later step. (2012-09-08)
  • @Abhijit - "Does not converge uniformly" does NOT imply "Does not converge". For contradiction, the hypothesis you need to start with would be "$\exists \epsilon > 0$ such that $\forall N, \exists n > N, \exists x$ such that $|f_n(x) - f(x)| > \epsilon$". (2012-09-08)
  • Regarding showing they are uniformly Lipschitz, that's by the mean value theorem: for each $f_n'$ and $x, y \in [a,b]$, $\exists z \in (x,y)$ so that $|f_n'(x) - f_n'(y)| = |f''_n(z)||x - y|$. Since the $f''_n$s are uniformly bounded, this gives $\exists M: \forall x,y,n, |f_n'(x) - f_n'(y)| < M|x - y|$. (2012-09-08)
  • @BaronVT: I think I can do the proof by contradiction more straightforwardly as you suggested than with my hypothesis of starting from "does not converge". I have a small doubt: is the negation of uniform convergence this: $\exists \epsilon > 0$ such that $\forall n_0 \in \mathbb{N}$, $\exists x \in [a, b]$ such that $\exists n \geq n_0$ ... Or is this the same as what you have stated? (2012-09-08)
  • @BaronVT, for some reason I am a bit uneasy about using the mean value theorem, as I feel it is probably not rigorous here. I mean, I don't know what assumptions the MVT uses, and if some of those assumptions have yet to be proven via fundamental real-analytic techniques, then I am essentially using a higher-level theorem to prove a basic problem. It is overkill and, specifically, will probably not aid my understanding of real analysis. So I am trying to prove this based solely on the facts given and basic notions in real analysis, without invoking specific theorems about functions. (2012-09-08)
  • Almost - the final qualifier is $\exists n \geq n_0$; otherwise it's the same. The main point is, the $x$ needed to violate the condition isn't static (like it would be for simply "not convergent"). It's like saying: for a small enough $\epsilon$, no matter how big $n_0$ gets, there is still *some* $x$ where $|f_n'(x) - f'(x)| > \epsilon$ (in practice, this $x$ probably changes as $n_0$ gets bigger). Think about a typical example: $g_n(x) = x^n$ on $[0,1]$ converges pointwise to the function that is $0$ for $0 \leq x < 1$ and $1$ at $x = 1$. This doesn't converge uniformly - as you get closer and closer to $x = 1$, the rate of convergence gets slower (see the short computation after these comments). (2012-09-08)
  • @Abhijit, OK, suit yourself, but I think your proof will wind up proving the mean value theorem as a consequence. Not only do you need to use uniform convergence, you're going to have to use properties of derivatives before you're done; I think this is where your "chord" ideas are leading you - the mean value theorem just says that the slope of the chord connecting two points on the graph of a differentiable function is equal to the derivative somewhere "inside the chord". By the way, the MVT is completely rigorous in this setting, and it is usually introduced right after differentiation (cf. Rudin, Ch. 5). (2012-09-08)

1 Answer


Why don't you first try to show that a subsequence of $f_n'$ converges uniformly to $f'$? I think this is an application of a well-known theorem, the Arzelà–Ascoli theorem. It states that if we have a sequence $\{f_n\} \subseteq C[a,b]$ that is uniformly bounded and equicontinuous, then there exists a subsequence $\{f_{n_k}\}$ that converges uniformly to a limit in $C[a,b]$. We are now going to show the uniform boundedness and equicontinuity of the $f_n'$. This will tell us that there is a subsequence $f_{n_k}'$ that converges uniformly to some $g \in C[a,b]$. We can then show that $g = f'$.
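
Recall that equicontinuity of the family $\{f_n'\}$ means that a single $\delta$ works for every $n$ simultaneously: $$\forall \epsilon > 0 \ \exists \delta > 0 \ \text{such that} \ |x - y| < \delta \implies |f_n'(x) - f_n'(y)| < \epsilon \quad \text{for all } n \text{ and all } x, y \in [a, b].$$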

For any $x, y \in [a, b]$ there exists $\gamma$ between $x$ and $y$ such that $$|f_n'(x) - f_n'(y)| = |f_n''(\gamma)|\,|x - y| \leq M|x - y|$$ by the mean value theorem applied to $f_n'$ and the fact that the $f_n''$ are uniformly bounded by some $M$. Thus, the family $\{f_n'\}$ is equicontinuous (indeed uniformly Lipschitz with the common constant $M$).
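
To check the $\epsilon$-$\delta$ condition above explicitly (assuming $M > 0$; if $M = 0$ the $f_n'$ are constant and there is nothing to do): given $\epsilon > 0$, take $\delta = \epsilon / M$; then for every $n$ and all $x, y \in [a, b]$, $$|x - y| < \delta \implies |f_n'(x) - f_n'(y)| \leq M|x - y| < M\delta = \epsilon.$$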

To show uniform boundedness of the $f_n'$ we do the following: $$|f_n'(x)| \leq |f_n'(x) - f_n'(a)| + |f_n'(a)| \leq M|x - a| + |f_n'(a)| \leq M(b - a) + |f_n'(a)|,$$ using the Lipschitz estimate above and the triangle inequality. It remains to bound $|f_n'(a)|$ uniformly in $n$. By the mean value theorem applied to $f_n$, there is $\xi_n \in (a, b)$ with $f_n'(\xi_n) = \frac{f_n(b) - f_n(a)}{b - a}$, and this quotient is bounded uniformly in $n$ because the $f_n$ converge uniformly on $[a, b]$ and hence are uniformly bounded. Then $|f_n'(a)| \leq |f_n'(a) - f_n'(\xi_n)| + |f_n'(\xi_n)| \leq M(b - a) + |f_n'(\xi_n)|$, which is bounded uniformly in $n$. Thus, by Arzelà–Ascoli, there exists a subsequence $\{f_{n_k}'\}$ converging uniformly to some $g \in C[a, b]$.

For the rest, note that $$f_{n_k}(x) - f_{n_k}(a) = \int_{a}^{x}{f_{n_k}'(t)\,dt}.$$ Since the $f_{n_k}'$ converge uniformly to $g$, so does the integral, and letting $k \to \infty$ gives $f(x) - f(a) = \int_{a}^{x}{g(t)\,dt}$. By the Fundamental Theorem of Calculus (note that $g$ is continuous), $g = f'$, so $f_{n_k}' \to f'$ uniformly.
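
A few details, sketched for completeness. The step "so does the integral" is the estimate $$\left| \int_{a}^{x} f_{n_k}'(t)\,dt - \int_{a}^{x} g(t)\,dt \right| \leq (b - a)\,\sup_{t \in [a, b]} |f_{n_k}'(t) - g(t)| \longrightarrow 0,$$ which is uniform in $x$. One way to pass from the subsequence to the full sequence, as the original question requires: if $f_n'$ did not converge uniformly to $f'$, there would be an $\epsilon > 0$ and a subsequence staying at sup-distance at least $\epsilon$ from $f'$; but that subsequence satisfies the same hypotheses, so the argument above extracts from it a further subsequence converging uniformly to $f'$, a contradiction. Finally, the Lipschitz bound from the clarification follows by letting $n \to \infty$ in the uniform Lipschitz estimate: $$|f'(x) - f'(y)| = \lim_{n \to \infty} |f_n'(x) - f_n'(y)| \leq M|x - y|,$$ so $C = M$ works.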