
Let $f, g : [0, 1] \rightarrow \mathbb R$ be continuous functions. Define $x_n(t) = f(t) + \int_0^t x_{n-1}(s)ds,$ $0 \le t \le 1, n=1,2,3,...$, where $x_0(t)=g(t), 0 \le t \le 1$. I have to show that the sequence $(x_n)$ is uniformly convergent on $[0, 1]$ and its limit is independent of $g$.

What I have done:

  • Since $f, g$ are continuous on $[0, 1]$, by the extreme value theorem $|f|$ and $|g|$ attain maxima on that interval, so there exist constants $F, G$ such that $|f(x)| \le F, |g(x)| \le G, \forall x \in [0,1]$.
  • Doing some calculations based on the above formula for $x_n(t)$ I have:

$|x_0(t)| \le |g(t)| \le G$

$|x_1(t)| = |f(t) + \int_0^t x_{0}(s)ds| \le |f(t)| + \int_0^t |x_{0}(s)|ds \le F+Gt$

$|x_2(t)| \le F+Ft+G\frac{t^2}{2}$

$|x_3(t)| \le F+Ft+F\frac{t^2}{2!}+G\frac{t^3}{3!}$

So, $|x_n(t)| \le F \sum_{k=0}^{n-1} \frac{t^k}{k!}+G \frac{t^n}{n!}$. Since $e^t=\sum_{k=0}^{\infty} \frac{t^k}{k!}$ and every term of the series is nonnegative, this yields the bound $|x_n(t)| \le Fe^t+G \frac{t^n}{n!}$.

  • Where do I go from here? How can I prove the estimate $|x_n(t)-x(t)|<\epsilon$ required for uniform convergence?
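Not part of the original question, but a quick numerical sketch of the iteration may help build intuition (my own illustration; the particular choices of $f$ and $g$ below are arbitrary assumptions). Approximating $\int_0^t x_{n-1}(s)\,ds$ with left Riemann sums on a grid, the sup-distance between successive iterates shrinks roughly like $1/n!$, consistent with the bounds above:

```python
import math

N = 1000                              # grid points on [0, 1]
h = 1.0 / N
ts = [k * h for k in range(N + 1)]

f = [math.cos(t) for t in ts]         # example choice of f (assumption)
g = [math.sin(3 * t) for t in ts]     # example choice of g (assumption)

def iterate(x):
    """One step: x_new(t) = f(t) + integral_0^t x(s) ds (left Riemann sum)."""
    new, acc = [], 0.0
    for k in range(N + 1):
        new.append(f[k] + acc)
        acc += x[k] * h
    return new

x = g[:]
sups = []
for n in range(8):
    x_next = iterate(x)
    # sup-distance between consecutive iterates on the grid
    sups.append(max(abs(a - b) for a, b in zip(x_next, x)))
    x = x_next

print(sups)
```

The printed sup-distances are non-increasing and decay rapidly, which is exactly what a uniformly Cauchy sequence looks like numerically.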

2 Answers


Define sequences of functions $F_n$ and $G_n$ by $$F_0(t) = f(t), \ \ F_n(t) = \int_0^t F_{n-1}(s)ds$$ $$G_0(t) = g(t), \ \ G_n(t) = \int_0^t G_{n-1}(s)ds$$ Note then that: $$x_0(t) = G_0(t)$$ $$x_1(t) = f(t) + \int_0^t g(s)ds = F_0(t) + \int_0^t G_0(s)ds = F_0(t) + G_1(t)$$ Proceeding inductively, if $$x_{n-1}(t) = \sum_{k=0}^{n-2}F_k(t) +G_{n-1}(t)$$ then $$x_n(t) = f(t) +\int_0^t x_{n-1}(s)ds = F_0(t) + \int_0^t \Big[\sum_{k=0}^{n-2}F_k(s) +G_{n-1}(s)\Big]ds \\ = F_0(t) + \sum_{k=0}^{n-2}\int_0^t F_k(s)ds + \int_0^t G_{n-1}(s)ds \\ = F_0(t) + \sum_{k=0}^{n-2} F_{k+1}(t) + G_n(t) \\ = F_0(t) +\sum_{k=1}^{n-1} F_{k}(t) + G_n(t) $$ Thus for all $n \geq 1$, $$x_n(t) = \sum_{k=0}^{n-1}F_k(t) + G_n(t) $$

Let $M_1, M_2$ be bounds for $|f|, |g|$ on $[0,1]$ respectively (these exist since $f, g$ are continuous). Then $$\left |F_0(t) \right | \leq M_1$$ $$|F_1(t)| = \Big|\int_0^tF_{0}(s)ds\Big| \leq M_1t$$ Proceeding this way, we get $$|F_n(t)| \leq M_1\frac{t^n}{n!} $$ and similarly $$|G_n(t)| \leq M_2\frac{t^n}{n!} $$ So $\sup_{t \in [0,1]}|G_n(t)| \leq M_2/n!$, and thus $G_n$ converges uniformly to $0$.

Now define $$H_n(t) = \sum_{k=0}^{n-1}F_k(t)$$ Given $\epsilon > 0$, for all sufficiently large $m \geq n$ we have $$|H_m(t) -H_n(t)| = \Big|\sum_{k=n}^{m-1}F_k(t)\Big| \leq \sum_{k=n}^{m-1}|F_k(t)| \leq M_1\sum_{k=n}^{m-1}\frac{t^k}{k!} \leq M_1\sum_{k=n}^{m-1}\frac{1}{k!} < \epsilon$$ for all $t \in [0,1]$, since the tail of the convergent series $\sum_k 1/k!$ can be made arbitrarily small. Thus $H_n$ is uniformly Cauchy and converges uniformly, say, to $H$.

Recall that $x_n(t) = H_n(t) + G_n(t)$; being the sum of two uniformly convergent sequences of functions, $(x_n)$ must also be uniformly convergent, with uniform limit $$\lim_{n \to \infty }(H_n + G_n) = H + 0 = H$$ Since each $H_n$ is independent of $g$ and $G_n$ converges to $0$, $x_n$ converges uniformly to a limit independent of $g$.
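The $g$-independence can also be checked numerically (my own sketch, not part of the answer; the example $f$ and the two starting functions are arbitrary assumptions). Iterating from two very different choices of $g$, the gap between the two sequences equals $G_n^a - G_n^b$, so it should shrink like $(M_a + M_b)/n!$:

```python
import math

N = 1000
h = 1.0 / N
ts = [k * h for k in range(N + 1)]
f = [math.exp(-t) for t in ts]        # example f (assumption)

def iterate(x):
    """x_new(t) = f(t) + integral_0^t x(s) ds, via a left Riemann sum."""
    new, acc = [], 0.0
    for k in range(N + 1):
        new.append(f[k] + acc)
        acc += x[k] * h
    return new

# two very different starting functions g
xa = [math.sin(5 * t) for t in ts]    # sup bound M_a <= 1
xb = [10.0 for _ in ts]               # sup bound M_b = 10

gaps = []
for n in range(1, 11):
    xa, xb = iterate(xa), iterate(xb)
    # x_n^a - x_n^b = G_n^a - G_n^b, bounded by (M_a + M_b)/n!
    gaps.append(max(abs(a - b) for a, b in zip(xa, xb)))

print(gaps)
```

The printed gaps collapse factorially fast, matching the bound $|G_n(t)| \leq M\,t^n/n!$ from the answer.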

  • I can't actually see how you get $x_n(t)$ your way: when I tried to calculate $|x_2(t)-f(t)|$, an $\int_{0}^{t}|f(s)|ds$ term appeared that I couldn't reduce to just $f(t)$. Can you show me the calculation for $x_2(t)$? Also, after you substitute $t=1$ into the big inequality you produced, there is a "$-$" missing which, since it has to be there, causes trouble with the absolute value below. Can you check this? (2017-02-26)
  • By integrating $f(t) - Mt \leq x_1(t) \leq f(t) + Mt$, we get $f(t)t - Mt^2/2 \leq \int_0^t x_1(s)ds \leq f(t)t + Mt^2/2$, that is, $f(t) + f(t)t - Mt^2/2 \leq f(t)+ \int_0^tx_1(s)ds \leq f(t) + f(t)t +Mt^2/2$, that is, $f(t) + f(t)t - Mt^2/2 \leq x_2(t) \leq f(t) + f(t)t +Mt^2/2$, and so on. (2017-02-27)
  • From $-(e^t- \sum_{k=0}^{n-1} \frac{t^k}{k!})f(t) - M\frac{t^n}{n!} \leq x_n(t)- e^tf(t) \leq -( e^t -\sum_{k=0}^{n-1} \frac{t^k}{k!} )f(t) + M\frac{t^n}{n!}$, we get $-(e^t- \sum_{k=0}^{n-1} \frac{t^k}{k!})f(t) - M\frac{t^n}{n!} \leq x_n(t)- e^tf(t) \leq ( e^t -\sum_{k=0}^{n-1} \frac{t^k}{k!} )M_1 + M\frac{t^n}{n!}$; here we use $-f(t) \leq |f(t)| \leq M_1$, and then we set $t=1$. (2017-02-27)
  • Fantastic! So, in the end, I understand the limit (for $n \rightarrow \infty$ it goes to $0$) - doesn't that show that $x_n(t) \rightarrow e^tf(t)$? Also, what is that about the supremum I should take - can you explain that as well? (2017-02-27)
  • Again, on the extreme left we use $-f(t) \geq -M_1$, so $-(e^t- \sum_{k=0}^{n-1} \frac{t^k}{k!})f(t) = (e^t- \sum_{k=0}^{n-1} \frac{t^k}{k!})(-f(t)) \geq (e^t- \sum_{k=0}^{n-1} \frac{t^k}{k!})(-M_1)$, and then we proceed as described. (2017-02-27)
  • From $|x_n(t)- e^tf(t)| \leq M_1\Big|e- \sum_{k=0}^{n-1} \frac{1}{k!}\Big| + \frac{M}{n!} \ \forall t \in [0,1]$, we get $\sup_{t \in [0,1]} |x_n(t)- e^tf(t)| \leq M_1\Big|e- \sum_{k=0}^{n-1} \frac{1}{k!}\Big| + \frac{M}{n!}$. Taking the limit as $n \to \infty$, we get $\sup_{t \in [0,1]} |x_n(t)- e^tf(t)| \to 0$, and so $x_n(t)$ converges uniformly to $e^tf(t)$, which is independent of $g$. (2017-02-27)
  • Ah, that's what you meant! You just showed that there is another way, via the sup, to characterize uniform convergence! Thanks a lot! (2017-02-27)
  • Andre, I still see the problem with the integral in your first comment (computing $x_2(t)$): you have $\int_{0}^{t}f(s)ds$ and you just go to $f(t)t$ below - how does that happen? (2017-02-27)
  • Yes, you are right; I probably meant to use $M_1$ instead of $f(t)$. But the previous proof was messy and probably erroneous anyway. I have edited to give a different proof; see above. (2017-02-28)
  • Andre, well done! This is a much better proof than the previous one - fully understandable and, I have to say, elegant. (2017-02-28)

Since $C[0,1]$ is a complete metric space with respect to the sup norm, we can use the contraction mapping principle.

To spell it out, let $T: C[0,1] \to C[0,1]$ map $x(t) \mapsto f(t) + \int_0^t x(s) ds$. $T$ itself isn't a contraction mapping, but $T^2$ is.

Let's prove this. First, let's work out what $T^2$ does (exchanging the order of integration in the double integral): $$T^2(x)(t) = f(t) + \int_0^t ds\, f(s) + \int_0^t ds \int_0^s du \ x(u) \\ = f(t) + \int_0^t ds\, f(s) + \int_0^t du \int_u^t ds \ x(u) \\ = f(t) + \int_0^t ds\, f(s) + \int_0^t du \ (t-u) x(u) $$ Hence $$|T^2(x_1)(t) - T^2(x_2)(t)| \leq \int_0^t du \ (t-u) |x_1(u) - x_2(u) | \\ \leq \sup_{s \in [0,1]} |x_1(s) - x_2(s) | \int_0^t du \ (t-u) \\ = \sup_{s \in [0,1]} |x_1(s) - x_2(s) | \times \frac {t^2} 2.$$ Taking the supremum over $t \in [0,1]$, $$\sup_{t \in [0,1]}|T^2(x_1)(t) - T^2(x_2)(t)| \leq \frac 1 2 \sup_{t \in [0,1]} |x_1(t) - x_2(t) |$$ Thus $T^2$ is a contraction mapping.
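A small numerical sketch (my own, not from the answer; the test functions are arbitrary assumptions) estimates the contraction factor of $T^2$ in the sup norm on a discretized grid, and it indeed comes out at most about $1/2$:

```python
import math

N = 2000
h = 1.0 / N
ts = [k * h for k in range(N + 1)]
f = [t * t for t in ts]               # any fixed f; it cancels in the difference

def T(x):
    """T(x)(t) = f(t) + integral_0^t x(s) ds (left Riemann sum)."""
    out, acc = [], 0.0
    for k in range(N + 1):
        out.append(f[k] + acc)
        acc += x[k] * h
    return out

def sup_dist(a, b):
    return max(abs(u - v) for u, v in zip(a, b))

# estimate the contraction factor of T^2 on a couple of test pairs
pairs = [
    ([math.sin(7 * t) for t in ts], [math.cos(2 * t) for t in ts]),
    ([1.0 for _ in ts], [t for t in ts]),
]
ratios = [sup_dist(T(T(x1)), T(T(x2))) / sup_dist(x1, x2) for x1, x2 in pairs]
print(ratios)
```

Each ratio stays below $1/2$ (up to discretization error), matching the bound $\frac{t^2}{2} \leq \frac{1}{2}$ derived above.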

By the contraction mapping theorem, $(x_0, x_2, x_4, x_6, \dots)$ and $(x_1, x_3, x_5, x_7, \dots)$ both converge uniformly; moreover, they converge to the same limit and this limit is independent of the choice of $g$. So the original sequence $(x_0, x_1, x_2, \dots)$ also converges uniformly to a limit that is independent of $g$.
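As a final sanity check (again my own sketch, with an arbitrary example $f$): iterating $T$ on a grid from an arbitrary starting function drives the fixed-point residual $\sup_t |T(x)(t) - x(t)|$ down to numerical noise, as the contraction mapping theorem predicts:

```python
import math

N = 2000
h = 1.0 / N
ts = [k * h for k in range(N + 1)]
f = [math.sin(t) for t in ts]         # example f (assumption)

def T(x):
    """T(x)(t) = f(t) + integral_0^t x(s) ds (left Riemann sum)."""
    out, acc = [], 0.0
    for k in range(N + 1):
        out.append(f[k] + acc)
        acc += x[k] * h
    return out

x = [5.0 for _ in ts]                 # arbitrary starting function g
for _ in range(30):
    x = T(x)

# residual of the fixed-point equation x = T(x) in the sup norm
residual = max(abs(a - b) for a, b in zip(T(x), x))
print(residual)
```

Starting from any other $g$ gives the same limit, since the fixed point of $T$ is unique.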

  • No, I don't know this stuff! Can it not be solved more analytically? (2017-02-25)
  • It probably can, but the method using the contraction mapping theorem takes $\leq 5$ seconds if you're familiar with it, and I don't know any other method. (2017-02-25)
  • Do you understand the statement of the contraction mapping theorem, as spelt out on Wikipedia, https://en.wikipedia.org/wiki/Banach_fixed-point_theorem#Statement ? (2017-02-25)
  • I just read it - it guarantees that there is a fixed point of the mapping you just defined, and I guess that's where the sequence $x_n(t)$ converges, correct? But is the convergence uniform, and then how do I know with this method that the limit is independent of $g$? (2017-02-25)
  • The metric space is $C[0,1]$ endowed with the sup norm. Convergence of a sequence of functions with respect to the sup norm is the same thing as uniform convergence. So that bit is fine. (2017-02-25)
  • Independence of $g$ follows from the fact that the limit of the sequence $\{ T^n (g) \}$ is the (unique) fixed point of $T$ regardless of the choice of $g$. Look at the bit on the Wikipedia page where it says "start with an arbitrary $x_0$". (2017-02-25)
  • Does this help, or is it still not clear? (2017-02-25)
  • Yes, it's pretty clear; your answer is impeccable! But I would wait for someone to provide an analytical solution to this... (2017-02-25)
  • I strongly recommend learning this contraction mapping technique, though. It is really worth it. The same technique is used to prove that (a large class of) differential equations have solutions, which is a pretty important result. (2017-02-25)
  • @KennyWong: Actually, the inequality $\Vert T(x)\Vert\le\Vert x\Vert$ isn't sufficient for $T$ to be a contraction mapping. You'll have to prove that $\Vert T(x)\Vert\le\delta\Vert x\Vert$ for some $\delta<1$. (2017-02-25)
  • Good point! I'll remove this. (2017-02-25)
  • @SergeiGolovan Could you please take a look at the new version? Please do let me know if you still think I'm making a mistake, because this is a nice problem and I really would like to understand the solution technique. (2017-02-26)
  • @KennyWong: Nice idea, to consider $T^2$ and show that it contracts. (2017-02-26)