
Assume that we are given a sequence of continuous functions $f_n(x)$ on $[0,1]$.

How can we show the existence of a sequence $a_n$ and a set $A$ with $\mu(A^c)=0$ such that

$$ \lim_{ n\to \infty} \frac{f_n(x)}{a_n}=0, ~~ \forall x\in A. $$

I choose a sequence $a_n$ such that $\mu(\phi_n) \leq 1/2^n$, where $$ \phi_n := \left\{ x: \frac{|f_n(x)|}{a_n} \geq \frac{1}{n} \right\}. $$
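
For what it is worth, here is one way I think the existence of such a sequence $a_n$ can be guaranteed; it only uses that each $f_n$, being continuous on the compact interval $[0,1]$, is bounded, and I am not sure it is the intended construction. Let $M_n := \sup_{x\in[0,1]} |f_n(x)| < \infty$ and set $a_n := n\,(M_n+1)$. Then for every $x \in [0,1]$
$$ \frac{|f_n(x)|}{a_n} \leq \frac{M_n}{n(M_n+1)} < \frac{1}{n}, $$
so $\phi_n = \emptyset$ and the requirement $\mu(\phi_n) \leq 1/2^n$ holds trivially.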

Since $\sum_n \mu(\phi_n) < \infty$, the Borel-Cantelli lemma gives $\mu(\limsup_n \phi_n)=0$.

It seems okay to take $A^c = \limsup_n \phi_n$. How can we write this out clearly and in full detail? (My attempt is sketched below.) Also, how can we ensure the existence of such a sequence $a_n$, that is, how can it be constructed by means of $f_n(x)$? (Is the construction sketched above valid?)
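
Here is my attempt at writing the last step out in detail (a minimal sketch, assuming the choice of $a_n$ above; I am not sure it is careful enough). Take $A := \left( \limsup_n \phi_n \right)^c$, so that $\mu(A^c)=0$ by the Borel-Cantelli lemma. If $x \in A$, then $x$ lies in only finitely many of the sets $\phi_n$, so there is an $N$ (depending on $x$) with
$$ \frac{|f_n(x)|}{a_n} < \frac{1}{n} \qquad \text{for all } n \geq N, $$
and hence $\lim_{n\to\infty} f_n(x)/a_n = 0$ for every $x \in A$.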

Thanks!

  • This assumes that each function $f_n$ is finite almost everywhere. (2012-12-18)
  • Yes, that's correct due to the Extreme Value Theorem. (2012-12-18)
  • ?? The EVT has nothing to do with it here; none of these functions is assumed to be continuous. (2012-12-18)
  • Oh, I forgot to say that the $f_n(x)$ are continuous, thank you! (2012-12-18)
  • This hypothesis is irrelevant, but no big deal. (2012-12-18)
  • To choose such a sequence $a_n$ requires $\mu$ (a probability measure on $[0,1]$?) to be absolutely continuous with respect to Lebesgue measure, i.e., if $A \subset [0,1]$ has Lebesgue measure zero then $\mu(A)=0$. (2012-12-19)
  • @user31714 It does not. (2014-04-16)

2 Answers