Assume that we are given a sequence of continuous functions $f_n$ on $[0,1]$ (here $\mu$ denotes Lebesgue measure on $[0,1]$).
How can one show that there exist a sequence of positive numbers $a_n$ and a set $A \subseteq [0,1]$ with $\mu(A^c)=0$ such that
$$ \lim_{n\to \infty} \frac{f_n(x)}{a_n}=0 \quad \text{for all } x\in A? $$
My attempt: I choose a sequence $a_n$ such that $\mu(\phi_n) \leq 1/2^n$, where
$$ \phi_n := \left\{ x \in [0,1] : \frac{|f_n(x)|}{a_n} \geq \frac{1}{n} \right\}. $$
Since $\sum_n \mu(\phi_n) < \infty$, the Borel-Cantelli lemma gives $\mu(\limsup_n \phi_n)=0$.
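In more detail, I believe the conclusion should go as follows (please correct me if this is wrong): set $A := \left(\limsup_n \phi_n\right)^c$, so that $\mu(A^c)=0$. If $x \in A$, then $x$ belongs to only finitely many of the sets $\phi_n$, so there exists an $N$ with
$$ \frac{|f_n(x)|}{a_n} < \frac{1}{n} \quad \text{for all } n \geq N, $$
and hence $f_n(x)/a_n \to 0$ as $n \to \infty$.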
Is this write-up correct and complete? Also, how can we guarantee that such a sequence $a_n$ exists, that is, how can one construct it explicitly from the $f_n$?
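For the construction, here is one candidate, though I am not sure it is the intended one: each $f_n$ is continuous on the compact interval $[0,1]$, so it attains a finite maximum $M_n := \max_{x \in [0,1]} |f_n(x)|$. Setting
$$ a_n := n\,(M_n + 1) $$
would give $\frac{|f_n(x)|}{a_n} \leq \frac{M_n}{n(M_n+1)} < \frac{1}{n}$ for every $x \in [0,1]$, so that $\phi_n = \emptyset$ and the bound $\mu(\phi_n) \leq 1/2^n$ holds trivially. Does this work, or is there a construction that also covers merely measurable $f_n$ (say, using that $\mu(\{|f_n| \geq M\}) \to 0$ as $M \to \infty$)?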
Thanks!