I know that for a finite number of intervals I can always take the maximum $N$ over all the intervals, and then my sequence of functions $f_n(x)$ will converge uniformly on all of the intervals at once.
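Spelling out what I mean (my notation: $I_1,\dots,I_k$ are the intervals and $N_1,\dots,N_k$ are the thresholds each one gives for a fixed $\varepsilon>0$):

$$N=\max(N_1,\dots,N_k)\quad\Longrightarrow\quad |f_n(x)-f(x)|<\varepsilon\ \text{ for all } n\ge N \text{ and all } x\in I_1\cup\dots\cup I_k.$$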
But when I have an infinite number of intervals I know this is no longer true, and I am looking for a counterexample. I thought of $f_n(x)=\frac{1}{1+nx}$: I can say that it converges uniformly on every closed interval such as $[1/2,1]$, $[1/4,1/2]$, $[1/8,1/4]$, $\dots$, but not on their union, which is $(0,1]$ (note that $x=0$ does not belong to any of these intervals, so the union is $(0,1]$ rather than $[0,1]$). But something feels "off" about that, and I am not sure if it's true.
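To convince myself, here is the supremum computation I have in mind (just a sketch, with $a>0$ an arbitrary left endpoint and pointwise limit $f\equiv 0$ on $(0,1]$):

$$\sup_{x\in[a,1]}\left|\frac{1}{1+nx}-0\right|=\frac{1}{1+na}\xrightarrow[n\to\infty]{}0,\qquad \sup_{x\in(0,1]}\left|\frac{1}{1+nx}-0\right|=\lim_{x\to 0^+}\frac{1}{1+nx}=1\ \text{ for every } n,$$

so the convergence is uniform on each $[a,1]\subset(0,1]$, yet the supremum over the whole union stays at $1$ and never goes to $0$.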
Big thanks!