From the theory of distributions, we know that if $u \in L^1_{loc}(\mathbb{R})$ and $v(x)=\int_0^x u(t)\,dt$, then $v$ is continuous on $\mathbb{R}$ and $v'=u$ in the sense of distributions.
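(For context, the distributional identity $v'=u$ means that for every test function $\varphi \in C_c^\infty(\mathbb{R})$ we have $\langle v',\varphi\rangle = \langle u,\varphi\rangle$; the usual proof is a Fubini computation, sketched here for the part of the integral over $x>0$, the part over $x<0$ being analogous:)

$$
-\int_0^\infty \left(\int_0^x u(t)\,dt\right)\varphi'(x)\,dx
= -\int_0^\infty u(t)\left(\int_t^\infty \varphi'(x)\,dx\right)dt
= \int_0^\infty u(t)\varphi(t)\,dt,
$$

where Fubini's theorem applies because $u \in L^1_{loc}$ and $\varphi$ has compact support.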
Now, if I suppose that $u$ is continuous on some interval $[a,b]$, we know from the classical fundamental theorem of calculus that $v'(x)=u(x)$ in the strong sense for every $x$ with $a \leq x \leq b$ (one-sided derivatives at the endpoints).
How can we prove the second result (the case where $u$ is continuous) rigorously using arguments from distribution theory? I guess this would involve the derivative of an indicator function.