I recently got a tutorial question that I don't know how to approach, and I believe the tutor won't provide a solution. The question is:
Pick a real number randomly (according to the uniform measure) in the interval $[0, 2]$. Do this one million times and let $S$ be the sum of all the numbers. What, approximately, is the probability that
a) $S\ge1,$
b) $S\ge0.001,$
c) $S\ge0$?
Express your answer as a definite integral of the function $e^{-x^2/2}$.
Can anyone show me how to do it? It is in fact from a Fourier analysis course, but I guess I need some basic result from statistics, which I am not familiar with at all.
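From a bit of searching, I suspect the result I'm missing is the central limit theorem. If that's right, here is my (possibly wrong) attempt at setting it up, assuming each draw $X_i$ is uniform on $[0,2]$ and $S = X_1 + \dots + X_n$ with $n = 10^6$:

$$\mu = \mathbb{E}[X_i] = 1, \qquad \sigma^2 = \operatorname{Var}(X_i) = \frac{(2-0)^2}{12} = \frac{1}{3},$$

so the central limit theorem should say that $(S - n\mu)/(\sigma\sqrt{n})$ is approximately standard normal, and for a threshold $a$,

$$\mathbb{P}(S \ge a) \approx \frac{1}{\sqrt{2\pi}} \int_{(a - n\mu)/(\sigma\sqrt{n})}^{\infty} e^{-x^2/2}\,dx.$$

Is that the right way to read the question, or am I missing something?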