For an explicit expression for the density function, see, for example, MathWorld: Uniform Sum Distribution.
EDIT:
The density function $f_n$ of the sum $U_1 + \cdots + U_n$, where the $U_i$ are independent uniform$(0,1)$ variables, is given, according to the MathWorld link above, by $ f_n (x) = \frac{1}{2(n - 1)!}\sum_{k = 0}^n ( - 1)^k \binom{n}{k}(x - k)^{n - 1} \operatorname{sgn} (x - k),\;\; 0 < x < n. $
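If it helps to check this numerically, here is a short Python sketch (my own addition, not from the MathWorld page) that evaluates $f_n$ from the formula above and compares it with a Monte Carlo histogram of simulated sums; the helper name `irwin_hall_pdf` and the sample sizes are just illustrative choices.

```python
import numpy as np
from math import comb, factorial

def irwin_hall_pdf(x, n):
    """Density of U_1 + ... + U_n at x, via the signed-power formula above."""
    return sum((-1) ** k * comb(n, k) * (x - k) ** (n - 1) * np.sign(x - k)
               for k in range(n + 1)) / (2 * factorial(n - 1))

print(irwin_hall_pdf(1.5, 3))  # 0.75, the peak of the n = 3 density

# Monte Carlo check: a histogram of simulated sums should track the formula.
rng = np.random.default_rng(0)
sums = rng.uniform(size=(200_000, 3)).sum(axis=1)
hist, edges = np.histogram(sums, bins=50, density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - [irwin_hall_pdf(m, 3) for m in mid])))  # should be small
```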
Normal approximation (general setting). Suppose that $X_1,X_2,\ldots$ is a sequence of independent and identically distributed random variables with mean $\mu$ and variance $\sigma^2$. Define $S_n = \sum\nolimits_{i = 1}^n {X_i }$. Then, by the central limit theorem, $ \frac{{S_n - n\mu }}{{\sigma \sqrt n }} \to {\rm N}(0,1), $ as $n \to \infty$. This means that, for any $x \in \mathbb{R}$, $ {\rm P}\bigg(\frac{{S_n - n\mu }}{{\sigma \sqrt n }} \le x \bigg) \to \Phi (x), $ where $\Phi$ denotes the distribution function of the ${\rm N}(0,1)$ distribution. Thus, $ {\rm P}(S_n \le n\mu + \sigma \sqrt n x) \to \Phi (x). $ Note that the first convergence above gives rise to the approximation $ S_n \approx n \mu + \sigma \sqrt{n} Z, $ where $Z \sim {\rm N}(0,1)$.
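As a rough numerical illustration of this convergence (my own sketch, not part of the general statement), one can estimate ${\rm P}(S_n \le n\mu + \sigma \sqrt{n}\, x)$ by Monte Carlo and compare it with $\Phi(x)$; here I take the $X_i$ uniform$(0,1)$, so $\mu = 1/2$ and $\sigma = 1/\sqrt{12}$, and the choices $n = 12$ and the number of replications are arbitrary.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, sigma = 0.5, (1 / 12) ** 0.5             # mean and s.d. of a uniform(0,1) variable
n, reps = 12, 100_000
S = rng.uniform(size=(reps, n)).sum(axis=1)  # Monte Carlo draws of S_n

for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    mc = np.mean(S <= n * mu + sigma * np.sqrt(n) * x)  # empirical probability
    print(f"x = {x:+.1f}   MC estimate = {mc:.4f}   Phi(x) = {norm.cdf(x):.4f}")
```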
Back to your specific question, you just need to substitute the expectation and variance of a uniform$(0,1)$ random variable, namely $\mu = 1/2$ and $\sigma^2 = 1/12$, which gives $S_n \approx n/2 + \sqrt{n/12}\, Z$. In the context of the normal approximation, consider also the four plots given in the MathWorld link above.
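In that spirit, here is a small sketch (again my own, only echoing the idea of those plots rather than reproducing them) that compares the exact density $f_n$ with the approximating ${\rm N}(n/2,\, n/12)$ density on a grid of points:

```python
import numpy as np
from math import comb, factorial
from scipy.stats import norm

def irwin_hall_pdf(x, n):  # same formula as in the first sketch
    return sum((-1) ** k * comb(n, k) * (x - k) ** (n - 1) * np.sign(x - k)
               for k in range(n + 1)) / (2 * factorial(n - 1))

for n in (2, 3, 5, 10):
    xs = np.linspace(0.05, n - 0.05, 201)
    exact = np.array([irwin_hall_pdf(x, n) for x in xs])
    approx = norm.pdf(xs, loc=n / 2, scale=np.sqrt(n / 12))  # N(n/2, n/12) density
    print(f"n = {n:2d}   max |f_n - normal| = {np.max(np.abs(exact - approx)):.4f}")
```

The maximum discrepancy should shrink as $n$ grows, which is what the four plots illustrate.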