Let $Z = X+Y$. Then, if $X$ and $Y$ have values in $[0,1]$, it follows that $0 \leq Z \leq 2$.
Thus, the cumulative distribution function
$F_Z(z) = P\{Z \leq z\}$ enjoys the property that $F_Z(z) = 0$ for $z < 0$ and $F_Z(z) = 1$
for $z \geq 2$. More generally, for any fixed value of $z$,
$$F_Z(z) = P\{Z \leq z\} = P\{X+Y \leq z\}
= \int_{-\infty}^{\infty}\left[ \int_{-\infty}^{z-x}
f_{X,Y}(x,y)\,\mathrm dy\right]\,\mathrm dx$$
and so, using the rule for differentiating under the integral sign
(see the comments following this answer
if you have forgotten this)
$$\begin{align*}
f_Z(z) &= \frac{\partial}{\partial z}F_Z(z)\\
&= \frac{\partial}{\partial z}\int_{-\infty}^{\infty}\left[ \int_{-\infty}^{z-x}
f_{X,Y}(x,y)\,\mathrm dy\right] \,\mathrm dx\\
&= \int_{-\infty}^{\infty}\frac{\partial}{\partial z}\left[ \int_{-\infty}^{z-x}
f_{X,Y}(x,y)\,\mathrm dy\right]\,\mathrm dx\\
&= \int_{-\infty}^{\infty}
f_{X,Y}(x,z-x)\,\mathrm dx
\end{align*}$$
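If a numerical sanity check helps, here is a short Python/SciPy sketch of both formulas. This is my own illustration, not part of the derivation; the bivariate normal with correlation $1/2$ is an arbitrary example joint density, chosen because then $Z = X+Y \sim N(0,3)$ is known exactly.

```python
import numpy as np
from scipy.integrate import quad, dblquad
from scipy.stats import multivariate_normal, norm

# Example joint density (my choice, purely for illustration): bivariate normal
# with unit variances and correlation 1/2, so that Z = X + Y ~ N(0, 3) exactly.
rho = 0.5
f_XY = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]]).pdf

def F_Z(z):
    # F_Z(z) = ∫ [ ∫_{-∞}^{z-x} f_{X,Y}(x, y) dy ] dx; ±8 stands in for ±∞,
    # which captures essentially all of the probability mass here
    return dblquad(lambda y, x: f_XY([x, y]),
                   -8.0, 8.0,                # x over (effectively) the whole line
                   lambda x: -8.0,           # y from (effectively) -∞ ...
                   lambda x: z - x)[0]       # ... up to z - x

def f_Z(z):
    # f_Z(z) = ∫ f_{X,Y}(x, z - x) dx, from differentiating under the integral sign
    return quad(lambda x: f_XY([x, z - x]), -np.inf, np.inf)[0]

for z in (0.0, 1.0, 2.5):
    print(F_Z(z), norm.cdf(z, scale=np.sqrt(3)),   # these two should agree
          f_Z(z), norm.pdf(z, scale=np.sqrt(3)))   # and so should these two
```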
When $X$ and $Y$ are independent random variables, the joint density
is the product of the marginal densities and we get the convolution
formula
$$f_{X+Y}(z) = \int_{-\infty}^{\infty}
f_{X}(x)f_Y(z-x)\,\mathrm dx ~~ \text{for independent random variables}
~X~\text{and}~Y.$$
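Again purely as a sanity check (standard normals are just a convenient choice here, since then $X+Y \sim N(0,2)$ in closed form), the convolution integral can be evaluated numerically:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def f_sum(z):
    # f_{X+Y}(z) = ∫ f_X(x) f_Y(z - x) dx for independent X and Y
    return quad(lambda x: norm.pdf(x) * norm.pdf(z - x), -np.inf, np.inf)[0]

for z in (-1.0, 0.0, 2.5):
    print(z, f_sum(z), norm.pdf(z, scale=np.sqrt(2)))   # the two columns should agree
```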
When $X$ and $Y$ take on values in $[0,1]$, we have that $f_X(x) = 0$
for $x<0$ and $x>1$, and so
$$f_{X+Y}(z) = \int_{0}^{1} f_{X}(x)f_Y(z-x)\,\mathrm dx.$$
Furthermore, for fixed $z$ with $0 \leq z \leq 1$, as $x$ sweeps from $0$ to
$1$, $f_Y(z-x)$ vanishes as soon as $x$ exceeds $z$ (since then $z-x < 0$), and so
$$f_{X+Y}(z) = \int_{0}^{z} f_{X}(x)f_Y(z-x)\,\mathrm dx, ~~ 0 \leq z \leq 1.$$
Similarly, if $z \in [1,2]$, then $f_Y(z-x) = 0$ whenever $x < z-1$ (since then $z-x > 1$),
and so
$$f_{X+Y}(z) = \int_{z-1}^{1} f_{X}(x)f_Y(z-x)\,\mathrm dx, ~~ 1 \leq z \leq 2.$$
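To see these limits of integration at work, here is one more small sketch of my own (the Beta$(2,2)$ density is an arbitrary choice of a density supported on $[0,1]$; any other would do): the restricted integrals above agree with the full convolution over $[0,1]$.

```python
from scipy.integrate import quad
from scipy.stats import beta

f = beta(2, 2).pdf   # an example density supported on [0, 1]

def f_sum_restricted(z):
    # use [0, z] for 0 <= z <= 1 and [z-1, 1] for 1 <= z <= 2, as derived above
    lo, hi = (0.0, z) if z <= 1.0 else (z - 1.0, 1.0)
    return quad(lambda x: f(x) * f(z - x), lo, hi)[0]

def f_sum_full(z):
    # unrestricted convolution over [0, 1]; f(z - x) is simply 0 outside the support
    return quad(lambda x: f(x) * f(z - x), 0.0, 1.0)[0]

for z in (0.3, 0.9, 1.4, 1.8):
    print(z, f_sum_restricted(z), f_sum_full(z))   # the two columns should agree
```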
Finally, if $X$ and $Y$ are uniformly distributed on $[0,1]$, the
integrands above have value $1$ and we get
$$f_{X+Y}(z) = \begin{cases}
z, & 0\leq z \leq 1,\\
2-z, & 1 \leq z \leq 2,\\
0, &\text{otherwise.}\end{cases}$$
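Finally, if you want to convince yourself of the triangular shape numerically, here is a quick Monte Carlo comparison (my own addition, using NumPy) of a histogram of simulated sums against the density above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
samples = rng.uniform(size=n) + rng.uniform(size=n)    # draws of Z = X + Y

hist, edges = np.histogram(samples, bins=40, range=(0.0, 2.0), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
triangular = np.where(mids <= 1.0, mids, 2.0 - mids)   # f_{X+Y}(z) from above

print(np.max(np.abs(hist - triangular)))   # small, and shrinks as n grows
```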