Let me apologize in advance if I say things you already know, but it's better to make everything clear.
The notion of differentiability in higher dimensions is a little tricky. Indeed, as you know, the fact that $f$ has all its partial derivatives is not enough to guarantee differentiability. Even if $f$ has directional derivatives along every possible direction, it might still fail to be differentiable (a bit pathological, but it can happen).
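For the record, here is the standard example of such a pathology (not from your question, just to illustrate): set $f(0,0)=0$ and
$f(x,y) = \dfrac{x^2y}{x^4+y^2} \quad \text{for } (x,y)\neq(0,0).$
At the origin every directional derivative exists (along $\underline{v}=(a,b)$ it equals $a^2/b$ if $b\neq0$, and $0$ if $b=0$), yet $f$ is not even continuous there: along the parabola $y=x^2$ one gets $f\equiv1/2$.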
So, in order to check that a function is differentiable, you need to use the definition: $f$ is said to be differentiable at $\underline{x}$ if there is a linear function(al) $L$ (the differential) such that
$\lim_{|\underline{h}|\to0} \frac{f(\underline{x}+\underline{h})-f(\underline{x}) - L(\underline{h})}{|\underline{h}|}=0.\tag{1}$
Note that the increment $\underline{h}$ is generic: every component might go to zero at the same rate (which gives a directional derivative), or some might go to zero faster than the others. Note also that, since the only linear functionals on $\mathbb{R}^n$ are scalar products against a fixed vector, if the function turns out to be differentiable, then you can give a face to $L$: it is just $L(\underline{h}) = \langle\nabla f,\underline{h}\rangle$, where $\langle\cdot,\cdot\rangle$ is the scalar product in $\mathbb{R}^n$ and $\nabla f$ is the gradient of $f$ (the vector of partial derivatives). If you take functional analysis one day, you will learn that the gradient of $f$ can be seen as the Riesz element that represents the differential of $f$ in $\mathbb{R}^n$. But for now, forget about this last comment.
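Just to see the definition at work in the simplest possible case (a toy example of mine, not from your question): take $f(\underline{x}) = \langle\underline{a},\underline{x}\rangle$ for a fixed vector $\underline{a}$. Then
$f(\underline{x}+\underline{h})-f(\underline{x})-\langle\underline{a},\underline{h}\rangle = 0 \quad \text{for every } \underline{h},$
so the quotient in (1) is identically zero and $L(\underline{h}) = \langle\underline{a},\underline{h}\rangle$ works: a linear function is its own differential.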
Back to the problem. Let's compute the differential of $f(\underline{x}) = \sqrt{x_1^2+\cdots+x_n^2}=|\underline{x}|$.
By computing the partial derivatives (easy exercise), our guess for $L$ is
$L(\underline{h}) = \sum_i \frac{x_i}{|\underline{x}|}h_i$
which is nothing but the sum of the components of the gradient multiplied by the components of $\underline{h}$ (note that we can't write $L(\underline{h}) = \langle\nabla f,\underline{h}\rangle$ yet. That's just a guess! We need to prove it.). Clearly, this function(al) is linear in $\underline{h}$. Now let's check that it satisfies equation (1). Notation: every time an index appears twice, once on top and once on the bottom, it means "sum over that index" ($x^iy_i=\sum_i x_iy_i$). We'll use the classic "multiply and divide" trick. Careful, it will look a bit messy.
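Before diving in, in case the "easy exercise" above isn't obvious: by the chain rule, for each fixed $j$,
$\frac{\partial f}{\partial x_j} = \frac{\partial}{\partial x_j}\sqrt{x^ix_i} = \frac{2x_j}{2\sqrt{x^ix_i}} = \frac{x_j}{|\underline{x}|}.$
Now, here we go: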
$\lim_{|\underline{h}|\to0} \frac{f(\underline{x}+\underline{h})-f(\underline{x})-L(\underline{h})}{|\underline{h}|} = \lim_{|\underline{h}|\to0} \frac{\sqrt{(x^i+h^i)(x_i+h_i)}-\sqrt{x^ix_i}-\dfrac{x^i}{|\underline{x}|}h_i}{|\underline{h}|}=$
$=\lim_{|\underline{h}|\to0} \frac{\sqrt{(x^i+h^i)(x_i+h_i)}-\sqrt{x^ix_i}-\dfrac{x^i}{|\underline{x}|}h_i}{|\underline{h}|}\frac{\sqrt{(x^i+h^i)(x_i+h_i)}+\sqrt{x^ix_i}}{\sqrt{(x^i+h^i)(x_i+h_i)}+\sqrt{x^ix_i}}=$
$=\lim_{|\underline{h}|\to0} \frac{(x^i+h^i)(x_i+h_i)-x^ix_i-\dfrac{x^i}{|\underline{x}|}h_i\left(\sqrt{(x^i+h^i)(x_i+h_i)}+\sqrt{x^ix_i}\right)}{|\underline{h}|\left(\sqrt{(x^i+h^i)(x_i+h_i)}+\sqrt{x^ix_i}\right)}=$
$=\lim_{|\underline{h}|\to0} \frac{2x^ih_i+h^ih_i-\dfrac{x^i}{|\underline{x}|}h_i\left(\sqrt{(x^i+h^i)(x_i+h_i)}+\sqrt{x^ix_i}\right)}{|\underline{h}|\left(\sqrt{(x^i+h^i)(x_i+h_i)}+\sqrt{x^ix_i}\right)}$
The term $h^ih_i/|\underline{h}|$ goes to zero ($h^ih_i=|\underline{h}|^2$), so forget about it. Rearranging the rest (and writing $|\underline{x}|$ instead of $\sqrt{x^ix_i}$) we get
$\ldots = \lim_{|\underline{h}|\to0} \frac{x^ih_i}{|\underline{h}||\underline{x}|} \frac{2|\underline{x}|-\left(|\underline{x}+\underline{h}|+|\underline{x}|\right)}{|\underline{x}+\underline{h}|+|\underline{x}|}$
Now, the second fraction approaches $0$ as $|\underline{h}|\to0$, for every $\underline{x}\neq\underline{0}$ (at $\underline{x}=\underline{0}$ the norm is in fact not differentiable, just like $x\mapsto|x|$ in one dimension). The first fraction is just
$\dfrac{\langle\underline{x},\underline{h}\rangle}{|\underline{h}||\underline{x}|}$
and thus it is bounded in $[-1,1]$ by the Cauchy-Schwarz inequality. Hence, by the squeeze theorem, the limit is $0$. Q.E.D.
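If you want to convince yourself numerically before trusting the algebra, here is a quick sanity check (a minimal sketch in Python with numpy; the point $\underline{x}$, the dimension, and all the names are mine, and any nonzero $\underline{x}$ would do):

```python
import numpy as np

# Check the quotient in (1) for f(x) = |x|, with the candidate L(h) = <x/|x|, h>.
rng = np.random.default_rng(0)
x = rng.normal(size=5)   # a generic point x != 0 in R^5
f = np.linalg.norm       # f(x) = |x|

for eps in [1e-1, 1e-2, 1e-3, 1e-4]:
    h = eps * rng.normal(size=5)            # a generic small increment
    L_h = np.dot(x / f(x), h)               # our guess for the differential
    ratio = (f(x + h) - f(x) - L_h) / f(h)  # the quotient in (1)
    print(f"|h| = {f(h):.1e}   quotient = {ratio:.2e}")
```

The quotient shrinks roughly like $|\underline{h}|$, consistent with the limit being $0$.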
For the second function, you can do it yourself by multiplying and dividing by something else.
Hope this helps.