Srivatsan has given a very fine answer, with a simple, elegant analysis. With a little more work, we can sharpen the result.
Claim: For $n \geq 3$, $\mathbb E X_N \sim \sqrt{\frac{\pi}{4} N}$ .
We can analyze this by means of the central limit theorem and the continuous mapping theorem. Below is just a sketch. We restrict ourselves to the case $n \geq 3$ since the case $n = 2$ corresponds to the standard simple random walk, which has slightly different behavior (cf. Henning's comment). Intuitively, since for $n \geq 3$ the random walk can wiggle in an extra dimension, we should anticipate that its expected norm grows at the same $\sqrt{N}$ rate but with a slightly larger constant.
Proof (sketch):
Let $Z_i = (R_i,I_i)$, $i=1,2,\ldots$, be an iid sample drawn uniformly from the $n$th roots of unity, where $R_i$ denotes the "real" component and $I_i$ the "imaginary" component of the $i$th element of the sample. Then, it is a simple exercise to verify that $\mathbb E R_i = \mathbb E I_i = 0$, and, also, $\mathbb E R_i I_i = 0$. Furthermore, $ \mathrm{Var}(R_i) = \mathrm{Var}(I_i) = 1/2 \>, $ independently of $n$, using simple properties of the roots of unity.
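These moment claims are easy to check numerically. The following sketch (the helper name `root_moments` is my own, not from the argument above) averages exactly over the $n$ equally likely roots:

```python
import cmath
import math

def root_moments(n):
    # Exact averages over the n equally likely n-th roots of unity:
    # returns (E[R], E[I], E[R*I], E[R^2], E[I^2]).
    # Since the means vanish, E[R^2] and E[I^2] are the variances.
    roots = [cmath.exp(2j * math.pi * k / n) for k in range(n)]
    R = [z.real for z in roots]
    I = [z.imag for z in roots]
    return (
        sum(R) / n,
        sum(I) / n,
        sum(r * i for r, i in zip(R, I)) / n,
        sum(r * r for r in R) / n,
        sum(i * i for i in I) / n,
    )

for n in (3, 4, 5, 12):
    mR, mI, mRI, vR, vI = root_moments(n)
    # Means and cross-moment vanish; both variances equal 1/2 for n >= 3.
    print(n, round(mR, 10), round(mI, 10), round(mRI, 10), vR, vI)
```

Note that for $n = 2$ the roots are $\pm 1$, so $\mathrm{Var}(I_i) = 0$ and the computation above fails, which is exactly why that case is excluded.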
Hence, by the multivariate central limit theorem, we have that $ \sqrt{2N} (\bar{R}_N, \bar{I}_N) \xrightarrow{d} \,\mathcal \,N(0,\mathbf I_2 ) \> , $ where $\bar{R}_N = N^{-1} \sum_{i=1}^N R_i$ and likewise for $\bar{I}_N$. Here $\mathbf I_2$ denotes the $2 \times 2$ identity matrix.
An application of the continuous mapping theorem using $g(x,y) = x^2 + y^2$ yields $ 2 N (\bar{R}_N^2 + \bar{I}_N^2) = \frac{2}{N} X_N^2 = g( \sqrt{2N} \bar{R}_N, \sqrt{2N} \bar{I}_N ) \,\xrightarrow{d}\, \chi_2^2 \> . $
That is, the rescaled squared norm has a limit distribution which is chi-squared with two degrees of freedom.
The square root of a $\chi_2^2$-distributed random variable has a Rayleigh distribution, which has mean $\sqrt{\pi/2}$.
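For completeness, this mean can be computed directly: if $W \sim \chi_2^2$, then $Y = \sqrt{W}$ has density $y e^{-y^2/2}$ on $(0,\infty)$, and so
$$
\mathbb E Y = \int_0^\infty y^2 e^{-y^2/2} \,\mathrm{d}y = \frac{1}{2} \int_{-\infty}^\infty y^2 e^{-y^2/2} \,\mathrm{d}y = \frac{\sqrt{2\pi}}{2} = \sqrt{\frac{\pi}{2}} \> ,
$$
where the middle integral is $\sqrt{2\pi}$ times the variance of a standard normal.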
Hence, by a second application of the continuous mapping theorem, $\sqrt{\frac{2}{N}} X_N$ converges in distribution to a standard Rayleigh.
This strongly suggests (but does not prove) that $\mathbb E X_N \sim \sqrt{\frac{\pi}{4} N}$.
To finish the proof, note that $\mathbb E \frac{2}{N} X_N^2 = 2$ for all $N$. By Skorokhod's representation theorem, there exists a sequence of random variables $\{Y_N\}$ such that $Y_N \stackrel{d}= \sqrt{\frac{2}{N}} X_N$ and $Y_N$ converges to $Y_\infty$ almost surely, where $Y_\infty$ is a standard Rayleigh. The uniform bound on the second moments shows that $\{Y_N\}$ is uniformly integrable and hence $L_1$ convergent. So, $ |\mathbb E Y_N - \mathbb E Y_\infty| \leq \mathbb E |Y_N - Y_\infty| \to 0 \> . $
Hence $\mathbb E Y_N = \sqrt{\frac{2}{N}}\, \mathbb E X_N \to \sqrt{\pi/2}$, that is, $\mathbb E X_N \sim \sqrt{\frac{\pi}{4} N}$, as desired.
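As a sanity check on the asymptotics, here is a rough Monte Carlo sketch (the helper `mean_walk_norm`, the choice $n = 5$, and the trial counts are all illustrative assumptions of mine, not part of the argument):

```python
import cmath
import math
import random

def mean_walk_norm(n, N, trials=5000, seed=1):
    # Monte Carlo estimate of E X_N = E |Z_1 + ... + Z_N|
    # for iid Z_i uniform on the n-th roots of unity.
    rng = random.Random(seed)
    roots = [cmath.exp(2j * math.pi * k / n) for k in range(n)]
    total = 0.0
    for _ in range(trials):
        s = sum(rng.choice(roots) for _ in range(N))
        total += abs(s)
    return total / trials

N = 400
est = mean_walk_norm(5, N)
pred = math.sqrt(math.pi * N / 4)  # claimed asymptotic mean, sqrt(100*pi) ~ 17.72
print(est, pred)
```

With these parameters the estimate should agree with the predicted value to within about a percent, though of course a simulation only corroborates, and does not replace, the proof above.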