You need to know, or be able to work out, three facts: the expectation of a sum is the sum of the expectations; the variance of a sum of independent random variables is the sum of the variances; and the variance (or second central moment) is the second moment minus the square of the first moment.
We can consider each $X_i$ to be distributed as $\sum_{j=1}^k Y_{i,j}$, where the $Y_{i,j}$ are independent Bernoulli random variables taking the value $1$ with probability $p$ and $0$ with probability $1-p$, so $E[Y_{i,j}]=p$. Since $Y_{i,j}^2 = Y_{i,j}$ for a $0/1$-valued variable, we also have $E[Y_{i,j}^2]=p$, and hence $E\left[\left(Y_{i,j}-E[Y_{i,j}]\right)^2\right]=E[Y_{i,j}^2]-E[Y_{i,j}]^2 = p - p^2 = p(1-p)$.
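If you want to convince yourself of these Bernoulli moments numerically, here is a minimal NumPy sketch (the value $p = 0.3$ and the sample size are illustrative choices, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                                  # illustrative value, not from the question
y = rng.binomial(1, p, size=1_000_000)   # Bernoulli(p) draws

print(y.mean(), p)                       # E[Y] = p
print(np.mean(y**2), p)                  # E[Y^2] = p, since Y^2 = Y for 0/1 values
print(y.var(), p * (1 - p))              # Var(Y) = E[Y^2] - E[Y]^2 = p(1 - p)
```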
So $E[X_{i}]=\sum_{j=1}^k E[Y_{i,j}]=kp$ and, by independence, $E\left[\left(X_{i}-E[X_{i}]\right)^2\right]=\sum_{j=1}^k E\left[\left(Y_{i,j}-E[Y_{i,j}]\right)^2\right]=kp(1-p)$. But then $E[X_{i}^2]=E\left[\left(X_{i}-E[X_{i}]\right)^2\right] + E[X_{i}]^2 = kp(1-p) + k^2 p^2$, which is really the result you want.
If you need it spelt out: $E\left[ \frac{1}{n} \sum_{i=1}^n X_i^2 \right] = \frac{1}{n} \sum_{i=1}^n E\left[X_i^2\right] = \frac{1}{n} \sum_{i=1}^n \left(kp(1-p) + k^2 p^2\right) = kp(1-p) + k^2 p^2$.
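For instance, with the illustrative values $k = 10$ and $p = 0.3$, the formula gives $10(0.3)(0.7) + 10^2(0.3)^2 = 2.1 + 9 = 11.1$, and a quick simulation along the same lines agrees (again, $k$, $p$, and $n$ here are just assumed example values):

```python
import numpy as np

rng = np.random.default_rng(1)
k, p, n = 10, 0.3, 1_000_000            # illustrative values, not from the question
x = rng.binomial(k, p, size=n)          # n independent Binomial(k, p) draws

print(np.mean(x**2))                    # (1/n) * sum of X_i^2, approximately 11.1
print(k * p * (1 - p) + (k * p) ** 2)   # kp(1-p) + k^2 p^2 = 11.1
```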