I wrote the following answer to another question quite recently (link) and it seems relevant here too:
> $\newcommand{\var}{\operatorname{var}}$ Variances are additive: for independent random variables $X_1,\ldots,X_n$, $ \var(X_1+\cdots+X_n)=\var(X_1)+\cdots+\var(X_n). $
>
> Notice what this makes possible: Say I toss a fair coin 900 times. What's the probability that the number of heads I get is between 440 and 455 inclusive? Just find the expected number of heads ($450$) and the variance of the number of heads ($225=15^2$), then find the probability that a normal (or Gaussian) random variable with expectation $450$ and standard deviation $15$ lands between $439.5$ and $455.5$ (the half-integer endpoints are the usual continuity correction). Abraham de Moivre did this with coin tosses in the 18th century, thereby first showing that the bell-shaped curve is worth something.

(end of quote)
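The arithmetic in the quoted example is easy to check numerically. Here's a minimal sketch in Python (standard library only) comparing the normal approximation with continuity correction to the exact binomial probability:

```python
from math import comb, erf, sqrt

# 900 fair coin tosses: mean = 450, variance = 900 * 0.5 * 0.5 = 225, sd = 15
n, p = 900, 0.5
mu = n * p                  # 450
sd = sqrt(n * p * (1 - p))  # 15

def phi(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

# Normal approximation with continuity correction: P(439.5 < X < 455.5)
approx = phi((455.5 - mu) / sd) - phi((439.5 - mu) / sd)

# Exact binomial probability: P(440 <= heads <= 455)
exact = sum(comb(n, k) for k in range(440, 456)) / 2**n

print(f"normal approximation: {approx:.4f}")
print(f"exact binomial:       {exact:.4f}")
```

The two numbers agree to about two decimal places, which is de Moivre's point: a two-parameter bell curve summarizes a 900-term binomial sum.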
There's more to the use of squares in statistics than that; there's also the whole topic of analysis of variance, where one decomposes sums of squares. Michael Chernick's answer addresses another aspect.
As for cubes: sums of cubes of deviations from the mean do come up, in the skewness of distributions, but they don't work as a measure of dispersion (a cube keeps the sign of the deviation, so the terms can cancel even when the distribution is widely dispersed). Here's a fact that seems not to be widely known, and it's a simple exercise to prove: for independent random variables, averages of cubes of deviations from the mean (the third central moments) are also additive. (This doesn't work for $4$th or higher powers.)
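That additivity claim can be verified by exact enumeration rather than proof. A small sketch with two arbitrary (hypothetical) discrete distributions, chosen asymmetric so the third moments are nonzero:

```python
from itertools import product

def central_moment(dist, k):
    """k-th central moment of a discrete distribution {value: probability}."""
    mean = sum(v * p for v, p in dist.items())
    return sum(p * (v - mean) ** k for v, p in dist.items())

def sum_dist(d1, d2):
    """Distribution of X + Y for independent X ~ d1, Y ~ d2 (convolution)."""
    out = {}
    for (x, px), (y, py) in product(d1.items(), d2.items()):
        out[x + y] = out.get(x + y, 0) + px * py
    return out

# two made-up independent distributions (values: probabilities)
X = {0: 0.5, 1: 0.3, 4: 0.2}
Y = {1: 0.6, 2: 0.1, 5: 0.3}
S = sum_dist(X, Y)

# third central moments ARE additive (equal up to float rounding)...
m3_sum = central_moment(S, 3)
m3_parts = central_moment(X, 3) + central_moment(Y, 3)
print(m3_sum, m3_parts)

# ...but fourth central moments are NOT: the cross term
# 6*var(X)*var(Y) survives in the expansion of E[(X+Y-mean)^4]
m4_sum = central_moment(S, 4)
m4_parts = central_moment(X, 4) + central_moment(Y, 4)
print(m4_sum, m4_parts)
```

The failure at the fourth power is exactly the $6\var(X)\var(Y)$ cross term in the binomial expansion, which is strictly positive whenever both variables are non-degenerate.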