Steps for calculating the standard deviation, from http://www.techbookreport.com/tutorials/stddev-30-secs.html:
1. Work out the average (mean value) of your set of numbers
2. Work out the difference between each number and the mean
3. Square the differences
4. Add up the squares of all the differences
5. Divide this by one less than the number of numbers in your set - this is called the variance
6. Take the square root of the variance and you've got the standard deviation
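For reference, here is how I understand the steps in code (a sketch in Python; the function name is my own):

```python
import math

def sample_std_dev(numbers):
    """Sample standard deviation, following the steps listed above."""
    n = len(numbers)
    mean = sum(numbers) / n                      # step 1: the mean
    diffs = [x - mean for x in numbers]          # step 2: differences from the mean
    squared = [d * d for d in diffs]             # step 3: square the differences
    total = sum(squared)                         # step 4: add up the squares
    variance = total / (n - 1)                   # step 5: divide by n - 1
    return math.sqrt(variance)                   # step 6: square root

print(sample_std_dev([2, 4, 4, 4, 5, 5, 7, 9]))
```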
Am I missing something, or why do we need to square the differences in step 3?
Why not simply take the absolute value (multiply all negative numbers by -1) instead?
Also, my second question: why do we divide by one less than the number of numbers in step 5? Why not simply divide by the number of numbers?
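Again to make the question concrete, here are the two divisors side by side on the same made-up data; dividing by n - 1 always gives a slightly larger variance:

```python
data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = sum(data) / len(data)
ss = sum((x - mean) ** 2 for x in data)  # sum of squared differences

var_n = ss / len(data)          # dividing by the number of numbers
var_n1 = ss / (len(data) - 1)   # dividing by one less, as in step 5

print(var_n, var_n1)  # 4.0 versus roughly 4.571
```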