If you divide all the values by the standard deviation, then you will have a distribution with a standard deviation equal to $1$ (and so a variance equal to $1^2 = 1$). The difference from a fully standardized distribution is that the mean will not be $0$, unless it was $0$ to begin with.
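To see this numerically, here is a small sketch using NumPy; the sample, and its assumed mean of about $5$ and standard deviation of about $2$, are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)  # hypothetical sample: mean ~5, sd ~2

scaled = x / x.std()                     # divide by the standard deviation only
print(scaled.std())                      # ~1.0: the standard deviation is now 1
print(scaled.mean())                     # ~2.5: the mean is mu/sigma, not 0

standardized = (x - x.mean()) / x.std()  # subtract the mean first to get mean 0 as well
print(standardized.mean(), standardized.std())   # ~0.0 and ~1.0
```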
You seem to be confusing the variance with (half) the range. The range is the difference between the maximum possible value and the minimum possible value. The variance is the expected value of the squared difference between a value and the mean of the distribution, and the standard deviation is the square root of the variance.
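For a concrete (made-up) data set, the three quantities come out quite differently:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0, 3.0, 10.0])   # hypothetical data

data_range = x.max() - x.min()              # range: maximum minus minimum
variance = np.mean((x - x.mean()) ** 2)     # variance: mean squared deviation from the mean
std_dev = np.sqrt(variance)                 # standard deviation: square root of the variance

print(data_range)  # 9.0
print(variance)    # 10.64
print(std_dev)     # about 3.26, so the range (9) exceeds twice the standard deviation (about 6.52)
```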
Except in one special case, the range will be more than twice the standard deviation; the special case is a distribution that takes only two values, each with probability $\frac{1}{2}$. So if you have a standardized distribution with mean $0$ and standard deviation $1$, then you will usually find some values outside the interval $[-1,1]$: for a standard normal distribution, the probability that a value lies outside that interval is just over $31\%$ (about $31.7\%$).
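That figure is just $2\,(1 - \Phi(1))$, where $\Phi$ is the standard normal distribution function, and you can check it with SciPy:

```python
from scipy.stats import norm

# probability that a standard normal value lies outside [-1, 1]
p_outside = 2 * (1 - norm.cdf(1.0))  # equivalently 2 * norm.sf(1.0)
print(p_outside)                     # about 0.3173, i.e. just over 31%
```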