Consider a Uniform(0, 1) random variable, and let's shift the density down (from the original 1) on the interval $[0, u]$ for some fixed point $u \in (0, 1)$, and, to balance it out, shift the density up (from the original 1) on $[u, 1]$; call this new random variable, whose density is a step function, $D$.** My intuition is unclear on whether the variance increases. How can I calculate the second moment of this new distribution? Do I simply integrate $x^2 \cdot \text{density}_1$ over $[0, u]$, and add to that the integral of $x^2 \cdot \text{density}_2$ over $[u, 1]$?
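Here is a small sketch of that piecewise computation, using values I made up for illustration (breakpoint $u = 0.4$ and lowered density $d_1 = 0.5$; the raised density $d_2$ is then forced by the total mass being 1):

```python
# Second moment of a piecewise-constant density on [0, 1].
# Assumed (not from the question): u = 0.4, d1 = 0.5; d2 follows from mass 1.
u = 0.4
d1 = 0.5                          # density on [0, u], shifted down from 1
d2 = (1 - d1 * u) / (1 - u)       # density on [u, 1] so total mass is 1

# First and second moments: integrate x*f(x) and x^2*f(x) piece by piece.
m1 = d1 * u**2 / 2 + d2 * (1 - u**2) / 2
m2 = d1 * u**3 / 3 + d2 * (1 - u**3) / 3
var = m2 - m1**2

# Cross-check the second moment with a midpoint Riemann sum.
n = 100_000
m2_num = sum(((i + 0.5) / n) ** 2 * (d1 if (i + 0.5) / n < u else d2)
             for i in range(n)) / n
print(m1, m2, var)
```

Note that in this example the mean comes out to 0.6, not 0.5, since moving mass to the right also moves the mean; the variance here works out smaller than the uniform's $1/12$, though I don't know if that holds in general.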
Furthermore, what if I'm interested in the overall standard deviation of $D$ PLUS another, independent, random variable $Q$? Do I just take the standard deviation of $D$ (easily obtained as sqrt(secondMoment − firstMoment^2), noting that the mean of $D$ is no longer necessarily 0.5) and add to it the standard deviation of $Q$? That is,
SD(D+Q) ?= SD(D) + SD(Q)
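A quick simulation one could run to test this conjecture (again with my made-up $D$: density 0.5 on $[0, 0.4]$ and $4/3$ on $[0.4, 1]$, and $Q$ taken as an independent Uniform(0, 1)):

```python
import random
import statistics

random.seed(0)

# Assumed example: D has density 0.5 on [0, 0.4] and 4/3 on [0.4, 1];
# Q is an independent Uniform(0, 1).
u, d1 = 0.4, 0.5
p_left = d1 * u                       # probability mass on [0, u]

def sample_d():
    # Mixture sampling: pick a piece by its mass, then draw uniformly in it.
    if random.random() < p_left:
        return random.uniform(0, u)
    return random.uniform(u, 1)

n = 200_000
d = [sample_d() for _ in range(n)]
q = [random.random() for _ in range(n)]
s = [a + b for a, b in zip(d, q)]

sd_d, sd_q, sd_s = (statistics.pstdev(x) for x in (d, q, s))
print(sd_s, sd_d + sd_q, (sd_d**2 + sd_q**2) ** 0.5)
```

In my run, SD(D+Q) comes out close to sqrt(SD(D)^2 + SD(Q)^2) rather than SD(D) + SD(Q), but I'd like to understand why.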
**Or, maybe we can get even more sophisticated (I'm not familiar with real analysis) and shift the density up on some set of disjoint points in $[0, 1]$ and down on the rest (say, alternating the density between the two values from point to point: 1/2, 3/2, 1/2, 3/2, ...).