I'm a software engineer. I asked a friend for help with a simple way to generate an approximately normal distribution; it's just for adding some entropy to an application.
The code he gave me is this:
(rand(-1000, 1000) + rand(-1000, 1000) + rand(-1000, 1000)) / 100
If I run this 1000 times, it produces results between -27 and 28, with a good-enough approximation of a normal distribution:
Mean: -0.671 Median: -1 Std Dev: 9.9
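For context, my test harness looks roughly like this (a minimal sketch in PHP, since rand(min, max) matches PHP's built-in; the mean, median, and standard deviation are computed from the 1000 samples):

    <?php
    // Generate 1000 samples using the expression my friend gave me.
    $samples = [];
    for ($i = 0; $i < 1000; $i++) {
        $samples[] = (rand(-1000, 1000) + rand(-1000, 1000) + rand(-1000, 1000)) / 100;
    }

    // Mean of the samples.
    $mean = array_sum($samples) / count($samples);

    // Median: middle element of the sorted samples.
    sort($samples);
    $median = $samples[(int) (count($samples) / 2)];

    // Population standard deviation of the samples.
    $variance = 0.0;
    foreach ($samples as $x) {
        $variance += ($x - $mean) ** 2;
    }
    $stdDev = sqrt($variance / count($samples));

    printf("Mean: %.3f Median: %s Std Dev: %.1f\n", $mean, $median, $stdDev);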
I'm not as good at math as I should be. Can anybody help me understand why this works?