Guys I am having trouble with the standard normal distribution.
http://www.regentsprep.org/Regents/math/algtrig/ATS2/NormalLesson.htm
We know the $x$ values run from $-\infty$ to $+\infty$, but what are the $y$ values? The normal distribution takes two parameters, $\mathcal{N}(\mu, \sigma^2)$, but what is the range of $y$?
$y>0$, obviously, and $y$ depends on the mean and variance you pick: $y=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$. But I have trouble understanding what it means.

If I take the S&P 500 and difference the series (SPX − SPX(−1)), the histogram of the returns will be approximately normal, and it will list the number of times I had a return of −1%, −0.5%, 0%, 0.5%, 1%, etc. throughout the history. So is the $y$ of the normal distribution the number of times that $x$ has occurred? Should I think of the normal distribution, in practical terms, as the number of times each point event has occurred?

I look at some normal distributions where $y$ ranges from 0 to 4, and others where $y$ ranges from 0 to 1, as a probability should. I know the area underneath the curve should sum to 1, but shouldn't the $y$ values always be less than 1?
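To show what I mean, here is a quick check I tried (a minimal sketch using NumPy and `scipy.stats.norm` on *simulated* returns, not actual SPX data; the 1% daily standard deviation is just an assumption for illustration):

```python
import numpy as np
from scipy.stats import norm

# Part 1: the height of the normal density depends on sigma.
x = np.linspace(-5, 5, 100_001)
dx = x[1] - x[0]
for sigma in (1.0, 0.1):
    y = norm.pdf(x, loc=0.0, scale=sigma)  # density height at each x, NOT a probability
    area = y.sum() * dx                    # Riemann-sum approximation of the integral
    # sigma=1.0 -> peak about 0.4; sigma=0.1 -> peak about 4; area about 1 both times
    print(f"sigma={sigma}: peak height = {y.max():.3f}, total area = {area:.3f}")

# Part 2: histogram counts vs. normalized density for simulated returns.
rng = np.random.default_rng(0)
returns = rng.normal(loc=0.0, scale=0.01, size=10_000)        # hypothetical 1% daily vol
counts, _ = np.histogram(returns, bins=50)                    # raw "number of times" per bin
density, _ = np.histogram(returns, bins=50, density=True)     # rescaled so total area = 1
print(f"max bin count = {counts.max()}, max density = {density.max():.1f}")
```

When I run something like this, the peak of the curve changes with $\sigma$ even though the area stays at 1, and the histogram's $y$-axis is completely different depending on whether it shows counts or a normalized density, which is exactly the part I find confusing.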
https://statistics.laerd.com/statistical-guides/standard-score.php
Thanks guys!