
I have a question:

Does $N(0, x)$ mean a normal distribution with mean $0$ and variance $x$? Or standard deviation $x$?

The notation seems ambiguous sometimes.

3 Answers


Some authors use $N(0,x)$ to denote the normal distribution with standard deviation $x$, others the one with variance $x$. Since this depends on the author, look for the first place where the paper or book you're reading uses the notation; it should be defined there.

  • Can you cite any authors who use SD rather than variance? (2012-04-24)
  • A lot of the time this is used in introductory statistics books; one example is Moore, McCabe, and Craig, _Introduction to the Practice of Statistics_, which I happen to be teaching from this semester. (2012-04-24)
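
For what it's worth, software splits the same way. SciPy's `scipy.stats.norm`, for example, takes its `scale` parameter as the standard deviation, not the variance. A minimal sketch (the numbers are arbitrary):

```python
# SciPy's convention: scale is the standard deviation, NOT the variance.
from scipy.stats import norm

sd = 2.0
dist = norm(loc=0.0, scale=sd)  # "N(0, 2)" in the SD convention

print(dist.std())  # 2.0 -- the scale we passed in
print(dist.var())  # 4.0 -- i.e. scale**2
```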

This question came up on stats.SE as well. Apparently, three different conventions are in use in the statistical literature, and having the second argument of $N(0,x)$ denote the standard deviation $\sigma$ is quite common (I am told Wolfram Alpha uses this convention, to answer a question from @MichaelHardy). More interestingly, a comment on the stats.SE question notes that the second argument can even denote the precision $1/\sigma^2$ in Bayesian contexts. This answer summarizes the discussion on stats.SE.
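
To make the three conventions concrete, here is a small helper (the name `to_sd` and its interface are mine, purely for illustration) that recovers $\sigma$ from the second argument of $N(0, x)$ under each reading:

```python
import math

def to_sd(x, convention):
    """Hypothetical helper: interpret x, the second argument of N(0, x),
    under one of the three conventions and return the standard deviation."""
    if convention == "sd":         # x is sigma itself
        return x
    if convention == "variance":   # x is sigma^2
        return math.sqrt(x)
    if convention == "precision":  # x is 1/sigma^2 (Bayesian usage)
        return 1.0 / math.sqrt(x)
    raise ValueError(f"unknown convention: {convention!r}")

# The same N(0, 4) denotes three different distributions:
print([to_sd(4.0, c) for c in ("sd", "variance", "precision")])
# [4.0, 2.0, 0.5]
```

So a paper writing $N(0, 4)$ could mean $\sigma = 4$, $\sigma = 2$, or $\sigma = 0.5$, depending on which convention it follows.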


You are right, it can be confusing, though only in the one-dimensional case.

In general, for the $n$-dimensional normal distribution you write $$X\sim\mathcal{N}\left(\mu,\Sigma\right)$$ where $\Sigma$ is the covariance matrix, defined by $$\Sigma_{ij}=\operatorname{Cov}\left(X_i,X_j\right),$$ which is simply $\operatorname{Var}(X_i)$ when $i=j$. So in the one-dimensional case the second argument is just the variance. I find this makes the convention easier to remember.
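
As a quick sketch of this in code (NumPy; the matrix entries are illustrative), the diagonal of $\Sigma$ holds the variances, and the empirical covariance of a large sample should be close to $\Sigma$:

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([0.0, 0.0])
# Diagonal entries are Var(X_1) = 4 and Var(X_2) = 9;
# the off-diagonal entry is Cov(X_1, X_2) = 1.
Sigma = np.array([[4.0, 1.0],
                  [1.0, 9.0]])

X = rng.multivariate_normal(mu, Sigma, size=100_000)
print(np.cov(X, rowvar=False))  # empirical covariance, close to Sigma
```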