I have a question:
Does $N(0, x)$ mean a normal distribution with mean $0$ and variance $x$? Or standard deviation $x$?
The notation seems ambiguous sometimes.
Some people use $N(0,x)$ to denote the normal distribution with standard deviation $x$, others with variance $x$. The convention depends on the author, so look for the first place where the paper or book you're reading uses this notation; it should be defined there.
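To make the difference concrete (my own illustration, not taken from any particular source), consider how $N(0,4)$ is read under the two conventions:

$$
N(0,4)\ \text{means}\quad
\begin{cases}
\sigma^2 = 4,\ \sigma = 2, & f(t)=\dfrac{1}{\sqrt{2\pi\cdot 4}}\,e^{-t^2/(2\cdot 4)} \quad \text{(variance convention)},\\[6pt]
\sigma = 4,\ \sigma^2 = 16, & f(t)=\dfrac{1}{\sqrt{2\pi\cdot 16}}\,e^{-t^2/(2\cdot 16)} \quad \text{(SD convention)}.
\end{cases}
$$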
This question came up on stats.SE as well. Apparently, there are three different conventions in use in the statistical literature, and having $x$ in $N(0,x)$ denote the standard deviation $\sigma$ is quite common (I am told Wolfram Alpha uses this convention, to answer a question from @MichaelHardy). What is more interesting is a comment on that question on stats.SE which says that $x$ can even mean the precision $1/\sigma^2$ in Bayesian contexts. This answer summarizes the discussion on stats.SE.
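The same ambiguity shows up in software. As a minimal sketch (assuming NumPy and SciPy; this is my own example, not from the discussion above): both libraries parameterize the univariate normal by the standard deviation via their `scale` argument, so passing a variance there silently gives the wrong spread.

```python
import numpy as np
from scipy import stats

sigma = 2.0          # intended standard deviation
var = sigma ** 2     # intended variance = 4.0

# NumPy's and SciPy's `scale` argument is the standard deviation, not the variance.
samples = np.random.default_rng(0).normal(loc=0.0, scale=sigma, size=100_000)
print(samples.std())                      # ~2.0

dist = stats.norm(loc=0.0, scale=sigma)   # this is N(0, 4) in the variance convention
print(dist.var())                         # 4.0

# A common slip: passing the variance as `scale` instead describes N(0, 16).
wrong = stats.norm(loc=0.0, scale=var)
print(wrong.var())                        # 16.0
```

The precision convention mentioned above appears in some Bayesian libraries as a separate parameter; check the library's documentation rather than assuming which of the three conventions it follows.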
You are right, it can be confusing, though only in the 1-dimensional case.
In general, for the $n$-dimensional normal distribution you have $X\sim\mathcal{N}\left(\mu,\Sigma\right)$, where $\Sigma$ is the covariance matrix defined by $\Sigma_{ij}=\operatorname{Cov}\left(X_i,X_j\right)$, which is simply $\operatorname{Var}(X_i)$ when $i=j$. So in the 1-dimensional case the second parameter is just the variance; I find this makes the convention easier to remember.
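Here is a small sketch of that point (assuming NumPy as the tool; the numbers are my own illustration): `multivariate_normal` takes the covariance matrix $\Sigma$, and in one dimension that matrix collapses to the single entry $\operatorname{Var}(X)=\sigma^2$, matching the variance convention.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 3.0

# 1-D: the "covariance matrix" is the 1x1 matrix [[sigma^2]],
# so it should agree with normal(scale=sigma).
mv = rng.multivariate_normal(mean=[0.0], cov=[[sigma ** 2]], size=100_000)[:, 0]
uv = rng.normal(loc=0.0, scale=sigma, size=100_000)
print(mv.var(), uv.var())        # both ~9.0

# 2-D: Sigma_ij = Cov(X_i, X_j); the diagonal entries are the variances.
Sigma = np.array([[4.0, 1.0],
                  [1.0, 9.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=200_000)
print(np.cov(X, rowvar=False))   # approximately Sigma
```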