I am working on sampling multivariate normally distributed numbers. I have a very fundamental question regarding the eigendecomposition of the $k \times k$ covariance matrix $\mathbf{\Sigma} = \mathbf{U}\mathbf{\Lambda}\mathbf{U}^{-1} = \mathbf{U}\mathbf{\Lambda}\mathbf{U}^{T}$, where $\mathbf{U} = [\mathbf{u}_1\ \mathbf{u}_2\ \dots\ \mathbf{u}_k]$ is an orthogonal matrix whose columns are the (orthonormal) eigenvectors and $\mathbf{\Lambda}$ is a diagonal matrix with the corresponding eigenvalues on the diagonal.
The eigendecomposition can be decomposed further as $\mathbf{\Sigma} = \mathbf{U}\mathbf{\Lambda}\mathbf{U}^{T} = (\mathbf{U}\mathbf{\Lambda}^{1/2})(\mathbf{U}\mathbf{\Lambda}^{1/2})^{T}$, since $\mathbf{\Lambda}^{1/2}$ is diagonal and therefore equal to its own transpose.
But there is also the Cholesky factorization: $\mathbf{\Sigma} = \mathbf{L}{\mathbf{L}}^{T}$.
For a real, symmetric, positive-definite matrix $\mathbf{\Sigma}$: what is the general relationship between $\mathbf{U}\mathbf{\Lambda}^{1/2}$ and $\mathbf{L}$?
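To make the two factorizations concrete, here is a minimal sketch (I assume NumPy; the particular $\mathbf{\Sigma}$ below is just an arbitrary positive-definite example) showing that both factors reproduce $\mathbf{\Sigma}$ even though they are not the same matrix:

```python
import numpy as np

# Arbitrary symmetric positive-definite example covariance (my assumption, for illustration only)
Sigma = np.array([[4.0, 1.2],
                  [1.2, 2.0]])

# Eigendecomposition: Sigma = U @ diag(lam) @ U.T
lam, U = np.linalg.eigh(Sigma)
A = U @ np.diag(np.sqrt(lam))       # A = U * Lambda^{1/2}

# Cholesky factorization: Sigma = L @ L.T with L lower triangular
L = np.linalg.cholesky(Sigma)

# Both factors reproduce Sigma ...
print(np.allclose(A @ A.T, Sigma))  # True
print(np.allclose(L @ L.T, Sigma))  # True
# ... but the factors themselves differ in general
print(np.allclose(A, L))            # False in general
```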
What I want to do is use $\mathbf{U}\mathbf{\Lambda}^{1/2}$ (or $\mathbf{L}$, though I suspect that might be wrong) to generate normally distributed numbers. Say I have an $n \times k$ matrix $\mathbf{X}$ whose $n$ rows are points $\mathbf{x}_i \in \mathbb{R}^k$ drawn from $N(\mathbf{0}, \mathbf{I})$ (using a pseudo-random multivariate generator).
For a random variable $X$ with a multivariate normal distribution with mean $\mathbf{\mu}$ and covariance $\mathbf{\Sigma}$, the following relationship holds: $X \sim N(\mathbf{\mu}, \mathbf{\Sigma}) \iff X \sim \mathbf{\mu} + \mathbf{U}\mathbf{\Lambda}^{1/2}\,N(\mathbf{0}, \mathbf{I}) \iff X \sim \mathbf{\mu} + \mathbf{U}\,N(\mathbf{0}, \mathbf{\Lambda})$.
Using this, I apply $\mathbf{U}\mathbf{\Lambda}^{1/2}$ as a scaling and rotation operator to $\mathbf{X}^T$ and add $\mathbf{\mu}$, so the resulting data are distributed as $N(\mathbf{\mu}, \mathbf{\Sigma})$ (as far as I understand).
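Concretely, the sampling step I have in mind is something like the following sketch (again assuming NumPy; `mu`, `n`, and the example $\mathbf{\Sigma}$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Same example covariance and factor as in the sketch above (assumed values)
Sigma = np.array([[4.0, 1.2],
                  [1.2, 2.0]])
lam, U = np.linalg.eigh(Sigma)
A = U @ np.diag(np.sqrt(lam))        # U * Lambda^{1/2}

n, k = 100_000, Sigma.shape[0]
mu = np.array([1.0, -2.0])           # example mean vector (my assumption)

# n x k matrix X whose rows are drawn from N(0, I)
X = rng.standard_normal((n, k))

# y_i = mu + U * Lambda^{1/2} * x_i for every row, done in one step:
# (A @ X.T).T is the same as X @ A.T
Y = X @ A.T + mu

# Sanity check: sample mean should be close to mu, sample covariance close to Sigma
print(Y.mean(axis=0))
print(np.cov(Y, rowvar=False))
```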
Can I use $\mathbf{L}$ here instead as the operator? And what does it mean if I do?
Thanks.