Suppose we are given a random vector $\mathbf x \in \mathbf{R}^n$ that has a multivariate normal distribution $\mathbf x \sim N(\boldsymbol\mu_0, \mathbf\Sigma_0)$, and we are also given a positive semidefinite matrix $\mathbf\Sigma_1 \in \mathbf{R}^{n \times n}$.
Does there always exist a matrix $\mathbf A$ such that $\mathbf y = \mathbf{Ax} \sim N(\mathbf A\boldsymbol\mu_0, \mathbf\Sigma_1)$? In other words, can we always find a linear transformation of $\mathbf x$ whose covariance matrix equals an arbitrary given positive semidefinite matrix?
We know that the covariance matrix of $\mathbf{Ax}$ is $\mathbf{A\Sigma_0 A^T}$, so the question boils down to solving the equation $\mathbf{A\Sigma_0 A^T} = \mathbf\Sigma_1$ for $\mathbf A$, which I don't know how to do.
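To make sure my reasoning about the covariance of $\mathbf{Ax}$ is right, here is a small numerical sanity check in NumPy (the specific matrices $\boldsymbol\mu_0$, $\mathbf\Sigma_0$, and $\mathbf A$ below are made up purely for illustration): the sample covariance of $\mathbf y = \mathbf{Ax}$ does come out close to $\mathbf{A\Sigma_0 A^T}$.

```python
import numpy as np

# Sanity check of the identity Cov(A x) = A Sigma0 A^T
# on an arbitrary small example (all matrices here are made up).
rng = np.random.default_rng(0)

mu0 = np.array([1.0, -2.0, 0.5])
L = np.array([[ 2.0, 0.0, 0.0],
              [ 0.5, 1.0, 0.0],
              [-1.0, 0.3, 1.5]])
Sigma0 = L @ L.T               # a positive definite covariance matrix
A = np.array([[1.0,  2.0, 0.0],
              [0.0, -1.0, 3.0]])

# Draw many samples of x ~ N(mu0, Sigma0) and transform them by A.
x = rng.multivariate_normal(mu0, Sigma0, size=200_000)
y = x @ A.T

# The sample covariance of y should approximate A Sigma0 A^T.
print(np.allclose(np.cov(y.T), A @ Sigma0 @ A.T, rtol=0.05, atol=0.1))
```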
How can I solve this equation? Also, please let me know if my notation or terminology is incorrect, or if there are conditions on $\mathbf\Sigma_0$ or $\mathbf\Sigma_1$ that are needed for a solution to exist.