We have a random variable $x$ with an unknown distribution. We want to find a transformation $y = f(x)$, where $f$ is a monotonically increasing function, such that $y$ has a Gaussian distribution. We randomly draw $N$ samples of $x$, say $x_1, \dots, x_N$. How can we define $f$ from the values $x_1, \dots, x_N$ so that, if $N$ is large enough, $y$ is (approximately) Gaussian?
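(For reference, if the true CDF $F$ of $x$ were known and continuous, I believe the natural candidate, via the probability integral transform, would be
$$
f(x) = \Phi^{-1}\big(F(x)\big),
$$
where $\Phi^{-1}$ is the quantile function of $\mathcal{N}(0,1)$: then $F(x) \sim \mathrm{Uniform}(0,1)$, so $y = f(x) \sim \mathcal{N}(0,1)$. The difficulty here is that $F$ must be estimated from $x_1, \dots, x_N$.)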
EDIT: I think the question was misunderstood, so let me clarify a little. We are given $N$ random samples from an unknown distribution, and we are asked to design a monotonically increasing function $f:\mathbb{R} \to \mathbb{R}$ so that $y = f(x)$ is Gaussian. One approach is to estimate the distribution of $x$ from the $N$ given samples, map those samples to $N$ samples from a normal distribution, for example $\mathcal{N}(0,1)$, and then, for any given $x$, obtain $f(x)$ by interpolating between the mapped values of the known samples. Is there a simple way to do this?
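To make the approach I have in mind concrete, here is a minimal sketch in Python (using NumPy and SciPy; the function name and the $(i - 0.5)/N$ plotting positions are my own choices, not part of the question): it estimates $F$ by the empirical CDF, maps the $i$-th order statistic to the Gaussian quantile $\Phi^{-1}\big((i - 0.5)/N\big)$, and interpolates linearly between these points to evaluate $f$ at new inputs.

```python
import numpy as np
from scipy.stats import norm

def fit_gaussianizing_transform(samples):
    """Build a monotone f from samples so that f(x) is approximately N(0,1).

    Estimates the CDF of x by the empirical CDF of the samples, maps the
    i-th order statistic to the Gaussian quantile Phi^{-1}((i - 0.5) / N),
    and linearly interpolates between the mapped points.
    """
    x_sorted = np.sort(np.asarray(samples, dtype=float))
    n = len(x_sorted)
    # Plotting positions (i - 0.5)/N avoid infinite quantiles at 0 and 1.
    probs = (np.arange(1, n + 1) - 0.5) / n
    y_targets = norm.ppf(probs)  # Gaussian quantile for each order statistic

    def f(x):
        # Piecewise-linear interpolation between (x_(i), y_i); outside the
        # sample range np.interp clamps to the endpoint values, so f stays
        # monotone (non-decreasing).
        return np.interp(x, x_sorted, y_targets)

    return f

# Example: gaussianize samples from a skewed (exponential) distribution.
rng = np.random.default_rng(0)
x = rng.exponential(size=10_000)
f = fit_gaussianizing_transform(x)
y = f(x)
print(y.mean(), y.std())  # should be close to 0 and 1
```

As $N$ grows, this $f$ should converge to $\Phi^{-1} \circ F$ on the support of $x$; if strict monotonicity or smoothness matters, ties could be broken and the linear interpolation replaced by a monotone cubic one (e.g. `scipy.interpolate.PchipInterpolator`). I am asking whether there is a simpler or more standard construction than this.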