
What are the conditions under which a distribution reduces down by a dimension?

For example, suppose I have a 2D Gaussian distribution for $X$ and $Y$. Under what condition(s) on $Y$ does the distribution "reduce" down to a 1D Gaussian distribution for $X$?

I don't want to say more so as not to bias the answers.
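
For concreteness, by a 2D Gaussian I mean the usual bivariate density (standard notation $\mu_X, \mu_Y, \sigma_X, \sigma_Y, \rho$; this is just the setup, not an extra hint):

$$ f_{X,Y}(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\!\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right) $$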

  • Constant $Y$ seems to meet the condition, if you allow a constant to be described as Gaussian. So too does a bijective relationship between $X$ and $Y$ (i.e. knowledge of $X$ determines the value of $Y$ with probability 1) which preserves the Gaussian properties. (2011-04-04)
  • A constant $Y$ would have a $\sigma$ of exactly zero. I can't figure out how to reduce the 2D Gaussian to a 1D Gaussian in this case. I keep trying to use a limit as $\sigma$ goes to zero, but it doesn't work right (see the sketch after these comments). I don't want to assume a relationship between $X$ and $Y$. (2011-04-04)
  • A constant $Y$ would have $\sigma \to \infty$... (2011-04-04)
  • I'm sorry, I don't follow that, Fabian. If $Y$ is constant, every time I measure it I will get $Y$; the dispersion is therefore zero. Could you explain further? (2011-04-04)
  • I stumbled across the central limit theorem, which may contain the answer to my question. We'll see. (2011-04-04)
  • @Henry Isn't the _only_ bijective relationship between $X$ and $Y$ such that both $X$ and $Y$ are Gaussian a linear (or affine) relationship: $Y = aX+b$ for real numbers $a$ and $b$, with $a$ allowed to be $0$ if we are amenable to calling a constant a Gaussian random variable with zero variance? (2011-10-30)
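
To make the comment about the $\sigma \to 0$ limit concrete, here is the sort of calculation I have been attempting (independent case, $\rho = 0$, using the notation above; this is my own sketch and may be exactly where I am going wrong):

$$ \lim_{\sigma_Y \to 0}\; \frac{1}{2\pi\sigma_X\sigma_Y}\exp\!\left(-\frac{(x-\mu_X)^2}{2\sigma_X^2}-\frac{(y-\mu_Y)^2}{2\sigma_Y^2}\right) $$

Taken pointwise this blows up at $y = \mu_Y$ and vanishes everywhere else, which is presumably why the limit "doesn't work right"; integrating over $y$ first, on the other hand, gives $\frac{1}{\sqrt{2\pi}\,\sigma_X}\exp\!\left(-\frac{(x-\mu_X)^2}{2\sigma_X^2}\right)$ for every $\sigma_Y > 0$.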

2 Answers