
Let's say I have many measurements of a random variable $x$. I also have many measurements of $y$ and of $z$, so I can approximate each of their PDFs by a histogram of the measurements.

How can I estimate the joint PDF of $(x, y, z)$? I don't know whether they are uncorrelated.

Thanks.

P.S. References to practical demos and information on the subject are welcome.

  • 1
    Are the measurements of $x$, $y$ and $z$ coupled? In other words, do you have the data in the form of triples $(x, y, z)$? If not, there's no way you can reconstruct the joint PDF. (2011-02-05)
  • 0
    Is there a way to estimate it approximately? It's a classification problem: I measure something and want to attribute it to one of a few models. I have many measurements of each of the other properties. I want to calculate the probability that a specific measurement is a given thing, given all the others. Actually, I'm doing this on the first through tenth moments. Thank you. (2011-02-10)
  • 0
    PDFs are for continuous variables. I think you are looking at a discrete problem. (2011-03-03)

1 Answer


If you have measurements of $x$, $y$ and $z$ in isolation, you can estimate the density function (histogram, or whatever) of each one... in isolation. That is (assuming $x$, $y$ and $z$ are random variables that have a joint density, i.e. that each occurrence corresponds to a tuple $(x, y, z)$), you can estimate the marginal densities. And nothing more. The joint density is the product of the marginals if (and only if) the variables are independent. If you cannot assume that, and you don't have more data, you can't estimate the joint density.
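
A minimal sketch of this distinction, assuming you do have paired $(x, y, z)$ samples (Python with NumPy and SciPy; the synthetic data below is only a stand-in for real measurements): the joint density can be estimated directly from the tuples, while the product of marginal estimates agrees with it only when the variables are independent.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic correlated data standing in for real paired measurements.
n = 5000
x = rng.normal(size=n)
y = 0.8 * x + 0.6 * rng.normal(size=n)   # y is correlated with x
z = rng.normal(size=n)                   # z is independent of (x, y)

# Joint density estimate built from the (x, y, z) tuples.
joint_kde = gaussian_kde(np.vstack([x, y, z]))

# Marginal density estimates built from each variable in isolation.
kde_x, kde_y, kde_z = gaussian_kde(x), gaussian_kde(y), gaussian_kde(z)

# At a test point, the product of marginals ignores the x-y dependence,
# so it generally differs from the joint estimate.
pt = np.array([[1.0], [1.0], [0.0]])
print("joint KDE:           ", joint_kde(pt)[0])
print("product of marginals:", kde_x(pt[0])[0] * kde_y(pt[1])[0] * kde_z(pt[2])[0])
```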

  • 0
    What if one does have data in tuples, but the observations occur over only a small subset of the total event space? (2012-09-03)
  • 0
    @TimSwast: then you'd probably use some parametric estimation; for example, you'd assume that the variables are jointly Gaussian, and you'd estimate the unknown parameters from the data. Of course, you'd need some basis for that assumption. (2012-09-04)
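
A minimal sketch of the parametric route mentioned in the last comment, assuming the variables are jointly Gaussian (the synthetic tuples below are only a stand-in for real paired data): estimate the mean vector and covariance matrix from the samples, and the fitted multivariate normal is the estimated joint PDF.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

# Synthetic (x, y, z) tuples standing in for real paired measurements.
data = rng.multivariate_normal(mean=[0.0, 0.0, 0.0],
                               cov=[[1.0, 0.8, 0.0],
                                    [0.8, 1.0, 0.0],
                                    [0.0, 0.0, 1.0]],
                               size=2000)          # shape (2000, 3)

# Fit the jointly-Gaussian model: sample mean vector and covariance matrix.
mu = data.mean(axis=0)
sigma = np.cov(data, rowvar=False)

# The fitted multivariate normal plays the role of the estimated joint PDF.
fitted = multivariate_normal(mean=mu, cov=sigma)
print("estimated density at the origin:", fitted.pdf([0.0, 0.0, 0.0]))
```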