
Usually in probability theory, for a random variable taking values in $\mathbb{R}$, we talk about its cumulative distribution function $F(x)$ and then its density $f(x)$; in sufficiently nice cases, $F'(x)=f(x)$.

That's the setup I'm familiar with, so I got annoyed when physicists talk about unnormalized "densities". E.g., if the probabilistic density of the position of a particle on $\mathbb{R}$ is equal to $1$ everywhere, that is supposed to mean the particle is equally likely to appear anywhere. More generally, you can imagine them talking about a non-negative function $f(x)$ being the density of something, where $f(x)$ is not integrable on $\mathbb{R}$ but is locally integrable, i.e. $\int_{[a,b]}f(x)\,dx$ makes sense for all $a\leq b$, $a,b\in\mathbb{R}$. Since $\int_{\mathbb{R}}f(x)\,dx$ is undefined ($=\infty$), one cannot divide by it to normalize.

Is there a mathematical way to make sense of such statements? Here is the interpretation I have in mind: one can talk about the density of a random variable $X$ up to a scalar multiple, such that for any intervals $[a,b]\subset [c,d]$ the conditional probability is a quotient of integrals:

$$P(X\in[a,b] \big| X\in[c,d])=\dfrac{\int_{[a,b]}f(x)dx}{\int_{[c,d]}f(x)dx}.$$

I only know very basic probability theory, so I don't know if this makes sense. Am I allowed to interpret unnormalized probability density functions this way? Is this what physicists mean? Or are there other interpretations? Is there anything else I should worry about when thinking about things this way?
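For concreteness, here is the kind of computation I mean, as a small numerical sketch (Python with scipy, my own illustrative choice, not part of any standard setup): any positive constant multiplying $f$ cancels in the quotient, so an unnormalized density already determines these conditional probabilities.

```python
# A minimal sketch, assuming scipy is available: the quotient-of-integrals
# definition of conditional probability is invariant under rescaling f,
# so an unnormalized (locally integrable) density determines conditional
# probabilities on nested intervals.
from scipy.integrate import quad

def cond_prob(f, a, b, c, d):
    """P(X in [a,b] | X in [c,d]) for locally integrable f >= 0, with [a,b] inside [c,d]."""
    num, _ = quad(f, a, b)
    den, _ = quad(f, c, d)
    return num / den

f = lambda x: 1.0            # "uniform on R": not integrable, but locally integrable
g = lambda x: 42.0 * f(x)    # any positive multiple of f

print(cond_prob(f, 0, 1, -2, 2))  # 0.25
print(cond_prob(g, 0, 1, -2, 2))  # 0.25 again: the scale cancels
```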

  • OK, how about this: The frequency $p$ with which a coin turns up "heads" is unknown. The probability that it is in any particular set $A\subseteq[0,1]$ is $\displaystyle\int_A\frac{c\,dp}{p(1-p)}$, where we assume $c$ is some infinitely small positive number such that that integral is $1$ when $A=[0,1]$. Toss the coin $10$ times and get $7$ heads. Conclude that the conditional probability distribution of $p$ given that outcome is $\text{constant}\cdot p^6(1-p)^2\,dp$ (a Beta distribution). I've seen a sober physicist argue that that's the right thing to do when... (2012-12-03)
  • ...we don't initially know that either outcome is possible, and after we see one of the two possible outcomes, we don't yet know that the other one is possible. (2012-12-03)
  • @MichaelHardy What do you mean by this example? Is it an example where what I tried doesn't work, yet physicists still like to think this way? How do you "conclude that the conditional probability distribution..."? Did I miss something? (2012-12-03)
  • OK, I'll answer your last question first. If the prior probability measure is $f(p)\,dp$ and the likelihood function is $L(p) = \Pr(\text{observed data}\mid p)$, then the posterior probability measure, i.e. the conditional distribution of $p$ given the observed data, is $\text{constant}\cdot f(p)L(p)\,dp$. (2012-12-03)
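To make the comment thread's recipe concrete, here is a hedged numerical sketch (Python with scipy, my own illustration, not from the thread): the improper prior $dp/(p(1-p))$ times the binomial likelihood for $7$ heads in $10$ tosses gives an unnormalized posterior proportional to $p^6(1-p)^2$, which has a finite integral and hence normalizes to the $\mathrm{Beta}(7,3)$ density.

```python
# A sketch of the comments' Bayesian example, assuming scipy is available.
# Improper prior f(p) = 1/(p(1-p)) on (0,1); likelihood for 7 heads in
# 10 tosses is L(p) proportional to p^7 (1-p)^3.  The unnormalized
# posterior f(p)L(p) is proportional to p^6 (1-p)^2, which normalizes
# to the Beta(7, 3) density.
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

unnorm_posterior = lambda p: p**6 * (1 - p)**2   # prior * likelihood, constants dropped
Z, _ = quad(unnorm_posterior, 0, 1)              # finite, so the posterior is proper

p = np.linspace(0.05, 0.95, 5)
print(unnorm_posterior(p) / Z)       # normalized posterior density
print(beta.pdf(p, 7, 3))             # agrees with Beta(7, 3)
```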

1 Answer


The most coherent interpretation I know of what the physicists are doing, in the real-line example you mention, is that they implicitly consider, for every positive $t$, a bona fide random variable $X_t$ with density $f_t$, where $f_t(x)=c_t^{-1}f(x)\,\mathbf 1_{[-t,t]}(x)$ and $c_t$ is the integral of $f$ on $[-t,t]$, and that they are convinced that the objects $X_t$ somehow converge, as $t\to+\infty$, to... one does not know exactly what kind of object.
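Here is a small sketch of that truncation construction (Python with scipy; my own illustration under the assumption $f\equiv 1$, not part of the original answer): each $X_t$ is a genuine random variable, and conditional probabilities under $X_t$ stop depending on $t$ as soon as $[-t,t]$ contains the conditioning interval, precisely because $c_t$ cancels in the quotient.

```python
# A sketch of the truncation construction: f_t(x) = f(x) 1_{[-t,t]}(x) / c_t
# is a genuine probability density, and quantities such as
# P(X_t in [a,b] | X_t in [c,d]) stop depending on t once [c,d] lies in [-t,t].
from scipy.integrate import quad

f = lambda x: 1.0  # locally integrable on R, but not integrable

def cond_prob_t(t, a, b, c, d):
    """Conditional probability under X_t with density f(x) 1_{[-t,t]}(x) / c_t."""
    c_t, _ = quad(f, -t, t)                     # normalizing constant of f_t
    num, _ = quad(lambda x: f(x) / c_t, a, b)
    den, _ = quad(lambda x: f(x) / c_t, c, d)
    return num / den                            # c_t cancels, hence stabilization

for t in (10, 100, 1000):
    print(t, cond_prob_t(t, 0, 1, -2, 2))       # 0.25 for every t >= 2
```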

It happens that, in most of the situations where (good) physicists do this, the properties of $X_t$ (the ones of interest to them) somehow stabilize as $t\to+\infty$. Hence, although there is no random variable $X$ such that $X_t\to X$, nevertheless $X_t\approx X_s$ for all $s$ and $t$ large enough, and this is all that is needed to proceed. So, in the end, there is (most often a pretty good amount of) reason in (them physicists') madness, although it may not be quite the brand of reason mathematicians are using.