
I am glad to have found this great site. There is a problem I have been trying to solve for a while: I want to analyze the noise attenuation behavior of the bilateral filter. Given the unnormalized Gaussian function $$\phi(z)=\exp\left(-\frac{1}{2}z^2\right)$$ I define weights $$w(x)=\phi\left(\frac{x}{\sigma_s}\right)\phi\left(\frac{X_0-X_x}{\sigma_r}\right)$$ which are the product of two Gaussian factors. Here $X_x$ is a stationary random process which is independent and identically normally distributed for all $x$, i.e., $$\{X_x\sim \mathcal{N}(y,\sigma_n^2) \,:\, \forall x \in \mathbb{R}\}.$$ Now I am interested in finding an analytic expression for the variance $\sigma_b^2$ defined as $$\sigma_b^2=\operatorname{Var}\left\{\int\limits_{-\infty}^\infty \frac{w(x)}{\int_{-\infty}^\infty w(\tilde{x})\,d\tilde{x}}X_x \,dx\right\}$$ as a function of $(\sigma_n,\sigma_s,\sigma_r)$.

For the simpler linear filtering case $w(x)=\phi\left(\frac{x}{\sigma_s}\right)$ I have already found the solution:

$$\sigma_b^2=\int\limits_{-\infty}^{\infty} \left(\frac{\exp\left(-\frac{x^2}{2\sigma_s^2}\right)}{\sqrt{2\pi}\sigma_s}\right)^2\sigma_n^2 dx =\frac{\sigma_n^2}{2\pi\sigma_s^2}|\sigma_s|\sqrt{\pi}=\frac{\sigma_n^2}{2\sqrt{\pi}\sigma_s }.$$
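As a sanity check, the following small Monte-Carlo sketch reproduces this closed form on a unit-spaced grid (the values of $\sigma_s$, $\sigma_n$ and $y$ are arbitrary examples; the discrete sums approximate the continuous integrals once $\sigma_s$ spans a few sample spacings):

```python
import numpy as np

# Monte-Carlo check of sigma_b^2 = sigma_n^2 / (2*sqrt(pi)*sigma_s)
# for the linear (Gaussian-only) filter, sampled on a unit-spaced grid.
# sigma_s, sigma_n, y are arbitrary example values.
sigma_s, sigma_n, y = 2.0, 1.0, 5.0
x = np.arange(-6 * sigma_s, 6 * sigma_s + 1)       # unit spacing, +-6 sigma support
w = np.exp(-0.5 * (x / sigma_s) ** 2)              # phi(x / sigma_s)
w_hat = w / w.sum()                                # normalized weights

rng = np.random.default_rng(0)
trials = 100_000
X = rng.normal(y, sigma_n, size=(trials, x.size))  # i.i.d. samples X_x
Y = X @ w_hat                                      # filter output per trial

print("simulated variance :", Y.var())
print("closed form        :", sigma_n**2 / (2 * np.sqrt(np.pi) * sigma_s))
```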

But I don't know how to use the bilateral weights defined above.
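For reference, this is roughly how I estimate $\sigma_b^2$ numerically for the bilateral weights. It is only a sketch on a unit-spaced grid with arbitrarily chosen parameters, and it is this simulated value that I would like to express in closed form:

```python
import numpy as np

# Monte-Carlo estimate of the bilateral-filter output variance sigma_b^2
# on a unit-spaced grid (discrete approximation of the question's integral).
# All parameter values below are arbitrary examples.
sigma_s, sigma_r, sigma_n, y = 2.0, 1.0, 0.5, 5.0
x = np.arange(-6 * sigma_s, 6 * sigma_s + 1)
i0 = np.argmin(np.abs(x))                               # index of the center x = 0

rng = np.random.default_rng(1)
trials = 100_000
X = rng.normal(y, sigma_n, size=(trials, x.size))       # i.i.d. samples X_x

w_s = np.exp(-0.5 * (x / sigma_s) ** 2)                 # spatial factor (fixed)
w_r = np.exp(-0.5 * ((X[:, [i0]] - X) / sigma_r) ** 2)  # range factor (random)
w = w_s * w_r
Y = (w * X).sum(axis=1) / w.sum(axis=1)                 # normalized bilateral output

print("simulated sigma_b^2:", Y.var())
print("linear-filter value:", sigma_n**2 / (2 * np.sqrt(np.pi) * sigma_s))
```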

Any advice is highly appreciated.

  • The indication about $X$ is dubious since one needs a whole process $(X(x))_x$ and not just one random variable $X$. In fact it seems necessary to define the global distribution of this process $(X(x))_x$. – 2012-05-12
  • Didier, I have updated the problem formulation using a stochastic process instead of a random variable. – 2012-05-12
  • As I said in my first comment, one needs the *joint distribution* of the process $(X_x)_x$. This piece of information is still missing. – 2012-05-12
  • The effort you put into modifying your post is appreciated, but please, read what I wrote: to answer the question, one needs to know the distribution of $(X_x,X_y)$ for every $x\ne y$ (and much more!). – 2012-05-12
  • A subquestion that may be easier to answer: are the $X_x$ for different $x$ independent? – 2012-05-12
  • Rahul, yes, the $X_x$ are independent for different $x$ and all $X_x$ are identically distributed. That is why I don't understand why I need to define the joint distribution of the process $(X_x)_x$. I use $X$ indexed by $x$ solely so that $X_0$ is held constant within the integral. But for the variance calculation $X_0$ will also be random. – 2012-05-13
  • **Because you never said so before**. // Unfortunately, i.i.d. random variables indexed by the reals cannot be integrated, see the post below, hence the object which interests you is not defined. – 2012-05-13

1 Answer

Sobering fact: Consider some i.i.d. random variables $(X_x)_{0\leqslant x\leqslant 1}$. Then the integral $\displaystyle\int_0^1X_x\,\mathrm dx$ does not exist unless $\mathrm P(X_x=z)=1$ for some $z$.

To prove this, remember how integrals are defined. The upper and lower Darboux sums associated to a partition $\tau=(t_k)$ of $[0,1]$ are $$ U_\tau=\sum_k(t_k-t_{k-1})\sup\{X_x\mid t_{k-1}\leqslant x\leqslant t_k\}, $$ and $$ L_\tau=\sum_k(t_k-t_{k-1})\inf\{X_x\mid t_{k-1}\leqslant x\leqslant t_k\}, $$ respectively, and the integral exists if $U_\tau-L_\tau$ can be made as small as desired by imposing that the mesh of $\tau$ is small enough. And now, the bad news:

Let $M$ and $m$ denote the upper and lower limits of the support of the common distribution of the random variables $X_x$. Then $\sup\{X_x\mid s\leqslant x\leqslant t\}=M$ and $\inf\{X_x\mid s\leqslant x\leqslant t\}=m$ almost surely, for every $s\lt t$.

(Hint: Apply the law of large numbers to the events $A_x=[X_x\geqslant z]$ and $B_x=[X_x\leqslant z]$, for every given $z$.)

Thus, for every partition $\tau$, $U_\tau=M$ and $L_\tau=m$ almost surely. In particular, $\displaystyle\int_0^1X_x\mathrm dx$ does not exist as soon as $m\lt M$. For (non-degenerate) gaussian random variables, $m=-\infty$ and $M=+\infty$.
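(A finite-sample sketch, for intuition only, under the assumption of standard Gaussian variables: fixing the partition and sampling ever more points in each cell, the empirical upper and lower Darboux sums drift towards $+\infty$ and $-\infty$.)

```python
import numpy as np

# Finite-sample illustration (not a proof): fix the partition of [0,1] into
# 10 equal cells and evaluate i.i.d. standard Gaussians at n points per cell.
# The empirical Darboux sums U and L drift apart as n grows, since the
# per-cell sup tends to +inf and the per-cell inf tends to -inf.
rng = np.random.default_rng(0)
cells, width = 10, 0.1
for n in [10, 1000, 100_000]:
    samples = rng.standard_normal((cells, n))   # X_x at n points in each cell
    U = width * samples.max(axis=1).sum()       # upper Darboux sum
    L = width * samples.min(axis=1).sum()       # lower Darboux sum
    print(f"n = {n:>7}:  U = {U:6.2f},  L = {L:6.2f}")
```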

To conclude:

For i.i.d. (non-degenerate) random variables $(X_x)_x$, the object $\displaystyle\int_0^1X_x\mathrm dx$ does not exist.

Turning to Riemann sums $R(\tau,\xi)$, the situation is not any better. Recall that $$ R(\tau,\xi)=\sum_k(t_k-t_{k-1})X_{\xi_k}, $$ where $\tau=(t_k)$ is a partition of $[0,1]$ and $\xi=(\xi_k)$ is a tag of $\tau$ in the sense that $t_{k-1}\leqslant\xi_k\leqslant t_k$ for every $k$. Then, the law of large numbers shows that, as soon as the common distribution of the random variables $X_x$ is integrable with mean $m$, when the mesh of $\tau$ goes to zero, $R(\tau,\xi)$ converges to $m$ almost surely and in $L^1$. Here the limiting object exists (this is the real number $m$) but it is degenerate.
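(Again only a finite-sample sketch, with standard Gaussian variables assumed: taking one sample per cell of the uniform partition of mesh $1/n$, the Riemann sum is just the sample mean, which collapses to the common mean $0$ as the mesh shrinks.)

```python
import numpy as np

# Finite-sample illustration of the Riemann-sum case: with the uniform
# partition of mesh 1/n and one i.i.d. N(0,1) sample per cell, the Riemann
# sum equals the sample mean, which converges to the common mean 0.
rng = np.random.default_rng(0)
for n in [10, 1000, 100_000, 10_000_000]:
    X = rng.standard_normal(n)           # X at the tag points xi_k
    R = X.sum() / n                      # sum_k (1/n) * X_{xi_k}
    print(f"mesh 1/{n:<10}  R = {R:+.5f}")
```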

  • Didier, thank you for your response. However, the problem I am facing is different from computing $\int_{-\infty}^\infty X_x dx$, since I want to evaluate $\operatorname{Var}\left\{\int_{-\infty}^\infty \hat{w}_x X_x dx\right\}$ where $\int_{-\infty}^\infty \hat{w}_x dx = 1$. The difficulty is that $\hat{w}_x$ is itself a function of the random process $X_x$. By the way, in numerical simulations the variance seems to converge as the integration limits grow. – 2012-05-13
  • The question is rather simple: how to **define** $\int X_xdx$ or $\int w(x)X_xdx$? The point is that these integrals do not exist. – 2012-05-13
  • Do you agree that the value of the integral is a random variable $Y$ which is also normally distributed (since all the $X_x$ are normally distributed and i.i.d.)? I want to find the variance of this random variable $Y$ and believe that there must be an analytical solution, since the variance converges in numerical simulations as I push the integration limits towards infinity. Please also see the example solution in the problem statement above for the simpler case of a purely Gaussian kernel in the integral. – 2012-05-13
  • And would it help to pose the problem in the discrete domain? $$\sigma_b^2=\operatorname{Var}\left\{\sum\limits_{x=-\infty}^\infty \frac{w(x)}{\sum_{\tilde{x}=-\infty}^\infty w(\tilde{x})}X_x\right\}$$ with $$\{X_x\sim \mathcal{N}(y,\sigma_n^2) \,:\, \forall x \in \mathbb{Z}\}$$ – 2012-05-13