
Let $m(\cdot)$ be a probability measure on $Z$, so that $\int_Z m(dz) = 1$.

Consider a continuous function $f: X \times Y \times Z \rightarrow \mathbb{R}_{\geq 0}$, where $X \subseteq \mathbb{R}^n$, $Y \subseteq \mathbb{R}^m$ is compact, and $Z \subseteq \mathbb{R}^p$ is closed.

Define

$$ \hat{f}(x) \ := \ \inf_{y \in Y} \ \int_Z f(x,y,z) m(dz) $$

  1. Under which conditions is the function $\hat{f}(\cdot)$ continuous?

  2. Instead of assuming continuity of $f(\cdot)$, let us assume that for each $z \in Z$ the map $(x,y) \mapsto f(x,y,z)$ is continuous. Under which conditions do we then have continuity of $\hat{f}(\cdot)$?

Note: in the non-probabilistic case, $Y$ compact and $(x,y) \mapsto f(x,y)$ continuous are sufficient to establish continuity of $x \mapsto \inf_{y \in Y} f(x,y) $.
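
For illustration, compactness of $Y$ cannot simply be dropped in this note: with $X = Y = \mathbb{R}$ and the continuous nonnegative function

$$ f(x,y) \ = \ (xy - 1)^2, $$

one has $\inf_{y \in \mathbb{R}} f(x,y) = 0$ for every $x \neq 0$ (take $y = 1/x$), but $\inf_{y \in \mathbb{R}} f(0,y) = 1$, so the infimum is discontinuous at $x = 0$.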

  • In (2.), I suppose $w=z$. (2012-05-09)
  • Sure it does. :) (2012-05-09)

1 Answer


Suppose

1) $f(x,y,\cdot)$ is $m$-measurable for all $x,y$;

2) $f(\cdot,\cdot,z)$ is continuous on $X \times Y$ for all $z \in Z$;

3) for each $x_0 \in X$ there is $r > 0$ such that $\int_Z g(z)\ m(dz) < \infty$, where $g(z) = \sup \{f(x,y,z): |x-x_0| \le r,\ y \in Y\}$.
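
For example, one simple sufficient condition for (3) (introducing an integrable majorant $a$ and an exponent $k$ purely for this bound; they are not part of the answer): if there are an $m$-integrable $a: Z \to \mathbb{R}_{\geq 0}$ and $k \geq 0$ such that

$$ f(x,y,z) \ \le \ a(z)\,\bigl(1 + |x|^k\bigr) \quad \text{for all } (x,y,z) \in X \times Y \times Z, $$

then for every $x_0 \in X$ and every $r > 0$ the function in (3) satisfies $g(z) \le a(z)\,\bigl(1 + (|x_0| + r)^k\bigr)$, which is $m$-integrable, so (3) holds.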

Then I claim $\hat{f}$ is continuous on $X$. Suppose it were not. Then there would be $\epsilon > 0$, a point $x_0 \in X$, and a sequence $x_n \to x_0$ in $X$ such that $|\hat{f}(x_n) - \hat{f}(x_0)| > \epsilon$ for all $n$. We may assume $|x_n - x_0| < r$, with $r$ as given in (3).

Case 1: Suppose $\hat{f}(x_n) - \hat{f}(x_0) > \epsilon$ for infinitely many $n$. We can assume WLOG that this is true for all $n$. Since $\hat{f}(x_0)$ is an infimum, there is some $y \in Y$ such that $ \int_Z f(x_0, y, z)\ m(dz) < \hat{f}(x_0) + \epsilon/2$, so $$\int_Z f(x_n,y,z)\ m(dz) - \int_Z f(x_0,y,z)\ m(dz) \ \ge\ \hat{f}(x_n) - \int_Z f(x_0,y,z)\ m(dz) \ >\ \epsilon/2.$$ But this contradicts the Lebesgue Dominated Convergence Theorem.
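
To spell out this contradiction (a step only implicit above): for the fixed $y$, assumption (2) gives $f(x_n,y,z) \to f(x_0,y,z)$ for every $z \in Z$, and assumption (3) gives $0 \le f(x_n,y,z) \le g(z)$ with $g$ $m$-integrable (recall $|x_n - x_0| < r$), so dominated convergence yields

$$ \int_Z f(x_n,y,z)\ m(dz) \ \longrightarrow \ \int_Z f(x_0,y,z)\ m(dz), $$

while the display above keeps the difference larger than $\epsilon/2$ for every $n$.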

Case 2: Suppose $\hat{f}(x_0) - \hat{f}(x_n) > \epsilon$ for infinitely many $n$. Again we can assume this is true for all $n$. For each $n$ there is $y_n \in Y$ such that $\int_Z f(x_n, y_n, z)\ m(dz) < \hat{f}(x_n) + \epsilon/2$. By compactness of $Y$, some subsequence of $(y_n)$ converges to some $y_0 \in Y$; again, we can assume this is the whole sequence. Then

$$ \int_Z f(x_0,y_0,z)\ m(dz) - \int_Z f(x_n,y_n,z)\ m(dz) \ge \hat{f}(x_0) - \int_Z f(x_n,y_n,z)\ m(dz) > \epsilon/2$$ which again contradicts the Lebesgue Dominated Convergence Theorem.
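
Here, as the comments below clarify, dominated convergence is applied to $f(x_n,y_n,z)$ with $(x_n,y_n) \to (x_0,y_0)$: assumption (2) gives $f(x_n,y_n,z) \to f(x_0,y_0,z)$ for every $z \in Z$, and $0 \le f(x_n,y_n,z) \le g(z)$ by (3), so

$$ \int_Z f(x_n,y_n,z)\ m(dz) \ \longrightarrow \ \int_Z f(x_0,y_0,z)\ m(dz), $$

while the display above keeps the difference larger than $\epsilon/2$ for every $n$.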

  • I guess that you say "there is some $y \in Y$ such that $\int_Z f(x_0,y,z) m(dz) < \hat{f}(x_0) + \epsilon/2$" because you are referring to http://math.stackexchange.com/questions/143124/continuity-of-parametric-integral. Right? (2012-05-09)
  • No. Whenever you have a set of real numbers with a finite infimum, at least one is within $\epsilon/2$ of that infimum. (2012-05-09)
  • I am not entirely clear on this comment, sorry. $x_0$ is fixed. You are saying that there exists $\bar{y} \in Y$ such that $\int_Z f(x_0,\bar{y},z) m(dz) < \inf_{y \in Y} \int_Z f(x_0,y,z) m(dz) + \epsilon/2$. It seems to me that this property is not free, and that it is instead related to the continuity of $y \mapsto \int_Z f(x_0,y,z) m(dz)$ as established in http://math.stackexchange.com/questions/143124/continuity-of-parametric-integral. Can you explain it to me, please? (2012-05-09)
  • For convenience, let me abbreviate $\int_Z f(x_0,y,z)\ m(dz)$ as $G(y)$. Let $B = \inf_{y \in Y} G(y)$. I am indeed saying there is $\overline{y} \in Y$ such that $G(\overline{y}) < B + \epsilon/2$. If there were no such $\overline{y}$, that would say $G(\overline{y}) \ge B + \epsilon/2$ for every $\overline{y} \in Y$, which would contradict the definition of infimum. (2012-05-09)
  • Of course I see this last comment. I was just wondering whether the integral could play some strange role... (2012-05-09)
  • I've got another question. In the first case the contradiction concerns the fact that we fix $y$ and apply the LDCT to $f(x_n \rightarrow x_0, y, z)$. Fine. Now in the second case we have $f(x_n \rightarrow x_0, y_n, z)$. I guess the argument works because $y_n\rightarrow y_0$, so $\lim_{n\rightarrow \infty} f(x_n,y_n,z) = \lim_{n\rightarrow \infty} f(x_n,y_0,z)$. Right? (2012-05-10)
  • In the second case you have simultaneously $x_n \to x_0$ and $y_n \to y_0$, so $f(x_n,y_n,z) \to f(x_0,y_0,z)$. The limit of $f(x_n,y_0,z)$ does not play a role (though it would also be $f(x_0,y_0,z)$). (2012-05-10)
  • So where are we contradicting Lebesgue's Dominated Convergence Theorem, then? Is it applied to $f(x_n,y_n,z)$ as $(x_n,y_n) \rightarrow (x_0,y_0)$? (2012-05-10)
  • Yes, that's right. (2012-05-10)
  • Ok, thanks a lot. You may be interested in http://math.stackexchange.com/questions/143704/continuity-of-expected-value-part-2, which is a similar "continuity problem". (2012-05-10)
  • It would also be interesting to prove the converse: that continuity of $\hat{f}(\cdot)$ as defined above implies that $f(\cdot)$ is uniformly bounded. (2012-05-23)
  • @Adam: How is that a converse? Who said anything about $f$ being uniformly bounded? (2012-05-23)
  • By uniform boundedness of $f$ I meant that the $\mathcal{L}_1$ norm is bounded, i.e. your Assumption (3). (2012-05-23)
  • Can we relax your Assumption (3) by assuming uniform integrability and then applying the Vitali convergence theorem (http://en.wikipedia.org/wiki/Vitali_convergence_theorem) instead of Lebesgue's Dominated Convergence Theorem (http://en.wikipedia.org/wiki/Dominated_convergence_theorem)? (2012-06-05)
  • Is condition (3) the weakest possible? (2013-03-27)