
I would like to understand this paragraph:

Let $g:\mathbb{R}^{n}\rightarrow \mathbb{R}$ be a function differentiable in $x$ and let $x_{1},x_{2},\dots$ be a sequence of measurement points at which the values $y_{1},y_{2},\dots$ of the function $g$ are accessible to observation at every discrete time $n=1,2,\dots$. Taking the additive noise $\xi_{n}$ into account, we have
\begin{equation*}
y_{n}=g(x_{n})+\xi _{n}.
\end{equation*}
Then, considering the sequence of estimated values $\left\{ \hat{x}_{n}\right\}$, the goal in stochastic optimization problems is to find a fixed point $x^{\ast}$ that satisfies the extremal inclusion
\begin{equation*}
\mathrm{E}\left\{ g(x)\right\} \underset{x\in X}{\rightarrow }\min .
\end{equation*}
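
To fix my own reading of the observation model, here is a minimal sketch of how I understand it; the particular $g$, the noise distribution, and the measurement points are just placeholder assumptions of mine, not taken from the text:

```python
# Minimal sketch of the observation model as I read it (the function g, the
# noise distribution, and the measurement points are placeholder assumptions,
# not taken from the quoted text).
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # a simple smooth function R^n -> R, assumed only for illustration
    return float(np.sum(x ** 2))

def observe(x, noise_std=0.1):
    # we never see g(x_n) itself, only the noisy measurement y_n = g(x_n) + xi_n
    return g(x) + rng.normal(0.0, noise_std)

# a sequence of measurement points x_1, x_2, ... and observed values y_1, y_2, ...
xs = [rng.normal(size=2) for _ in range(5)]
ys = [observe(x) for x in xs]
print(ys)
```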

First, what is the meaning of $\mathrm{E}\left\{ g(x)\right\} \underset{x\in X}{\rightarrow }\min$? Is this some kind of expectation in probability, or something else? And second, why do I need "noise" in stochastic problems?

I hope someone can help me understand this.

  • 0
    Can you tell where this is from? $E\{g(x)\}\underset{x\in X}{\rightarrow}\min$ could mean minimization of the expectation of $g(x)$ with respect to $x\in X$. Noise plays the role of uncertainty in the observation. Without noise (or another source of randomness) this would be a deterministic problem and stochastic optimization would make no sense. (2017-02-01)
  • 1
    It's simply an uncommon, alternative way to write $$\begin{array}{ll} \text{minimize} & E\{g(x)\} \\ \text{subject to} & x\in X \end{array}$$ Nothing more complicated than that; a concrete sketch of this reading follows below. (2017-02-02)
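
Following the interpretation in the comments, here is a minimal sketch of what "minimize $E\{g(x)\}$ subject to $x\in X$" could look like when only the noisy values $y_n = g(x_n)+\xi_n$ are observable, using a Kiefer–Wolfowitz-style finite-difference stochastic approximation; the objective, the noise level, the set $X$, and the step sizes are all my own assumptions for illustration:

```python
# Minimal sketch (assumptions only): minimize E{g(x)} over x in X = [-1, 1]^2
# from noisy observations y = g(x) + xi, via a Kiefer-Wolfowitz-style
# finite-difference stochastic approximation with projection onto X.
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    return float(np.sum((x - 0.3) ** 2))       # "true" objective, unknown to the algorithm

def observe(x, noise_std=0.1):
    return g(x) + rng.normal(0.0, noise_std)   # y_n = g(x_n) + xi_n

def project(x, lo=-1.0, hi=1.0):
    return np.clip(x, lo, hi)                  # keep the iterate inside X

x = np.zeros(2)                                # initial estimate x_hat_1
for n in range(1, 2001):
    a_n = 1.0 / n                              # step size
    c_n = 1.0 / n ** 0.25                      # finite-difference width
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = 1.0
        # each gradient component is estimated from two noisy observations
        grad[i] = (observe(x + c_n * e) - observe(x - c_n * e)) / (2.0 * c_n)
    x = project(x - a_n * grad)                # next estimate x_hat_{n+1}

print(x)  # should be close to the minimizer x* = (0.3, 0.3)
```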

0 Answers