I would like to understand this paragraph:
Let $g:\mathbb{R}^{n}\rightarrow \mathbb{R}$ be a differentiable function and let $x_{1},x_{2},\dots$ be a sequence of measurement points at which the values $y_{1},y_{2},\dots$ of the function $g$ are accessible to observation at every discrete time $n=1,2,\dots$. Taking into account additive noise $\xi_{n}$, we have \begin{equation*} y_{n}=g(x_{n})+\xi _{n}. \end{equation*} Then, considering the sequence of estimated values $\left\{ \hat{x}_{n}\right\}$, the goal in stochastic optimization problems is to find a point $x^{\ast}$ that solves the extremal problem \begin{equation*} \mathrm{E}\left\{ g(x)\right\} \underset{x\in X}{\rightarrow }\min. \end{equation*}
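To make the setup concrete, here is a minimal sketch of one classical method for this kind of problem (a Kiefer–Wolfowitz-style stochastic approximation; the quadratic $g$, the noise level, and the step-size schedule are my own illustrative choices, not from the quoted text). It only ever sees the noisy measurements $y_n = g(x_n) + \xi_n$, yet still drives the estimates toward the minimizer of $\mathrm{E}\{g(x)\}$:

```python
import random

# Hypothetical objective: g(x) = (x - 3)^2, minimized at x* = 3.
def g(x):
    return (x - 3.0) ** 2

# Noisy observation model: y_n = g(x_n) + xi_n, with zero-mean Gaussian noise.
def measure(x, sigma=0.5):
    return g(x) + random.gauss(0.0, sigma)

def kiefer_wolfowitz(x0, n_iter=20000):
    """Estimate the gradient of E[g] from noisy measurements alone
    (finite differences) and descend it with decaying step sizes."""
    x = x0
    for n in range(1, n_iter + 1):
        a_n = 1.0 / n          # step size: sum a_n = inf, sum a_n^2 < inf
        c_n = 1.0 / n ** 0.25  # shrinking finite-difference width
        grad_est = (measure(x + c_n) - measure(x - c_n)) / (2.0 * c_n)
        x -= a_n * grad_est
    return x

random.seed(0)
x_star = kiefer_wolfowitz(x0=0.0)
print(x_star)  # should end up close to 3, despite every observation being noisy
```

The decaying step sizes $a_n$ are what let the iterates average out the noise $\xi_n$ instead of chasing it, which is exactly why the minimum of the *expectation* $\mathrm{E}\{g(x)\}$, rather than of a single noisy $y_n$, is the right target.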
First, what is the meaning of $\mathrm{E}\left\{ g(x)\right\} \underset{x\in X}{\rightarrow }\min$? Is this the expectation from probability theory, or something else? And second, why do I need "noise" in stochastic problems?
I hope someone can help me understand this. Thanks in advance.