
In Glasserman's "Monte Carlo Methods in Financial Engineering", page 208, he states the following regarding antithetic variates:

Suppose that $Y = f(Z)$ with $Z = (Z_1,\ldots,Z_d) \sim N(0,I)$. Define the symmetric and antisymmetric parts of $f$, respectively, by $$ f_0(z) = \frac{f(z) + f(-z)}{2} \qquad \text{and} \qquad f_1(z) = \frac{f(z) - f(-z)}{2}. $$ Clearly, $f = f_0 + f_1$, and [after showing this gives an orthogonal decomposition of f] $$ \text{Var}[f(Z)] = \text{Var}[f_0(Z)] + \text{Var}[f_1(Z)] $$ The first term on the right is the variance of an estimate of $E(f(Z))$ based on an antithetic pair $(Z, -Z)$. **Thus, antithetic sampling eliminates all variance if $f$ is antisymmetric ($f = f_1$) and it eliminates no variance if $f$ is symmetric ($f = f_0$).**

(Boldface mine). I can see why antithetic variates would eliminate all variance in the estimator if $f$ is antisymmetric---the estimator is constantly zero---but I don't see the point of this orthogonal variance decomposition to see this, and I'm afraid I'm missing something important. My question is, what is the point of this variance decomposition?

1 Answer


The book is indeed slightly terse in spelling out the connection. The point is that the antithetic estimator of $\mathbf E[f(Z)]$ is a sample average of the pair averages $f_0(Z) = \tfrac{1}{2}\big(f(Z) + f(-Z)\big)$; it is unbiased because $\mathbf E[f_0(Z)] = \mathbf E[f(Z)]$. The orthogonal decomposition then tells you exactly how much variance antithetic sampling removes: since $\text{Var}[f_1(Z)] \ge 0$, the displayed decomposition gives $$\text{Var}[f_0(Z)] \le \text{Var}[f(Z)],$$ with strict inequality precisely when $f$ is not symmetric, i.e. when $f_1(Z)$ is not almost surely zero. So the decomposition isolates the part of the variance ($\text{Var}[f_1(Z)]$) that the antithetic pair eliminates, and the part ($\text{Var}[f_0(Z)]$) that it cannot touch.
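A quick numerical sketch of the two claims, using NumPy; $f(z) = e^z$ is an arbitrary choice here (any non-symmetric $f$ would do), not anything from the book:

```python
# Check numerically that Var[f(Z)] = Var[f_0(Z)] + Var[f_1(Z)] and that the
# antithetic pair average f_0(Z) has lower variance than a plain draw f(Z).
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)

f = np.exp(z)                 # f(Z), an arbitrary non-symmetric example
f_neg = np.exp(-z)            # f(-Z), the antithetic counterpart
f0 = 0.5 * (f + f_neg)        # symmetric part = antithetic pair average
f1 = 0.5 * (f - f_neg)        # antisymmetric part

print(np.var(f), np.var(f0) + np.var(f1))  # the decomposition (up to MC noise)
print(np.var(f0), np.var(f))               # antithetic variance is smaller
```

The cross term $\text{Cov}[f_0(Z), f_1(Z)]$ is exactly zero in expectation (this is the orthogonality), so the two printed pairs agree up to Monte Carlo error, and the gap between $\text{Var}[f_0(Z)]$ and $\text{Var}[f(Z)]$ is precisely the sample estimate of $\text{Var}[f_1(Z)]$.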