Consider Fisher's Linear Discriminant Analysis (LDA).
Let $\mu_0 \in \mathcal{R}^D$ and $\mu_1 \in \mathcal{R}^D$ be the means of the two classes, and let $\Sigma_0 \in \mathcal{R}^{D\times D}$ and $\Sigma_1 \in \mathcal{R}^{D\times D}$ be the corresponding covariance matrices.
LDA considers the problem of finding a vector $w \in \mathcal{R}^D$ such that the following statistic is maximized:
$$S = \frac{(w^T\mu_0 - w^T\mu_1)^2}{w^T(\Sigma_0+\Sigma_1)w}$$
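For concreteness, here is a minimal sketch of how I compute $S$ at the maximizing direction (assuming NumPy and synthetic Gaussian data; the sample sizes, class means, and seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two Gaussian classes in D dimensions.
D = 5
X0 = rng.normal(loc=0.0, scale=1.0, size=(200, D))  # class 0 samples
X1 = rng.normal(loc=1.0, scale=1.0, size=(200, D))  # class 1 samples

# Sample means and covariance matrices.
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sigma0 = np.cov(X0, rowvar=False)
Sigma1 = np.cov(X1, rowvar=False)

# Fisher direction: w is proportional to (Sigma0 + Sigma1)^{-1} (mu0 - mu1).
w = np.linalg.solve(Sigma0 + Sigma1, mu0 - mu1)

# The statistic S as defined above, evaluated at this w.
S = (w @ (mu0 - mu1)) ** 2 / (w @ (Sigma0 + Sigma1) @ w)
print(S)
```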
Does the dimension $D$ affect this statistic? In practical terms, should I be comfortable comparing two values, $S_a$ and $S_b$, obtained from datasets of dimensions $D_a$ and $D_b \neq D_a$?