
I will have a small positive integer $n$ (usually 2 or 3, probably no bigger than 6)
and symmetric positive-definite real $n$-by-$n$ matrices $A$,
and will want to sample an $\mathbf{R}^n$-valued random variable $\mathbf{x}$ such that

for any/all linear map(s) $f : \mathbf{R}^n \to \mathbf{R}^n$ such that $\Big[$for all vectors $v$, $\;v^{T} Av = \left(||f(v)||_2\right)^2\Big]$,
$\big[||f(\mathbf{x})||_2 = 1$ almost surely$\big]$ and $\big[f(\mathbf{x})$ is distributed uniformly on the unit sphere$\big]$.


The way I thought of to do this is to Gram-Schmidt orthonormalize the standard basis
with respect to the inner product induced by $A$, and then take the linear combination whose
coefficients are the entries of a uniform sample from the unit sphere. However, since I
will be sampling $\mathbf{x}$ only once for each $A$, I'm hoping for a rather more efficient method.
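
For concreteness, here is a minimal sketch of that approach, assuming NumPy (the language and the helper name `sample_via_gram_schmidt` are my own choices, not part of the problem):

    import numpy as np

    def sample_via_gram_schmidt(A, rng=None):
        """Gram-Schmidt-orthonormalize the standard basis in the inner
        product <u, v>_A = u^T A v, then combine the resulting vectors with
        coefficients drawn uniformly from the Euclidean unit sphere."""
        rng = np.random.default_rng() if rng is None else rng
        n = A.shape[0]
        basis = []                          # vectors orthonormal w.r.t. A
        for i in range(n):
            v = np.zeros(n)
            v[i] = 1.0                      # i-th standard basis vector
            for b in basis:
                v -= (b @ A @ v) * b        # subtract the A-projection onto b
            v /= np.sqrt(v @ A @ v)         # normalize in the A-norm
            basis.append(v)
        c = rng.standard_normal(n)
        c /= np.linalg.norm(c)              # c is uniform on the unit sphere
        return np.column_stack(basis) @ c   # x = sum_i c_i b_i

Since the vectors $b_i$ satisfy $b_i^T A\, b_j = \delta_{ij}$, the returned $x$ has $x^T A x = ||c||_2^2 = 1$.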

Is there any faster way to do the sampling I want?

(The matrices will be changing somewhat slowly, which might help.)

  • Yes, and I corrected the equation. (2012-08-25)

1 Answer


I think we can dispense with the "all linear maps" part, since all such linear maps are given by one such map followed by an orthogonal transformation (a rotation or reflection) of the unit sphere, which doesn't affect the uniform distribution; thus we can focus on a particular such map, $f(x)=A^{1/2}x$.

The problem is that you can easily apply $A$, but not $A^{-1/2}$. If the eigenvalues of $A$ are sufficiently similar and you have an efficient way to compute a lower bound $\kappa \le \lambda_{\text{min}}$ on the least eigenvalue of $A$, you can get around this with rejection sampling: sample $x$ uniformly from the unit ball, reject the sample if $x^\top Ax\gt\kappa$, else divide it by $\sqrt{x^\top Ax}$ and use it. The acceptance probability is $\kappa^{n/2}/\sqrt{\det A}$, so if you know $\lambda_{\text{min}}$ exactly, then $\kappa=\lambda_{\text{min}}$ yields the lowest rejection rate (and similar eigenvalues keep that rate close to $1$).
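
A minimal sketch of this rejection loop, again assuming NumPy (the function name and the way $\kappa$ is passed in are my own choices):

    import numpy as np

    def sample_by_rejection(A, kappa, rng=None):
        """Rejection sampling: kappa must be a lower bound on the least
        eigenvalue of A, so that the ellipsoid {x : x^T A x <= kappa}
        lies inside the Euclidean unit ball."""
        rng = np.random.default_rng() if rng is None else rng
        n = A.shape[0]
        while True:
            x = rng.standard_normal(n)
            x /= np.linalg.norm(x)           # uniform direction on the sphere
            x *= rng.uniform() ** (1.0 / n)  # radius density ~ r^(n-1): uniform in the ball
            q = x @ A @ x
            if q <= kappa:                   # accept iff x lies in the ellipsoid
                return x / np.sqrt(q)        # rescale to A-norm 1

Accepted points are uniform in the ellipsoid $\{x : x^\top A x \le \kappa\}$, and radially rescaling them to $A$-norm $1$ makes $A^{1/2}x$ uniform on the unit sphere.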

  • Your first paragraph is the reason I wrote "any/all". (2012-08-26)