The key point when you're uniformly sampling directions from the unit sphere is that in three dimensions the $z$ component is itself uniformly distributed. So you can ignore the sideways motion and effectively have a one-dimensional random walk with steps uniformly distributed in $[-1,1]$ (after scaling the step length to $1$).
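In case it's useful, here is a minimal numpy check of that fact (the function name is just for this sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_step(rng):
    # Uniform direction on the unit sphere: z ~ Uniform[-1, 1],
    # azimuth ~ Uniform[0, 2*pi).
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * np.pi)
    r = np.sqrt(1.0 - z * z)
    return np.array([r * np.cos(phi), r * np.sin(phi), z])

# The z components alone behave like Uniform[-1, 1]: mean ~ 0, variance ~ 1/3.
zs = np.array([random_unit_step(rng)[2] for _ in range(100_000)])
print(zs.mean(), zs.var())
```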
The proportion of valid walks for given $n$ as a function of $z$ is piecewise polynomial of degree at most $n$, with one polynomial for each interval between two integers. If $n$ isn't too large, you can compute these polynomials, then do a not-too-wasteful form of rejection sampling by bounding them by a constant or linear or quadratic function over $[z_0-1,z_0+1]$. It will take a bit of bookkeeping to treat the boundary at $z=0$ correctly, but it shouldn't be too difficult.
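If it helps to see the polynomials appear, here is a rough sympy sketch of the convolution recursion, assuming (purely for illustration, since it depends on what exactly counts as valid in your problem) that a walk is valid if it ends at a height $\le0$:

```python
import sympy as sp

z, t = sp.symbols('z t', real=True)

# Proportion of valid continuations with 0 steps remaining, as a function of
# the current height t.  Assumption for this sketch: valid = ending at <= 0.
f = sp.Piecewise((1, t <= 0), (0, True))

# Each additional remaining step convolves with Uniform[-1, 1]:
#   f_new(z) = (1/2) * Integral(f_old(t), (t, z - 1, z + 1))
for _ in range(3):
    f = sp.integrate(f, (t, z - 1, z + 1)) / 2
    f = sp.piecewise_fold(f.rewrite(sp.Piecewise)).subs(z, t)

# A piecewise polynomial of degree 3 with breakpoints at integer values.
print(sp.simplify(f.subs(t, z)))
```

The actual validity condition only changes the initial indicator, not the structure; each pass through the loop raises the degree by one.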
Here's an idea which I think will only work for the almost valid case, but it might work well for that.
The problem is that the piecewise polynomial distributions get increasingly complicated when you convolve them with the uniform distribution on $[-1,1]$. So we'd like to replace them with something for which the convolution with the uniform distribution is easy to compute and doesn't increase the complexity. A sum of sinusoids fits that bill, since the convolution just multiplies each sinusoid by a constant.
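For instance, convolving the single mode $\cos(kz)$ with the uniform step distribution gives
$ \tfrac12\int_{-1}^{1}\cos\bigl(k(z-s)\bigr)\,ds=\frac{\sin k}{k}\,\cos(kz)\;, $
so each convolution merely damps the mode of wavenumber $k$ by the factor $\sin k/k$.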
Start from the end with a constant density on $[z_0-n,0]$. Convolve this with the uniform distribution on $[-1,1]$ some number of times, perhaps $100$ or $1000$, as many steps as you can afford to handle the exact polynomials for. The result now strongly resembles a Gaussian. Choose a sufficiently large interval, centred on the mean of the distribution at $(z_0-n)/2$, such that both the bulk of the distribution and any $z$ values you expect to reach are well inside it. Compute a partial sum of the Fourier series of the distribution on this interval, bound the magnitude of the remaining tail, and add a corresponding constant to the partial sum. This yields a Fourier sum that approximates the distribution and bounds it from above. Since the near-Gaussian is quite smooth, you won't need too many terms to get a good approximation.
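As a rough numerical sketch of this construction (the values of $z_0$, $n$, the number of exact convolutions $M$, the mode cutoff $K$, and the reading of almost valid as ending at a height $\le0$ are all just placeholders):

```python
import numpy as np

z0, n = 5.0, 400            # placeholder start height and walk length
M = 50                      # number of convolutions we can afford to treat exactly
K = 30                      # Fourier modes kept on each side of zero

dx = 0.02
lo, hi = z0 - n - M - 2.0, M + 2.0   # interval centred on the mean (z0 - n)/2
x = np.arange(lo, hi, dx)
f = ((x >= z0 - n) & (x <= 0)).astype(float)   # constant density on [z0 - n, 0]

# Convolve M times with a discretised Uniform[-1, 1]; the kernel sums to 1,
# so f stays a proportion between 0 and 1.
kern = np.ones(int(round(2.0 / dx)))
kern /= kern.sum()
for _ in range(M):
    f = np.convolve(f, kern, mode="same")

# Truncated Fourier series of f on [lo, hi].
L = hi - lo
modes = np.arange(-K, K + 1)
k = 2.0 * np.pi * modes / L
coef = np.array([(f * np.exp(-1j * kk * x)).sum() * dx / L for kk in k])

def fourier_sum(zz):
    """Partial Fourier sum approximating the proportion function at stage M."""
    return np.real(np.sum(coef * np.exp(1j * k * zz)))

# To turn the approximation into an upper bound, one would also bound the
# magnitude of the dropped modes and add that constant to the k = 0
# coefficient, as described above.
```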
This Fourier sum is now your new target. You can very easily propagate it to $n$, even for $n\gg1000$, since each convolution is just a multiplication of the coefficients. Start with $(n,z_0)$, pretending that you're trying to sample from the target distribution. Then, when the walk reaches the stage at which the Fourier sum was constructed, do a single rejection step: reject the generated walk if a number drawn uniformly between $0$ and the value of the Fourier sum lies above the correct distribution. This gives you the right distribution, and from there you can proceed with the exact polynomials.
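Continuing that sketch, the propagation and the per-step sampling could look as follows (again only a sketch; `exact_proportion_M` stands for the exact piecewise polynomial at the stage with $M$ steps remaining and is not implemented here):

```python
import numpy as np

def propagate(coef, k, p):
    """Fourier coefficients after p further convolutions with Uniform[-1, 1]:
    the mode exp(i*k*z) just picks up a factor (sin k / k)**p."""
    damp = np.ones_like(k)
    nz = k != 0
    damp[nz] = np.sin(k[nz]) / k[nz]
    return coef * damp**p

def surrogate(zz, coef, k):
    return np.real(np.sum(coef * np.exp(1j * k * zz)))

def sample_step(zz, coef_next, k, rng):
    """One z-increment with density proportional to surrogate(zz + s) on
    [-1, 1], by rejection against a crude grid-based bound (assumes the
    surrogate is positive in the region the walk actually explores)."""
    s_grid = np.linspace(-1.0, 1.0, 201)
    bound = 1.05 * max(surrogate(zz + s, coef_next, k) for s in s_grid)
    while True:
        s = rng.uniform(-1.0, 1.0)
        if rng.uniform(0.0, bound) < surrogate(zz + s, coef_next, k):
            return s

# With m steps remaining at height zz, draw the next step using
#   sample_step(zz, propagate(coef, k, m - 1 - M), k, rng)
# as long as m - 1 >= M.  When the walk reaches the stage with M steps
# remaining at height zz, do the single correction: keep the walk only if
#   rng.uniform(0.0, surrogate(zz, coef, k)) <= exact_proportion_M(zz),
# then finish the last M steps with the exact piecewise polynomials.
```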
I wrote the answer below when I misread the question as referring to random walks on a lattice; I'm letting it stand since it might be interesting to others in its own right.
I like the question. (I used to do lattice gauge theory simulations.)
To do this, you need to know the number $a(n,z)$ of (almost) valid walks of $n$ steps starting from height $z$.
If you shift $z$ by $1$ so that a walk is valid if it ends at $z=0$, which simplifies the indexing slightly, the numbers $a(n,z)$ for valid walks are given by OEIS sequence A052179. The entry gives several prescriptions for generating them. I think the most efficient one for you is probably the recurrence relation,
$ a(n,z)=a(n-1,z-1)+4a(n-1,z)+a(n-1,z+1)\;, $
since you only need to compute $a(n,z_0)$ once at the beginning and store the intermediate results, which will contain all values you need later in the simulation.
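As a concrete (if minimal) version of this bookkeeping, assuming $a(m,z)$ counts the valid $m$-step continuations from height $z$ after the shift (so the only nonzero initial value is at $z=0$):

```python
def build_counts(n):
    """table[m][z] = a(m, z), the number of valid m-step continuations from
    height z; a(0, z) is 1 at z = 0 and 0 elsewhere."""
    table = [{0: 1}]
    for m in range(1, n + 1):
        prev, cur = table[-1], {}
        for z in range(-m, m + 1):
            cur[z] = prev.get(z - 1, 0) + 4 * prev.get(z, 0) + prev.get(z + 1, 0)
        table.append(cur)
    return table
```

(For large $n$ the counts grow like $6^n$, so in practice one might store the ratios needed for the step probabilities as floating-point numbers instead of the exact integers.)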
Once you know the values $a(n-1,z-1)$, $a(n-1,z)$ and $a(n-1,z+1)$ at your current point, all you have to do is draw a random number uniformly between $1$ and $a(n,z)$ and choose where to go according to which of the six intervals (one of length $a(n-1,z-1)$, one of length $a(n-1,z+1)$, and four of length $a(n-1,z)$) it falls into.
The recurrence relation is the same for almost valid walks; only the initial conditions are different.
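Putting the recurrence and the sampling step together, a minimal sketch (reusing `build_counts` from above; for almost valid walks only the $m=0$ row of the table changes):

```python
import random

def sample_walk_heights(n, z0, table):
    """Heights of a uniformly chosen valid n-step walk starting at z0; each
    sideways height step would additionally get one of the four lateral
    directions chosen uniformly at random."""
    a = lambda m, zz: table[m].get(zz, 0)
    if a(n, z0) == 0:
        raise ValueError("no valid walk from this starting point")
    heights, z = [z0], z0
    for m in range(n, 0, -1):
        r = random.randrange(a(m, z))               # uniform in 0 .. a(m, z) - 1
        if r < a(m - 1, z - 1):                     # step down
            z -= 1
        elif r < a(m - 1, z - 1) + a(m - 1, z + 1): # step up
            z += 1
        # otherwise: one of the four sideways moves, z unchanged
        heights.append(z)
    return heights

table = build_counts(10)
print(sample_walk_heights(10, 2, table))
```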