(I asked this question on stackoverflow.com, but I am now thinking my mistake may be mathematical rather than a programming one.)
I am simulating geometric Brownian motion, using the closed-form solution for the value after an arbitrary (not necessarily infinitesimal) step dt. My drift mu = 0, standard deviation sigma = 1, and initial value p = 1.
Of course, the expected value at any future point of such GBM is 1.
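(For reference, my reasoning for that expected value, using $\mathbb{E}[e^{\sigma W_t}] = e^{\sigma^2 t/2}$ for $W_t \sim N(0, t)$:

$$\mathbb{E}[P_t] = P_0\,\mathbb{E}\!\left[e^{(\mu - \sigma^2/2)t + \sigma W_t}\right] = P_0\, e^{(\mu - \sigma^2/2)t}\, e^{\sigma^2 t/2} = P_0\, e^{\mu t} = 1 \quad \text{for } \mu = 0,\ P_0 = 1.)$$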
But after 100 steps, I find my GBM process in the range between 10^-30 and 10^-15. Typically it's around 10^-23, and falling consistently. It seems that each step my process tends to fall by a roughly constant factor of about 1.7. If I double my standard deviation, my process falls twice as fast (by a factor of ~3.4 each step).
I'm sure something is wrong with my setup. Here's my Python code, which is nearly readable as pseudocode:
```python
import math
import random

p = 1.0
dt = 1.0
mu = 0.0
sigma = 1.0
for k in range(100):  # repeat loop 100 times
    dW = random.normalvariate(0, math.sqrt(dt))  # Brownian increment dW ~ N(0, dt)
    p = p * math.exp((mu - sigma * sigma / 2) * dt + sigma * dW)
print(p)
```

(In my original paste I had a bare `exp`; it is of course `math.exp`.)
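For context, here is a single-step sanity check I ran (the helper `gbm_step` is just my own refactoring, not from the script above): over many independent draws, the sample mean of one step stays close to the theoretical 1, while the sample median of the step factor sits well below 1, which matches the steady fall I observe in individual paths.

```python
import math
import random

def gbm_step(mu=0.0, sigma=1.0, dt=1.0):
    # one exact (closed-form) GBM step, starting from p = 1
    dW = random.normalvariate(0, math.sqrt(dt))
    return math.exp((mu - sigma * sigma / 2) * dt + sigma * dW)

random.seed(0)
n = 100_000
samples = sorted(gbm_step() for _ in range(n))
mean = sum(samples) / n
median = samples[n // 2]
print(mean)    # close to 1, the theoretical expectation
print(median)  # close to exp(-1/2) ~ 0.61, so a "typical" step shrinks p
```

So the mean of the distribution really is 1, but it is carried by rare large outcomes; a typical single path still shrinks each step.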
Thanks...
Edit: I fixed `normalvariate` to use `sqrt(dt)` rather than `dt * dt`. Thanks El Moro.