I'm trying to compute some wavelet transforms and I'm running into boundary condition errors. A randomly generated signal is pictured in the top panel of the figure, and the real part of a scaled Morlet wavelet is shown in the middle panel.
I have a list of the sampled function values at the points $\{t_0, t_1, t_2, \dots, t_{N-1}\}$. If I compute a discrete correlation between the two functions in the first two panels, I get the result shown in the third panel. I compute the correlation according to the following formula:
$$z[k] = \left| \sum_{i=0}^{N-1} x[i]\, y^*[i+k] \right|.$$
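For concreteness, here is a minimal NumPy sketch of that sum as I read it, with the zero extension made explicit (the function name and the lag range are my own, not taken from any existing code):

```python
import numpy as np

def corr_magnitude(x, y):
    """z[k] = | sum_{i=0}^{N-1} x[i] * conj(y[i+k]) |, treating y
    as zero wherever i + k falls outside 0..N-1 (zero extension)."""
    N = len(x)
    lags = np.arange(-(N - 1), N)          # all shifts with some overlap
    z = np.empty(lags.size)
    for j, k in enumerate(lags):
        # Only indices where both x[i] and y[i + k] exist contribute;
        # every other term vanishes under the zero-extension assumption.
        i = np.arange(max(0, -k), min(N, N - k))
        z[j] = np.abs(np.sum(x[i] * np.conj(y[i + k])))
    return lags, z
```

Up to the sign convention on the lag axis, this should agree with `np.abs(np.correlate(x, y, mode='full'))`, since NumPy's `correlate` also conjugates its second argument and zero-pads outside the data.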
The formula assumes that both functions are zero outside the interval $[t_0, t_{N-1}]$. I believe the spikes at the end points of my correlation are boundary artifacts, but I'm not sure how to fix them.
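Two standard remedies are sketched below, under assumptions that may not match the actual setup (a wavelet truncated to a grid shorter than the signal; `N`, `sigma`, and `omega0` are made-up placeholders): either taper the signal so it genuinely goes to zero at the ends, or keep only the "valid" lags where the wavelet fully overlaps real data, so no zero-padded samples enter the sum.

```python
import numpy as np
from scipy.signal.windows import tukey

# --- hypothetical stand-ins for the data; substitute your own arrays ---
rng = np.random.default_rng(0)
N = 1024
x = rng.standard_normal(N)                           # random signal

t = np.linspace(-4, 4, 257)                          # wavelet grid, shorter than x
sigma, omega0 = 1.0, 5.0
y = np.exp(1j * omega0 * t - t**2 / (2 * sigma**2))  # (unnormalized) Morlet wavelet

# Remedy 1: taper the signal ends so the zero extension assumed by the
# formula no longer creates a jump discontinuity at t_0 and t_{N-1}.
x_tapered = x * tukey(N, alpha=0.1)                  # alpha sets the taper width

# Remedy 2: 'valid' mode only returns lags where y overlaps x completely,
# so no implicitly zero-padded samples ever enter the sum.
z = np.abs(np.correlate(x_tapered, y, mode='valid'))
lags = np.arange(z.size)                             # shift of the wavelet along x
```

If the full lag range is needed, an alternative is to extend the signal by reflection before correlating and then discard the contaminated lags, but whichever padding is chosen, values within one wavelet support of the boundary remain influenced by invented samples.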