I'm trying to find the best estimates of $a$ and $b$ by fitting the equation below to given data $(y_t, C_t)$:
$y_t = a\,(1-e^{-b})\,e^{-bt} \sum_{x=-20}^{t} C_x e^{bx} + \gamma + \epsilon_t$
where $t=0$ denotes the first month used in the regression.
Instead of solving the normal equations, I computed the estimates by maximizing the correlation coefficient over a grid of values of $b$. So basically, for each $b$ from $0$ to $1$ in increments of $0.01$, I computed the regressor $(1-e^{-b}) \sum_x C_x e^{bx}$, ran the regression above to obtain the corresponding estimate of $a$, and kept the pair of estimates of $a$ and $b$ that gave the highest correlation coefficient. However, I also need to compute confidence intervals for $a$ and $b$. How would I go about computing the standard error of my estimate of $b$?
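For concreteness, here is a rough sketch of the procedure I described (Python; `y` and `C` are placeholder arrays standing in for my data, and the regressor includes the $e^{-bt}$ factor from the equation above):

```python
import numpy as np
from scipy import stats

# Placeholder data standing in for my actual series.
# y[t] holds y_t for t = 0..T-1; C is offset so that C[x + 20] is C_x
# for x = -20..T-1.
T = 60
rng = np.random.default_rng(0)
y = rng.normal(size=T)
C = rng.normal(size=T + 20)

def regressor(b, t):
    """X_t(b) = (1 - e^{-b}) e^{-bt} * sum_{x=-20}^{t} C_x e^{bx}."""
    xs = np.arange(-20, t + 1)
    return (1 - np.exp(-b)) * np.exp(-b * t) * np.sum(C[xs + 20] * np.exp(b * xs))

best = (-np.inf, None, None)  # (|r|, b, a)
# b = 0 makes the regressor identically zero, so the grid starts at 0.01.
for b in np.arange(0.01, 1.005, 0.01):
    X = np.array([regressor(b, t) for t in range(T)])
    # OLS of y on X: the slope estimates a, the intercept estimates gamma.
    fit = stats.linregress(X, y)
    if abs(fit.rvalue) > best[0]:
        best = (abs(fit.rvalue), b, fit.slope)

r_best, b_hat, a_hat = best
print(f"b_hat = {b_hat:.2f}, a_hat = {a_hat:.4f}, |r| = {r_best:.4f}")
```

The `stderr` returned by `linregress` gives a standard error for $a$ conditional on the chosen $b$, but it's the standard error for $\hat b$ itself from this grid search that I don't know how to obtain.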