I'm building a pension framework and don't want to brute-force this answer, but I can't figure out a clean solution. Essentially, I am calculating the total fund value at the time of retirement, given the following:
$$Fund = \sum_{i=1}^t [\cfrac{I\cdot e^{\frac{\pi i}{12K}}}{12K} \cdot C \cdot e^{\frac{Ri}{12K}}]$$
$I =$ annual income, $K =$ pay periods per month, $C =$ contribution rate (as a fraction of income), $R =$ expected annualized return (continuous), $\pi =$ expected annual income growth (continuous), $t =$ pay periods until retirement
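For concreteness, here is a minimal Python sketch of the fund sum above; `g` stands in for the income-growth rate written as $\pi$ in the formulas, and all names are my own:

```python
import math

def fund_value(I, K, C, R, g, t):
    """Total fund value at retirement: the i-th pay period contributes
    C * (I * e^(g*i/(12K)) / (12K)), grown at continuous rate R for
    i/(12K) years, summed over t pay periods."""
    n = 12 * K  # pay periods per year
    return sum(
        (I * math.exp(g * i / n) / n) * C * math.exp(R * i / n)
        for i in range(1, t + 1)
    )
```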
Taking the partial derivatives:
$$ \cfrac{dFund}{dC} = \sum_{i=1}^t [\cfrac{I\cdot e^{\frac{\pi i}{12K}}}{12K} \ \cdot e^{\frac{Ri}{12K}}]$$
$$\cfrac{dFund}{dR} = \sum_{i=1}^t [\cfrac{I\cdot e^{\frac{\pi i}{12K}}}{12K} \cdot C \cdot e^{\frac{Ri}{12K}} \cdot \frac{i}{12K}]$$
If $\Delta Fund_{C} = \Delta C \cdot \cfrac{dFund}{dC}$
How do I solve for $\Delta R$ if I want $\Delta R \cdot \cfrac{dFund}{dR} = \Delta Fund_{C} = \Delta C \cdot \cfrac{dFund}{dC}$?
Basically, is there a way to extract the value of the $\cfrac{i}{12K}$ term within the summation so that it can be expressed outside the summation?
In other words, I want to compute it as:
$$\cfrac{dFund}{dR} = C \cdot \Sigma(\frac{i}{12K}) \cdot \sum_{i=1}^t [\cfrac{I\cdot e^{\frac{\pi i}{12K}}}{12K} \cdot e^{\frac{Ri}{12K}}] = C \cdot \Sigma (\frac{i}{12K}) \cdot \frac{dFund}{dC}$$
The goal is that by doing so, the problem would easily simplify to
$$\Delta R \cdot C \cdot \Sigma (\frac{i}{12K}) \cdot \frac{dFund}{dC} = \Delta Fund_C$$
or, using substitution and rearranging,
$$\Delta R = \Delta C \cdot [C \cdot \Sigma (\frac{i}{12K})]^{-1}$$
which would be a much cleaner solution.
Currently, my code calculates $\Delta Fund_R$ for a large sequence of $\Delta R$ values and picks the one whose $\Delta Fund_R$ is closest to $\Delta Fund_C$, which is incredibly inefficient from a resource standpoint.
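The grid search I'm describing looks roughly like this (a sketch in Python; the grid resolution `steps` and search bound `max_dR` are illustrative assumptions, not values from my actual code):

```python
import math

def match_delta_R(I, K, C, R, g, t, delta_C, steps=10_000, max_dR=0.05):
    """Brute-force search: scan a grid of candidate delta-R values and
    keep the one whose fund change best matches the fund change from
    bumping the contribution rate by delta_C."""
    n = 12 * K  # pay periods per year

    def fund(C_, R_):
        return sum(
            (I * math.exp(g * i / n) / n) * C_ * math.exp(R_ * i / n)
            for i in range(1, t + 1)
        )

    target = fund(C + delta_C, R) - fund(C, R)  # delta-Fund from the C bump
    best_dR, best_err = 0.0, float("inf")
    for k in range(1, steps + 1):
        dR = max_dR * k / steps
        err = abs(fund(C, R + dR) - fund(C, R) - target)
        if err < best_err:
            best_dR, best_err = dR, err
    return best_dR
```

Every candidate requires re-summing the whole series, so the cost is O(steps × t), which is why I'm looking for a closed-form relation instead.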
Is this mathematically possible?