I'm not sure if this is the best place for this question since it's programming-related, but I don't have a better idea right now (CS seems too high-level, and I don't want to focus on implementation). I hope I've managed to bring it down from a computer-science problem to a purely mathematical one.
Anyway, I've got code that looks something like this:
    a = 1; b = n;
    while (a < b) do {
        for i = 1, i < a, i++ do {
            /* blah blah, this is constant c */
        }
        a = a * 3;
        b = b / 2;
    }
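In case it helps to experiment, here's a minimal Python transliteration of the pseudocode (my assumption: each execution of the inner loop's body counts as one unit of the constant work $c$, so I count units instead of doing any actual work):

```python
def count_ops(n):
    """Count how many times the constant-work body would execute."""
    a, b = 1, n
    ops = 0
    while a < b:
        for i in range(1, a):  # mirrors "for i = 1, i < a, i++"
            ops += 1           # one unit of the constant work c
        a = a * 3
        b = b / 2
    return ops
```

Counting like this lets me sanity-check any closed form against small values of $n$.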
I'm supposed to calculate the exact computational complexity (the minor overhead of the while and for loops themselves isn't important in this question).
Now I know that the while loop will repeat a certain number of times, and that that number will be a logarithm; that's not the problem. My problem is the for loop. I know that on each pass of the while loop I get $ \sum_{i=1}^{a} c $ repetitions of $c$, and that this sum is evaluated $ \log_{6}{n}-1 $ times.
If $a$ didn't change, I'd get $ac( \log_{6}{n}-1)$: since the summand is a constant, I could just multiply the constant by the upper bound and then multiply that by the number of times the whole sum repeats.
In this case, however, the upper limit of the sum changes on every pass. So how exactly do I calculate the number of times $c$ gets added to itself?
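To make the question concrete: if I write $a_k = 3^k$ for the value of $a$ on the $k$-th pass of the while loop (starting at $k = 0$), then, assuming the loop makes $K$ passes in total, the quantity I'm trying to evaluate is

$$ \sum_{k=0}^{K-1} \sum_{i=1}^{3^k} c \;=\; c \sum_{k=0}^{K-1} 3^k, $$

where $K$ is the logarithmic count from above. The inner sum is easy, but the outer sum has a growing upper limit, and that's the part I don't know how to handle.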