I'm starting to study the master theorem. Why does something like
$T(n) = aT(n/b)+f(n)$
solve to
$n^{\log_b a}$?
I'm a bit confused about how this is derived.
The right way to think about this is that we reduce the original problem, of size $n$, to $a$ separate subproblems of size $n/b$, and we do this recursively. At level $i$ of the recursion there are $a^i$ problems, each of size $n b^{-i}$. Solving a problem of size $x$ costs $f(x)$ of non-recursive work, so the total cost at level $i$ is $a^i f(n b^{-i})$.
This recursion has $\log_b n$ levels, so the total work over all levels is $\sum_{i=0}^{\log_b n} a^i f(n b^{-i})$.
Depending on the growth of $f$, this sum is essentially a geometric series. In the case you are asking about, $f$ grows slowly enough that the bottom level dominates: there are $a^{\log_b n} = n^{\log_b a}$ constant-size subproblems at the leaves, and that is exactly where the $n^{\log_b a}$ comes from.
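To see the geometric series concretely, suppose for illustration that $f(n) = n^c$ for some constant $c \ge 0$ (this polynomial form is an assumption, but it covers the common cases). Then
$$\sum_{i=0}^{\log_b n} a^i \left(\frac{n}{b^i}\right)^c = n^c \sum_{i=0}^{\log_b n} \left(\frac{a}{b^c}\right)^i.$$
If $c < \log_b a$ the ratio $a/b^c$ exceeds $1$, so the last term dominates and the sum is $\Theta\!\left(a^{\log_b n}\right) = \Theta\!\left(n^{\log_b a}\right)$. If $c > \log_b a$ the ratio is below $1$, the first term dominates, and the sum is $\Theta(n^c)$. If $c = \log_b a$ every term is equal and the sum is $\Theta(n^{\log_b a} \log n)$. These are exactly the three cases of the theorem.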
For an in-depth explanation and proof of the "Master Theorem" (or "Master Method") for finding asymptotic solutions to recurrences, see MIT OCW's Introduction to Algorithms, Lecture 2.
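If it helps to see the recursion-tree bookkeeping numerically, here is a minimal sketch. The choice $a=4$, $b=2$, $f(n)=n$ and the base case $T(1)=f(1)$ are my own illustrative assumptions, not part of the question: it evaluates the recurrence directly, adds up the per-level costs $a^i f(n/b^i)$, and both agree and grow like $n^{\log_b a} = n^2$.

```python
def T(n, a, b, f):
    """Evaluate T(n) = a*T(n/b) + f(n) directly, with T(1) = f(1) as the base case."""
    if n <= 1:
        return f(1)
    return a * T(n // b, a, b, f) + f(n)

def level_sum(n, a, b, f):
    """Add up the per-level costs a^i * f(n / b^i) of the recursion tree."""
    total, size, mult = 0, n, 1
    while True:
        total += mult * f(size)   # a^i problems of size n/b^i at level i
        if size <= 1:
            break
        size //= b
        mult *= a
    return total

# Illustrative choice (not from the original post): a = 4, b = 2, f(n) = n,
# so log_b a = 2 and the theorem predicts T(n) = Theta(n^2).
a, b, f = 4, 2, (lambda n: n)
for n in (2 ** k for k in range(4, 11)):
    print(n, T(n, a, b, f), level_sum(n, a, b, f), n ** 2)
```

For these parameters the direct recursion and the level sum both come out to $2n^2 - n$, which is $\Theta(n^{\log_2 4}) = \Theta(n^2)$ as the theorem predicts.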