Suppose that n is known to have exactly two prime factors, and that the computer has a database of all the primes $<\sqrt{n}$. Then, unless n is a perfect square, one of the factors must be $<\sqrt{n}$. If an algorithm could halve the number of candidate primes $<\sqrt{n}$ at each step, the worst-case computing time would be minimised. By the prime number theorem, as $n\rightarrow \infty$ the number of primes to check approaches $\pi(\sqrt{n}) \sim \frac{\sqrt{n}}{\ln{\sqrt{n}}} = \frac{2\sqrt{n}}{\ln{n}}$. Hence, if each step eliminated half the possibilities, the number of steps taken to factorise n would tend towards $\log_{2}{\left(\frac{2\sqrt{n}}{\ln{n}}\right)}$.
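As a sanity check on the count and the step bound, here is a minimal sketch (not from the source; the sieve, the helper names, and the example $n = 10^{10}$ are all my own choices) comparing the exact prime count $\pi(\sqrt{n})$ against the asymptotic estimate:

```python
import math

def primes_below(limit):
    """All primes strictly below `limit` (sieve of Eratosthenes)."""
    if limit <= 2:
        return []
    sieve = [True] * limit
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(limit) + 1):
        if sieve[p]:
            for m in range(p * p, limit, p):
                sieve[m] = False
    return [i for i, is_prime in enumerate(sieve) if is_prime]

def halving_steps(n):
    """Exact candidate count pi(sqrt(n)), and the worst-case number of
    steps if each step eliminates half of the remaining candidates."""
    count = len(primes_below(math.isqrt(n) + 1))  # primes <= floor(sqrt(n))
    return count, math.ceil(math.log2(count))

def asymptotic_steps(n):
    """The limiting bound log2(2*sqrt(n)/ln(n)) discussed above."""
    return math.log2(2 * math.sqrt(n) / math.log(n))

n = 10**10
count, steps = halving_steps(n)  # pi(10**5) = 9592, so 14 halving steps
print(count, steps, round(asymptotic_steps(n), 2))
```

For $n = 10^{10}$ there are 9592 candidate primes below $\sqrt{n} = 10^{5}$, so 14 halvings suffice, while the asymptotic expression gives roughly 13.1; the estimate undershoots slightly at this size because $\frac{x}{\ln{x}}$ underestimates $\pi(x)$.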
What similar lower bound would the factorising time tend towards if n were known to have three prime factors, or four?
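A sketch of the generalisation, extrapolating the same heuristic (my own working, not from the source): if n has k prime factors, the smallest of them is at most $n^{1/k}$, so the search for it runs over the primes $< n^{1/k}$, and by the prime number theorem

$$\pi\!\left(n^{1/k}\right) \sim \frac{n^{1/k}}{\ln{n^{1/k}}} = \frac{k\,n^{1/k}}{\ln{n}},$$

so halving the candidates at each step suggests a bound of $\log_{2}{\left(\frac{k\,n^{1/k}}{\ln{n}}\right)}$ steps to locate that first factor: $\log_{2}{\left(\frac{3\,n^{1/3}}{\ln{n}}\right)}$ for $k=3$ and $\log_{2}{\left(\frac{4\,n^{1/4}}{\ln{n}}\right)}$ for $k=4$. Note this counts only the search for the smallest factor; fully factorising n would then require factoring the remaining cofactor as well.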