
Suppose $n$ is known to have exactly two prime factors, and that the computer has a database of all the primes $<\sqrt{n}$. Then, unless $n$ is a perfect square, one of the two factors is $<\sqrt{n}$. If an algorithm could halve the number of candidate primes $<\sqrt{n}$ with each step, the worst-case computing time would be minimised. By the prime number theorem, the number of primes to check is asymptotically $\pi(\sqrt{n})\sim\frac{\sqrt{n}}{\ln\sqrt{n}}=\frac{2\sqrt{n}}{\ln n}$ as $n\rightarrow\infty$. Hence, if each step eliminated half the possibilities, the number of steps needed to factorise $n$ would tend towards $\log_{2}\left(\frac{2\sqrt{n}}{\ln n}\right)$.
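For concreteness, here is a minimal Python sketch (mine, not part of the original post) comparing the exact candidate count $\pi(\sqrt{n})$ with the PNT estimate $\frac{2\sqrt{n}}{\ln n}$ and the resulting $\log_2$ step bound; the sample values of $n$ and the helper name `prime_count_below` are illustrative assumptions.

    import math

    def prime_count_below(limit):
        """Count primes < limit with a Sieve of Eratosthenes."""
        sieve = bytearray([1]) * limit
        sieve[0:2] = b"\x00\x00"
        for i in range(2, math.isqrt(limit) + 1):
            if sieve[i]:
                sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
        return sum(sieve)

    for n in (10**6, 10**9, 10**12):              # sample sizes, chosen arbitrarily
        exact = prime_count_below(math.isqrt(n))  # pi(sqrt(n)): candidate primes
        approx = 2 * math.sqrt(n) / math.log(n)   # PNT estimate 2*sqrt(n)/ln(n)
        bound = math.log2(exact)                  # halving model: log2(#candidates)
        print(f"n=10^{round(math.log10(n))}: pi(sqrt n)={exact}, "
              f"PNT={approx:.0f}, log2 bound={bound:.1f} steps")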

What analogous lower bound would the factorising time tend towards if $n$ were known to have 3 prime factors, or 4 prime factors?

  • Um, how exactly is it reducing the number of possible factors by 1/2 each step? No algorithm I know of does this. (2011-10-05)
  • Ignoring the point that Craig raised, that this isn't actually possible, you could derive a bound by simply iterating the algorithm. If you have, e.g., 4 factors: run it, find a factor, divide n by that factor, and repeat (see the sketch after these comments). (2011-10-05)
  • @Craig No known algorithm can. The point is that it is impossible for any algorithm to factorise n in fewer steps than that for every n. (2011-10-06)
  • Why is that? I'm not seeing that as being immediately obvious. (2011-10-06)
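To make the iterate-and-divide idea from the second comment concrete, here is a minimal Python sketch (mine, not from the thread). A plain trial-division routine, `find_factor`, stands in for the hypothetical halving algorithm; the outer loop divides n by each factor found and repeats, as the comment suggests.

    import math

    def find_factor(n):
        """Stand-in for the hypothetical halving algorithm: return the
        smallest prime factor of n (or n itself when n is prime)."""
        for p in range(2, math.isqrt(n) + 1):
            if n % p == 0:
                return p
        return n

    def factorise(n):
        """Find a factor, divide n by it, and repeat until n is exhausted."""
        factors = []
        while n > 1:
            p = find_factor(n)
            factors.append(p)
            n //= p
        return factors

    print(factorise(3 * 5 * 7 * 11))  # a 4-factor example -> [3, 5, 7, 11]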
