
Given the computational complexities of two algorithms:

$\exp(O(\sqrt{\log n \log \log n}))$

$O(\sqrt{\exp n} / \log{ \sqrt{ \exp n} })$

where I imagine the first goes to infinity more slowly than the second, what would be a "simple" function $F(n)$ such that multiplying the second complexity by $F(n)$ makes it go to infinity at the "same speed" as the first? (More precisely, so that the ratio of the two functions is asymptotically bounded.)
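
To make the "bounded ratio" requirement concrete, here is a minimal numerical sketch (not an answer, just an illustration): it compares the two expressions on a logarithmic scale, assuming the hidden constant in the $O(\cdot)$ is $1$ and that the same $n$ is plugged into both formulas. The printed gap $\log f_2(n) - \log f_1(n)$ equals $-\log F(n)$ for the sought $F$, since $F \cdot f_2$ should track $f_1$. The function names `log_f1`/`log_f2` and the sample values of $n$ are my own.

```python
import math

def log_f1(n):
    # log of exp(sqrt(log n * log log n)), taking the hidden O-constant as 1
    return math.sqrt(math.log(n) * math.log(math.log(n)))

def log_f2(n):
    # log of sqrt(exp(n)) / log(sqrt(exp(n))) = exp(n/2) / (n/2)
    return n / 2 - math.log(n / 2)

# log_f2 - log_f1 = log(f2/f1) = -log F(n), if F * f2 is to grow like f1.
for n in (10, 100, 1000, 10000):
    print(n, round(log_f1(n), 2), round(log_f2(n), 2), round(log_f2(n) - log_f1(n), 2))
```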

[Note that the first expression comes essentially from Wikipedia: http://en.wikipedia.org/wiki/Integer_factorization#General-purpose, while the second is meant to represent a simple process of trial division by the primes below the square root of the number being factored]
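
For concreteness, a rough sketch of that simple process (my own illustrative code; sympy is used only to supply the prime list, whose cost the question treats as zero):

```python
import math
from sympy import primerange  # the prime list is assumed to be available "for free"

def trial_division_by_primes(N):
    """Divide N by every prime up to sqrt(N); any cofactor left over is prime."""
    limit = math.isqrt(N)
    factors = []
    trials = 0  # one trial per prime, roughly pi(sqrt(N)) ~ sqrt(N)/log(sqrt(N))
    for p in primerange(2, limit + 1):
        trials += 1
        while N % p == 0:
            factors.append(p)
            N //= p
    if N > 1:
        factors.append(N)
    return factors, trials

print(trial_division_by_primes(2**32 + 1))  # ([641, 6700417], 6542)
```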

  • My goal was to understand how much the best algorithms outperform an (imaginary) simple process of trial division by the primes below $\sqrt{n}$, of which there are about $\sqrt{n}/\log\sqrt{n}$, assuming the cost of producing and accessing the list of primes to be zero (a quick numeric check is sketched below). My expression has $\exp(n)$ because in the first one, $n$ is the number of digits, not the number being factored. – 2012-12-17
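
A quick check of that prime-count estimate, using sympy's exact prime-counting function (the test values of $N$ are arbitrary):

```python
import math
from sympy import primepi  # exact count of primes <= x

# Compare pi(sqrt(N)) with the estimate sqrt(N)/log(sqrt(N)) from the comment.
for N in (10**6, 10**8, 10**12):
    x = math.isqrt(N)
    print(N, int(primepi(x)), round(x / math.log(x)))
```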

0 Answers