5

Just curious, what would be the computational complexity of computing $3^{n^n}$?

I am not sure what it would be like.

  • 0
    If you are computing $3^{n^n}$ modulo an integer $M$ and you know it's totient function $\varphi(M)$ then you can compute $n^n$ in $O(\log n)$ multiplications $\pmod{ \varphi(M)}$ and then rise $3$ to the result in $O(\log M)$ multiplications mod $M$ finding the result in polynomial time. The problem is to find $\varphi(M)$. Some times it can be done in polynomial time (for example if $M$ is prime then $\phi(M)=M-1$), but in general if $M$ is composite finding $\varphi(M)$ is equivalent to finding the factorization of $M$.2012-11-06

3 Answers

2

Let $M(n)$ denote the complexity of multiplying two $n$-digit integers (say in base two). Thanks to Schönhage and Strassen we know that $M(n) = O(n \log n \log\log n)$. Thanks to Fürer, we even know it is slightly lower.

Let $a$ and $b$ be integers. Then $a^b$ can be computed in time $O(M(b \log_2 a))$ using fast exponentiation. This is not bad: since $a^b$ has bit-size roughly $b\log_2 a$, the cost of the computation is nearly linear in the size of the output.
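As a sketch, fast exponentiation (square-and-multiply) in Python, where the output's bit-size can be checked against $b\log_2 a$; `fast_pow` is just an illustrative name, since the built-in `pow` already does this:

```python
def fast_pow(a, b):
    """Compute a**b with O(log b) big-integer multiplications."""
    result = 1
    while b > 0:
        if b & 1:        # fold in the current power of a
            result *= a
        a *= a           # repeated squaring: a, a^2, a^4, ...
        b >>= 1
    return result

x = fast_pow(3, 1000)
print(x.bit_length())    # 1585, i.e. floor(1000 * log2(3)) + 1
```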

So the complexity of computing $a^{b^c}$ is the cost of computing $b^c$ plus the cost of raising $a$ to that power. When $a$ is at least $2$, the second operation dominates the first, and the total cost is $O(M(b^c \log_2 a))$.

We could make the constant in the big-oh explicit, and we would find that it is not much larger than $2$ or $3$.

1

If I understand the question correctly:

The problem is that the terms grow enormously fast. The sequence is

$< 3, 81, 19683, 43046721,\ldots >$

The number of digits of the $n$th term is $\lceil n^n\log_{10}3\rceil$. So the $5$th and $6$th terms have $1492$ and $22261$ digits respectively.

The $7$th, $8$th and $9$th terms have $392930$, $8004767$ and $184846550$ digits respectively.

And even the exponent of the $10$th term, $10^{10}$, has $11$ digits itself.
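These counts are easy to verify with a short script; `digits_of_term` is a helper name chosen here for illustration:

```python
import math

def digits_of_term(n):
    """Decimal digits of 3^(n^n): floor(n^n * log10 3) + 1, which equals
    ceil(n^n * log10 3) since log10 3 is irrational."""
    return math.floor(n**n * math.log10(3)) + 1

for n in range(5, 9):
    print(n, digits_of_term(n))   # 1492, 22261, 392930, 8004767

# direct check against the actual power for a small case
assert digits_of_term(5) == len(str(3**(5**5)))
```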

1

For simplicity, I will first assume a computational model that only keeps track of the number of multiplications (and that each multiplication can be done in time independent of the size of the integers involved).

We may then compute $3,3^2,(3^2)^2,\ldots, 3^{(2^n)}$ with $0,1,\ldots, n$ multiplications, respectively. As any number less than $2^{n+1}$ is a sum of distinct powers of two in $\{1,2,4,\ldots, 2^n\}$, it suffices in the calculation of $3^{n^n}$ to first calculate $3^{2^j}$ for $j \leq \lfloor n \log_2 n\rfloor$, and then take a product of at most $\lfloor n \log_2 n \rfloor +1$ of these $3^{2^j}$ terms. The preliminary calculations take $\lfloor n \log_2 n \rfloor$ multiplications, and multiplying the selected terms together requires at most another $\lfloor n\log_2 n \rfloor +1$. We then conclude with $\sim 2n \log_2 n$ multiplications in total.

Of course, this first answer is silly because the time required to multiply integers depends heavily on their length. In fact, merely outputting the answer requires time $O(n^n)$, as $3^{n^n}$ has $O(n^n)$ digits.

Now, let's assume it takes time at most $O(f(n))$ to square an $n$-digit number. Then calculating $3^{2^j} \times 3^{2^j}=3^{2^{j+1}}$ takes time $O(f(2^j))$, and so we can compute $3^{2^j}$ for $j \leq \lfloor n \log_2 n\rfloor$ in time $ O\left(\sum_{j=1}^{\lfloor n \log_2 n \rfloor} f(2^j)\right).$ For $f(n)= n\log n \log \log n$ (see here), this gives a total time of $\sum_{j=1}^{\lfloor n \log_2 n\rfloor}O(2^j \log 2^j \log \log 2^j)=\sum_{j=1}^{\lfloor n \log_2 n\rfloor}O(2^j\, j\log j)=O(n^{n+1}\log^2 n),$ using comparison with an integral and some Mathematica magic. This is quite reasonable, as we have already seen that the lower bound $\Omega(n^n)$ must hold.
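The "Mathematica magic" can also be replaced by an elementary bound: since $2^j\, j\log j$ grows geometrically in $j$, the sum is dominated by its last term. A sketch:

```latex
% With K = \lfloor n \log_2 n \rfloor, bound each factor j log j by its
% maximum and sum the remaining geometric series:
\[
\sum_{j=1}^{K} 2^j\, j \log j
  \;\le\; K \log K \sum_{j=1}^{K} 2^j
  \;<\; 2^{K+1}\, K \log K .
\]
% Now 2^K \le 2^{n \log_2 n} = n^n, while
% K \log K = O\bigl(n \log n \cdot \log(n \log n)\bigr) = O(n \log^2 n),
% so the total is O(n^{n+1} \log^2 n), matching the stated bound.
```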

In general, one rarely obtains "exact" growth orders for the computational complexity of even simple algorithms (e.g. multiplication).