So I am asked to solve a question which says the following:
If I have a quantum computer that can perform $1000^{10}$ numeric operations (additions and multiplications) per second, estimate the time it takes this computer to evaluate the determinant of a $50 \times 50$ matrix by iterated cofactor expansion.
The estimation hints I am apparently supposed to use are: the number of operations is greater than $50!$, and $50! > 10^{50} = \sqrt{\text{Googol}}$.
I am also told that
$1~\text{ns} = 1~\text{nanosecond} = 10^{-9}~\text{s}$

$1~\mu\text{s} = 1~\text{microsecond} = 10^{-6}~\text{s}$

$1~\text{ms} = 1~\text{millisecond} = 10^{-3}~\text{s}$

$1~\text{day} < 10^{5}~\text{s}$

$1~\text{year} < 10^{8}~\text{s}$

Age of the universe: $< 15 \cdot 10^{9}~\text{years} < 10^{11}~\text{years}$
My approach was that I am already given the rate of calculation: $1000^{10} = 10^{30}$ calculations per second. So I just need the number of calculations required; dividing that count by the rate then gives the time required.
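To make that concrete, here is a quick Python sketch of the arithmetic I have in mind, assuming (per the hint) that the operation count is at least $50!$; the variable names are mine, not from the problem:

```python
import math

# Given rate: 1000^10 = 10^30 operations per second.
rate = 1000**10

# Lower bound on the operation count suggested by the hint: 50! > 10^50.
# (The true cofactor-expansion count is larger; this is just a floor.)
ops_lower_bound = math.factorial(50)

seconds = ops_lower_bound / rate        # time = operations / rate
years = seconds / 10**8                 # using the bound 1 year < 10^8 s
print(f"at least {seconds:.2e} s, i.e. more than {years:.2e} years")
```

Even with this lower bound, the time dwarfs the $10^{11}$-year age of the universe, which I take to be the point of the exercise.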
The problem, though, is that I can't figure out the number of calculations required. My first thought was to just use $50!$, which is greater than $10^{50} = \sqrt{\text{Googol}}$, but I don't think that's right, since I also have to account for the multiplications of each entry by its cofactor and for all the additions.
How would I go about finding the number of calculations required for a $50 \times 50$ matrix?
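For reference, here is a sketch of one way I tried to count, assuming that expanding an $n \times n$ determinant along a row costs $n$ multiplications (entry times cofactor) plus $n-1$ additions, on top of $n$ recursive determinants of size $n-1$; I'm not sure this accounting is the intended one:

```python
def cofactor_ops(n: int) -> int:
    """Operations for full cofactor expansion of an n x n determinant,
    under the recurrence f(1) = 0, f(n) = n*f(n-1) + n + (n-1)."""
    if n == 1:
        return 0  # a 1x1 determinant needs no operations
    return n * cofactor_ops(n - 1) + n + (n - 1)

print(cofactor_ops(2))   # 2x2 case: 2 multiplications + 1 addition
print(cofactor_ops(50))  # exceeds 50!, consistent with the hint
```

Under this counting the total does come out larger than $50!$, matching the hinted bound, but I'd like to confirm the recurrence itself.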