Good morning!
I'm working on implementing a deterministic Miller primality test (deterministic only assuming GRH, which is unproven). According to Wikipedia, it suffices to test all bases in $[2, \lfloor2\ln^2 n\rfloor]$.
However, there is a problem: I only have bigint arithmetic, and I can afford neither a true natural-logarithm function nor fractional numbers of any kind.
Ultimately, all I know is the number of bits in the tested number, which is $\lfloor\log_2 n\rfloor+1$ (am I correct?).
Can anyone advise me what to do with this value to get as close as possible to the Miller test's upper bound, $\lfloor2\times\ln^2 n\rfloor$, without leaving the ring of integers?
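For what it's worth, here is one direction I have been considering, written as a Python sketch since Python has native bigints (the function names, and the rational constant $9610/10000$ as an over-approximation of $2(\ln 2)^2 \approx 0.960906$, are my own choices, not anything from Wikipedia). Since the bit length $b = \lfloor\log_2 n\rfloor + 1$ strictly exceeds $\log_2 n$, and $2\ln^2 n = 2(\ln 2)^2(\log_2 n)^2$, the integer $\lceil 9610\,b^2/10000\rceil$ always over-estimates $\lfloor 2\ln^2 n\rfloor$, which keeps the test correct (just slightly more work than strictly necessary):

```python
def miller_bound(n: int) -> int:
    """Integer over-estimate of floor(2 * ln(n)^2), computed from the
    bit length alone.  Uses 2*(ln 2)^2 = 0.960906... < 9610/10000 and
    bit_length(n) = floor(log2 n) + 1 > log2 n, so the result is always
    at least the true bound (assumption: a slight overshoot is fine)."""
    b = n.bit_length()
    return (9610 * b * b + 9999) // 10000  # ceil(9610 * b^2 / 10000)

def is_prime_miller(n: int) -> bool:
    """Deterministic Miller test (correctness conditional on GRH):
    run the Miller-Rabin witness check for every base a up to the bound."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in range(2, min(n - 1, miller_bound(n)) + 1):
        x = pow(a, d, n)
        if x == 1 or x == n - 1:
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # a is a Miller witness: n is composite
    return True
```

Whether this is close enough to the true bound, or whether there is a standard tighter integer formula, is exactly what I am asking about.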