I have an expression which I think is $o(2^n)$, but I'm having difficulty simplifying it:
$$o(2^n/n)\,\log\bigl(o(2^n/n) + n\bigr)$$
I can ignore the extra $n$ inside the logarithm: since $n = o(2^n/n)$, we still have $o(2^n/n) + n = o(2^n/n)$. But then I can't say $\log(o(2^n/n)) = o(\log(2^n/n))$, can I? The rest would follow easily.
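To double-check that absorption step (it's my own claim, so I want to be sure):
$$\frac{n}{2^n/n} = \frac{n^2}{2^n} \to 0 \quad\text{as } n \to \infty,$$
so indeed $n = o(2^n/n)$, and the $+n$ changes nothing.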
At least, it's not true in general that if $g(n) = o(f(n))$ then $\log(g(n)) = o(\log(f(n)))$; $g(n) = n$ and $f(n) = n^2$ already give a counterexample (worked out below). Is there some other way to do it? (Or is it plain not true?)
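Spelling that counterexample out: with $g(n) = n$ and $f(n) = n^2$ we have $g(n) = o(f(n))$, but
$$\frac{\log g(n)}{\log f(n)} = \frac{\log n}{2\log n} = \frac{1}{2} \not\to 0,$$
so $\log(g(n)) \neq o(\log(f(n)))$.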
[EDIT] While it's not entirely relevant to the question, I suppose I should give the context. I'm working on an exercise from *Gems of Theoretical Computer Science* where you prove a lower bound of $\Omega(2^n/n)$ on the circuit size of a Boolean function generated from an incompressible string. So far, I have that if there are $g$ gates and $n$ variables, then the whole circuit can be described using $g(c+2\log(g+n))$ bits. Then, supposing the number of gates is $o(2^n/n)$, we should be able to conclude that the total description length (and hence the Kolmogorov complexity of the string) is $o(2^n)$, a contradiction.
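Just to spell out where the expression at the top comes from: plugging $g = o(2^n/n)$ into the description length (the constant $c$ washes out, since $\log(g+n) \ge \log n \to \infty$),
$$g\bigl(c + 2\log(g+n)\bigr) = \Theta\bigl(g\log(g+n)\bigr) = \Theta\Bigl(o(2^n/n)\,\log\bigl(o(2^n/n)+n\bigr)\Bigr),$$
which, up to constant factors, is exactly the expression I'm trying to show is $o(2^n)$.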