Added: As GEdgar points out in a comment, what the symbol $\sim$ means depends a lot on the branch of mathematics. From the overall question, I am guessing that the branch in this case is either combinatorics or discrete mathematics or algorithms. I think my definition below agrees with popular usage of $\sim$ in these areas. See this page for further details.
One fairly standard convention is that, for eventually positive$^1$ functions $f(n)$ and $g(n)$, $ f(n) \sim g(n) \stackrel{\text{def}}{\iff} \lim_{n \to \infty} \frac{f(n)}{g(n)} = 1 \iff \frac{f(n)}{g(n)} = 1 + o(1) \iff f(n) = g(n) + o(g(n)), $ where "def" means that we are defining $\sim$ this way.
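As a quick numerical illustration of this definition (the functions here are my own example, not from the question): taking $f(n) = n^2 + 3n$ and $g(n) = n^2$, the ratio $f(n)/g(n)$ tends to $1$, so $f(n) \sim g(n)$.

```python
# Numeric sanity check of the definition: f(n) ~ g(n) iff f(n)/g(n) -> 1.
# f and g below are illustrative choices, not part of the original question.

def f(n):
    return n * n + 3 * n  # f(n) = n^2 + 3n

def g(n):
    return n * n          # g(n) = n^2

for n in [10, 1000, 100000]:
    # The ratio is 1 + 3/n, which approaches 1 as n grows.
    print(n, f(n) / g(n))
```

Note that $f(n) = g(n) + 3n$ with $3n = o(n^2)$, matching the third form of the definition.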
$^1$Added: I made the functions eventually positive. If $g(n)$ is zero infinitely often, then the limit does not exist. This is not the most general way to define asymptotic notation, but I find the restriction convenient.
I first state a basic fact about this asymptotic notation: if $f_1(n) \sim g_1(n)$ and $f_2(n) \sim g_2(n)$, then
$ f_1(n) + f_2(n) = g_1(n) + o(g_1(n))+ g_2(n) + o(g_2(n)) = g_1(n)+g_2(n) + o(g_1(n) + g_2(n)), $ so that $f_1(n) + f_2(n) \sim g_1(n) + g_2(n)$.
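The additivity fact can also be checked numerically. The functions below are my own illustration, assuming $f_1(n) = n + \sqrt{n} \sim n$ and $f_2(n) = n^2 + 5n \sim n^2$:

```python
# Check: if f1 ~ g1 and f2 ~ g2, then f1 + f2 ~ g1 + g2.
# All four functions are illustrative choices for this sketch.
import math

def f1(n): return n + math.sqrt(n)   # f1(n) ~ g1(n) = n
def g1(n): return n
def f2(n): return n * n + 5 * n      # f2(n) ~ g2(n) = n^2
def g2(n): return n * n

for n in [100, 10**4, 10**6]:
    # This ratio should approach 1 as n grows.
    print(n, (f1(n) + f2(n)) / (g1(n) + g2(n)))
```

The key step in the proof, $o(g_1(n)) + o(g_2(n)) = o(g_1(n) + g_2(n))$, uses the positivity of $g_1$ and $g_2$.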
Coming to your question, we need two additional facts:
- If $n$ is large, then $H_n \sim \log n$, which follows from the estimate $ H_n = \log n + \gamma + O(1/n) = \log n + O(1) = \log n + o(\log n). $
- If $i \sim \sqrt{n}$, then $ \log i = \log \sqrt{n} + \log \left(\frac{i}{\sqrt{n}} \right) = \log \sqrt{n} + \log (1+o(1)) = \log \sqrt{n} + o(1); $ $ \log(n-i+1) = \log n + \log \left( 1 - \frac{i-1}{n} \right) = \log n + \log (1 - o(1)) = \log n + o(1). $ Hence certainly $\log i = (1 + o(1)) \log \sqrt{n}$ and $\log (n-i+1) = (1+ o(1)) \log n$.
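Both facts can be verified numerically. In this sketch I take $i = \lfloor \sqrt{n} \rfloor$ as a concrete instance of $i \sim \sqrt{n}$:

```python
# Numeric check of the two facts above:
#   (1) H_n / log n -> 1,
#   (2) with i ~ sqrt(n): log i / log sqrt(n) -> 1 and
#       log(n - i + 1) / log n -> 1.
import math

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n, computed directly.
    return sum(1.0 / k for k in range(1, n + 1))

for n in [10**3, 10**5]:
    i = math.isqrt(n)  # floor(sqrt(n)), a concrete choice with i ~ sqrt(n)
    print(n,
          harmonic(n) / math.log(n),             # -> 1 (slowly: error is ~ gamma/log n)
          math.log(i) / math.log(math.sqrt(n)),  # -> 1
          math.log(n - i + 1) / math.log(n))     # -> 1
```

The convergence of $H_n/\log n$ is slow, since the error term $\gamma/\log n$ decays only logarithmically; the other two ratios converge much faster.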
Applying these facts, you can see that the asymptotic analysis in your question is sound.