This is a practice problem for a midterm in an undergraduate real analysis class, but I tagged it as homework anyway.
Suppose that $a_k>0$ and that the series $\sum_{k=1}^\infty a_k$ diverges. Let $S_n = \sum_{k=1}^n a_k$, and define $b_1 = a_1$ and $b_n = \sqrt{S_n}-\sqrt{S_{n-1}}$ for $n\ge2$. Prove that $\sum_{k=1}^\infty b_k$ diverges and that $\lim_{n\to\infty}\frac{b_n}{a_n}=0$. Conclude that there is no universal "smallest" comparison series to test divergence.
So $S_n = S_{n-1}+a_n$, and $S_n\to\infty$ as $n$ increases. The square roots are messing me up: I don't know how to start on the limit, and while I understand what the last statement is saying, I don't see how to conclude it from the two results.
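The only algebra I have managed so far (I'm not sure it's the intended approach) is to rationalize the difference of square roots:
$$b_n = \sqrt{S_n}-\sqrt{S_{n-1}} = \frac{S_n - S_{n-1}}{\sqrt{S_n}+\sqrt{S_{n-1}}} = \frac{a_n}{\sqrt{S_n}+\sqrt{S_{n-1}}} \qquad (n\ge 2),$$
and to note that the partial sums appear to telescope:
$$\sum_{k=1}^n b_k = a_1 + \left(\sqrt{S_n}-\sqrt{S_1}\right).$$
But I don't see how to get from either identity to the divergence of $\sum b_k$ or to the limit.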