Does there exist a sequence $\{a_n\}_{n\ge1}$ with $a_n < a_{n+1}+a_{n^2}$ such that $\sum_{n=1}^{\infty}a_n$ converges?
Does there exist a sequence with the same property but with each term positive?
Okay, it is indeed impossible. First we find an $n>1$ such that $a_n$ is strictly positive (for a sequence of positive terms any $n>1$ will do). The basic idea of the proof is that, starting from such an $a_n>0$, repeated application of the inequality $a_m<a_{m+1}+a_{m^2}$ produces infinitely many pairwise disjoint finite blocks of indices whose terms each sum to more than $a_n$, which is incompatible with convergence of $\sum_{n=1}^{\infty}a_n$.
Define two functions on the positive integers, $S: m\mapsto m^2$ and $A: m\mapsto m+1$ ($S$ stands for squaring and $A$ for adding). Then recursively define the following sets: $I_0=\{n\}$ and $I_{i+1}=S(I_i)\cup A(I_i)$, i.e. $I_k$ consists of all $k$-fold compositions of $A$s and $S$s applied to $n$. It then follows immediately from $n>1$ that the smallest element of $I_k$ is $n+k$. I now claim that $I_k$ has $2^k$ elements. To prove it, we use induction. It's certainly true for $I_0$. If it's true for $I_{k}$, then by injectivity of $S$ and $A$, both $S(I_{k})$ and $A(I_{k})$ have $2^{k}$ elements. Suppose they have an element in common. As an element of $S(I_k)$ it is a perfect square $v^2$ with $v\in I_k$, so $v\ge n+k$; as an element of $A(I_k)$ it is $j+1$ with $j\in I_k$, and $j$ is either $n+k$ (obtained from $n$ by applying $A$ alone) or of the form $u^2+(i-1)$, where $u^2$ is the last square formed in building $j$ and $u\in I_{k-i}$ for some $1\le i\le k$. The first case gives $v^2=n+k+1$, which is impossible since $(n+k)^2>n+k+1$ because $n+k\geq2$ by assumption. In the second case $v^2=u^2+i$ with $1\le i\le k$, so $u^2=v^2-i\ge(n+k)^2-k>(n+k-1)^2$ and hence $u\ge n+k$; but then $v\ge u+1$, so $i=v^2-u^2\ge 2u+1\ge 2(n+k)+1>k$, a contradiction. Therefore $S(I_k)$ and $A(I_k)$ are disjoint, $I_{k+1}$ has $2^{k+1}$ elements, and the induction is complete.
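For concreteness (purely as an illustration, with the arbitrary choice $n=2$), the first few sets are
$$I_0=\{2\},\quad I_1=\{3,4\},\quad I_2=\{4,5,9,16\},\quad I_3=\{5,6,10,16,17,25,81,256\},$$
with $1,2,4,8$ elements and smallest elements $2,3,4,5$, as claimed. Note, however, that $4\in I_1\cap I_2$ and $16\in I_2\cap I_3$: distinct $I_k$'s can overlap, which is exactly the issue addressed next.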
We now obtain the inequalities $a_n=\sum_{i\in I_0}a_i<\sum_{i\in I_1}a_i<\sum_{i\in I_2}a_i<\dots$ (sum $a_i<a_{i+1}+a_{i^2}$ over $i\in I_k$ and use that $A(I_k)$ and $S(I_k)$ are disjoint); the only problem is that $I_j$ and $I_k$ may not be disjoint for $j\ne k$. So define a new sequence of finite index sets $J_k$ by $J_0=I_{0}$, and if $J_k$ has largest element $r$, then we define $J_{k+1}=I_r$, whose smallest element is $n+r>r$, so that every element of $J_{k+1}$ is larger than every element of $J_k$. Hence the $J_k$ are pairwise disjoint, while $\sum_{i\in J_{k+1}}a_i=\sum_{i\in I_r}a_i>a_n$ for every $k$. Since all terms are positive, the partial sum of the series up to the largest element of $J_K$ is at least $\sum_{k=1}^{K}\sum_{i\in J_k}a_i>Ka_n$ for every $K$, so $\sum_{n=1}^{\infty}a_n$ diverges.
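In case it is useful, here is a small computational sanity check of the combinatorial facts used above (the choice $n=2$ and the helper `expand` are mine, purely for illustration; it is of course not part of the proof):

```python
# Sanity check of the index-set construction, for the illustrative choice n = 2.

def expand(indices):
    """One step of the construction: I_{k+1} = A(I_k) ∪ S(I_k)."""
    return {m + 1 for m in indices} | {m * m for m in indices}

n = 2
I_sets = [{n}]                      # I_sets[k] plays the role of I_k
for _ in range(16):
    I_sets.append(expand(I_sets[-1]))

# |I_k| = 2^k and min(I_k) = n + k, as proved above.
assert all(len(I_sets[k]) == 2 ** k and min(I_sets[k]) == n + k
           for k in range(len(I_sets)))

# First few J_k blocks: J_0 = I_0, and J_{k+1} = I_r where r = max(J_k).
J_sets = [I_sets[0]]
for _ in range(2):
    r = max(J_sets[-1])             # r = 2, then r = 16
    J_sets.append(I_sets[r])

# Consecutive blocks are disjoint, since min(I_r) = n + r > r = max(J_k).
assert all(max(J_sets[k]) < min(J_sets[k + 1]) for k in range(len(J_sets) - 1))
```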
Suppose, for contradiction, that a positive sequence $\{a_n\}_{n \in \mathbb{N}}$ with $a_n < a_{n+1}+a_{n^2}$ for all $n$ exists and that $\sum_{n=1}^{\infty}a_n$ converges.
Then repeatedly applying the property $a_n < a_{n+1}+a_{n^2}$ to each term that appears, starting from $a_2$, gives $a_2 < a_3 + a_4 < a_4 + a_5 + a_9 +a_{16} < \ldots $ and so on.
Let $\{S_k\}_{k\ge 1}$ be the resulting sequence of sets of subscripts formed by such repeated applications, so $S_1=\{2\}$, $S_2=\{3,4\}$, $S_3=\{4,5,9,16\}$, and so on. Note that the smallest element of $S_k$ is $k+1$: each step either adds $1$ to a subscript or squares it, and squaring a subscript $\ge 2$ increases it by at least $2$.
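Just to make the construction concrete, the next step applies the inequality to each subscript of $S_3$, namely $a_4<a_5+a_{16}$, $a_5<a_6+a_{25}$, $a_9<a_{10}+a_{81}$ and $a_{16}<a_{17}+a_{256}$, giving
$$S_4=\{5,6,10,16,17,25,81,256\},$$
which again contains no repeated subscript.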
The idea is to show that no $S_k$ contains a repeated subscript. Then, since the elements of $S_k$ are distinct and all at least $k+1$, every tail $\sum_{n\ge k+1}a_n$ is at least $\sum_{i\in S_k}a_i>a_2>0$, so the tails do not tend to $0$ and $\sum\limits_{n=1}^{\infty} a_n$ cannot converge.
Now choose the smallest $l$ for which a repetition occurs in $S_l$. Since $S_{l-1}$ has no repetition and the maps $j\mapsto j+1$ and $j\mapsto j^2$ are injective, the repeated element must arise once from each map: it is of the form $m^2$ for some $m \in \mathbb{N}$, coming from $m \in S_{l-1}$ and from $m^2-1 \in S_{l-1}$.
But then $m^2-1 \in S_{l-1} \implies m^2-2 \in S_{l-2} \implies \ldots \implies m^2-2m+2 \in S_{l-2m+2}$, since none of $m^2-1,m^2-2,\ldots,m^2-2m+2$ is a perfect square (they all lie strictly between $(m-1)^2$ and $m^2$), so each of them can only have arisen from the previous set by adding $1$. (The chain cannot stop at $S_1=\{2\}$ any earlier, since $m^2-j=2$ with $j\le 2m-3$ is impossible for $m\ge2$.) In particular $S_{l-2m+2}$ exists, so $l-2m+2\ge1$, that is, $l\ge 2m-1>2m-2$.
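For instance (purely illustrative), if the repeated element were $25=5^2$, so $m=5$, the chain would read $24\in S_{l-1},\ 23\in S_{l-2},\ \ldots,\ 17\in S_{l-8}$, because $17,18,\ldots,24$ all lie strictly between $16=(m-1)^2$ and $25=m^2$; this already forces $l\ge9=2m-1$.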
Also, $m \in S_{l-1}$ implies $m \ge l$, since the smallest element of $S_{l-1}$ is $l$.
Combining the two, $m \ge l \ge 2m-1$, so $m \le 1$, which is absurd since every subscript is at least $2$.
(Q.E.D.)