Possible Duplicate: A proof that powers of two cannot be expressed as the sum of multiple consecutive positive integers that uses binary representations?
Suppose we have an arithmetic sequence $a_n = a_1 + (n-1)d$ with $a_i \in \mathbb{N}$. If we let $d = 1$, the terms are consecutive natural numbers, and the sum $\sum_{i=1}^{k} a_i = \lambda$ is itself a natural number. Many elements of $\mathbb{N}$ can be expressed as such a sum $\sum_{i=1}^{k} a_i$ with $k \ge 2$, and not necessarily in a unique way: for example, $\lambda = 15 = 1+2+3+4+5 = 7+8$. But if $\lambda = 2^n$ for some $n \in \mathbb{N}^*$, there is no such sum with $d = 1$ and $k \ge 2$.
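For reference, writing out the partial sum with $d = 1$ gives a closed form that makes the example above explicit:
$$\sum_{i=1}^{k} a_i = \sum_{i=1}^{k} \bigl(a_1 + (i-1)\bigr) = k a_1 + \frac{k(k-1)}{2},$$
so $15 = 5 \cdot 1 + \frac{5 \cdot 4}{2}$ (with $a_1 = 1$, $k = 5$) and $15 = 2 \cdot 7 + \frac{2 \cdot 1}{2}$ (with $a_1 = 7$, $k = 2$).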
How can we show that no natural number of the form $2^n$, $n \in \mathbb{N}^*$, can be written as a sum of two or more consecutive natural numbers?
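Not a proof, but here is a quick brute-force sanity check I find helpful (a Python sketch; the helper name `consecutive_representations` is just for illustration). It enumerates, for each candidate number of terms $k \ge 2$, whether $n - \frac{k(k-1)}{2}$ is a positive multiple of $k$, which by the closed form above is equivalent to $n$ being a sum of $k$ consecutive naturals:

```python
def consecutive_representations(n, min_terms=2):
    """Return all (start, length) pairs with start + (start+1) + ... = n."""
    reps = []
    k = min_terms
    # The smallest sum of k consecutive naturals is 1 + 2 + ... + k = k(k+1)/2,
    # which bounds how large k can be.
    while k * (k + 1) // 2 <= n:
        remainder = n - k * (k - 1) // 2  # must equal k * start
        if remainder > 0 and remainder % k == 0:
            reps.append((remainder // k, k))
        k += 1
    return reps

if __name__ == "__main__":
    print(consecutive_representations(15))  # [(7, 2), (4, 3), (1, 5)]: 7+8, 4+5+6, 1+2+3+4+5
    for e in range(1, 11):
        assert consecutive_representations(2 ** e) == [], 2 ** e
    print("No power of two up to 2^10 is a sum of two or more consecutive naturals.")
```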