Note: the first part is all context for the question labeled "The Question" below.
I'm working through CLRS, Introduction to Algorithms, 3rd ed. In Section 6.4, while discussing heaps, they state:
Each call to MAX-HEAPIFY costs $O(\lg n)$ and BUILD-MAX-HEAP makes $O(n)$ such calls.
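For reference, here's a minimal sketch of the two procedures being analyzed; this is my own 0-indexed Python transcription (the book's pseudocode is 1-indexed), not the book's code:

```python
# My own 0-indexed transcription of CLRS's MAX-HEAPIFY and BUILD-MAX-HEAP.

def max_heapify(A, i, heap_size):
    """Float A[i] down until the subtree rooted at i is a max-heap."""
    left, right = 2 * i + 1, 2 * i + 2
    largest = i
    if left < heap_size and A[left] > A[largest]:
        largest = left
    if right < heap_size and A[right] > A[largest]:
        largest = right
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)  # recurses at most h times for a node of height h

def build_max_heap(A):
    """Heapify every non-leaf node, from the bottom of the tree up to the root."""
    n = len(A)
    for i in range(n // 2 - 1, -1, -1):  # about n/2 calls to max_heapify
        max_heapify(A, i, n)
```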
Ok, I'm with them: the running time for BUILD-MAX-HEAP is therefore $O(n \lg n)$. They go on to say:
This upper bound, though correct, is not asymptotically tight.
I understand this as well, because MAX-HEAPIFY doesn't really operate on $\lg n$ levels each time it's called; rather, it operates on $h$ levels, where $h$ is the height of the node it was called on. So they state:
Our tighter analysis relies on the properties that an $n$-element heap has height $\lfloor \lg n \rfloor$ and at most $\lceil n/2^{h+1} \rceil$ nodes of any height $h$.
Ok, still with them: as a sanity check, at height $h = 0$ (the leaves) the formula gives at most $\lceil n/2^{0+1} \rceil = \lceil n/2 \rceil$ nodes, which matches the fact that roughly half the nodes of a heap are leaves.
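To convince myself of that node-count property, I also checked it on a small heap with a quick script of my own (not from the book), counting the nodes of each height and comparing against $\lceil n/2^{h+1} \rceil$:

```python
import math
from collections import Counter

def node_height(i, n):
    """Height of node i (0-indexed) in an n-element heap: edges on the longest downward path to a leaf."""
    left, right = 2 * i + 1, 2 * i + 2
    if left >= n:                          # no children -> i is a leaf
        return 0
    h = 1 + node_height(left, n)
    if right < n:
        h = max(h, 1 + node_height(right, n))
    return h

n = 10
per_height = Counter(node_height(i, n) for i in range(n))
for h in sorted(per_height):
    print(f"height {h}: {per_height[h]} nodes, bound ceil(n/2^(h+1)) = {math.ceil(n / 2 ** (h + 1))}")
```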
The time required by MAX-HEAPIFY when called on a node of height $h$ is $O(h)$, and so we can express the total cost of BUILD-MAX-HEAP as being bounded from above by:
$\displaystyle\sum_{h=0}^{\lfloor \lg n \rfloor} \left\lceil \frac{n}{2^{h+1}} \right\rceil O(h) = O\!\left(n \displaystyle\sum_{h=0}^{\lfloor \lg n \rfloor} \frac{h}{2^{h}}\right)$
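Just to see the two sides of that displayed bound in concrete numbers, here's a throwaway check I wrote (my own script, treating the $O(h)$ factor as exactly $h$):

```python
import math

def lhs(n):
    """Sum over h of ceil(n / 2^(h+1)) * h, with the O(h) factor taken as exactly h."""
    return sum(math.ceil(n / 2 ** (h + 1)) * h
               for h in range(math.floor(math.log2(n)) + 1))

def rhs_inner(n):
    """The quantity inside the O(.) on the right-hand side: n * sum of h / 2^h."""
    return n * sum(h / 2 ** h for h in range(math.floor(math.log2(n)) + 1))

for n in (10, 100, 10_000, 1_000_000):
    print(f"n = {n:>9}   lhs = {lhs(n):>9}   n * sum(h/2^h) = {rhs_inner(n):>12.1f}")
```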
The Question
Here's where I'm lost: I don't get how the $+1$ in the exponent of $2^{h+1}$ disappeared. Is that some kind of unwritten assumption you can make when you place the expression inside the big-O? Or is it because they eliminated the ceiling function?