Suppose you start with a root node and decide, with 50% probability, whether or not to add two child nodes; if you do, repeat the process for each child. How can you find the average length of this random binary tree?
I'm thinking along the lines of
$\displaystyle\lim_{n\to\infty}\sum_{i=1}^n \left(\frac{n}{2}\right)^2,$
because $\frac{n}{2}$ represents the 50% probability, and I'm squaring it because the tree gets exponentially larger. However, I feel like I've done something terribly wrong (probably because I have). Can anyone give me some help?
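For what it's worth, here's a quick Python sketch of the process so I could at least estimate the answer empirically. The function name `tree_size_and_height`, the probability parameter `p`, and the `cap` safeguard are just my own choices, and I track both the node count and the height since I'm not sure which of the two "length" should mean:

```python
import random

def tree_size_and_height(p=0.5, cap=10**6):
    """Grow one random tree level by level: each node independently
    gets two children with probability p. Returns (size, height),
    or None if the tree exceeds `cap` nodes."""
    size, height = 1, 0
    frontier = 1                      # nodes at the current depth
    while frontier:
        # count how many nodes at this depth actually branch
        branching = sum(random.random() < p for _ in range(frontier))
        frontier = 2 * branching      # each branching node adds two children
        if frontier:
            height += 1
            size += frontier
            if size > cap:
                return None           # abandon pathologically large runs
    return size, height

# crude Monte Carlo estimate of the averages
samples = [s for s in (tree_size_and_height() for _ in range(10_000)) if s]
avg_size = sum(s for s, _ in samples) / len(samples)
avg_height = sum(h for _, h in samples) / len(samples)
print(avg_size, avg_height)
```

The cap is there because occasional runs grow enormously, which already makes me suspect the averages here may be more delicate than my sum above assumes.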