Suppose we have an unlimited supply of objects whose values are each a power of $2$.
So $1, 2, 4, 8, 16, ...$
We are given some value $x \ge 1$ and we want to find the minimum number of objects whose values add up to $x$.
I want to show that to use the minimum number of objects, we should pick the largest possible object at each step. That object's value is
$k = 2^{\lfloor \log_2(x)\rfloor}$
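For concreteness, the greedy procedure can be sketched in Python (an illustration only, not part of the proof; the function name `greedy_count` is my own):

```python
def greedy_count(x: int) -> int:
    """Repeatedly subtract the largest power of 2 that fits in x,
    counting how many objects are used."""
    count = 0
    while x > 0:
        # largest power of 2 <= x, i.e. 2^floor(log2 x)
        k = 1 << (x.bit_length() - 1)
        x -= k
        count += 1
    return count

# e.g. 13 = 8 + 4 + 1 uses 3 objects
print(greedy_count(13))  # 3
```

Using `bit_length` avoids the floating-point rounding issues that $\lfloor \log_2(x)\rfloor$ computed via `math.log2` can have for large $x$.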
I have shown that if we pick the largest possible object each time, then we use at most one object of each value. Now I'm not sure how to proceed. I could try to show that we require at least a certain number of objects for each $x$, but this does not seem to be the way to go. I next thought to show that if we don't pick the biggest object, we run into problems, but I got stuck. It seems obvious for a concrete case but hard to show for an arbitrary one. On my current attempt this is where I'm stuck:
Let $x$ be any integer $\ge 1$.
Suppose that $y_k$ is the greatest object we can choose. Suppose that $z_k$ is any other object we can choose.
Then $y_k = 2^j z_k$ for some integer $j \ge 1$.
Therefore if we choose $z_k$ rather than $y_k$, we also need to pick some ... but at this point my statement is wrong. I don't see what I can say, given that we could pick a smaller $z_k$ but still come out with the same thing.
I am still trying to think of other ways to approach this. Any hints would be appreciated. Thanks!
Edit: Just thought also, we could represent it like :
$x = y_0 + 2y_1 + 4y_2 + \dots + 2^{n}y_{n}$ for $n = \lfloor \log_2(x)\rfloor$, where $y_i$ is the number of objects of value $2^i$ that we use.
And possibly use calculus or sequences somehow?
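If it helps to experiment: when every coefficient $y_i$ in the representation above is $0$ or $1$, it is exactly the binary expansion of $x$, and the greedy count appears to match the number of $1$ bits. A quick Python check of this over small $x$ (numerical evidence only, not a proof; `greedy_count` is the same hypothetical helper as above):

```python
def greedy_count(x: int) -> int:
    """Count objects used by the greedy rule: always take
    the largest power of 2 that is <= the remaining value."""
    count = 0
    while x > 0:
        x -= 1 << (x.bit_length() - 1)  # remove largest power of 2 <= x
        count += 1
    return count

# Greedy count agrees with the number of 1s in x's binary expansion.
for x in range(1, 1000):
    assert greedy_count(x) == bin(x).count("1")
```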