Let's say I have a resource, call it E.
There are four important values:
- Em - the maximum amount of E we can have
- Ec - the current amount of E
- Er - the rate at which E replenishes per unit of time (expressed as a fraction of Em, so Er = 0.01 means 1% of Em per t)
- Ea - the amount of E one action costs (one action happens per unit of time)
Let us assume that we start at max capacity.
at t = 0: Ec = Em
at t = i: Ec(i) = Ec(i-1) - Ea + Er*Em

In other words: current = previous - action + replenish.
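For concreteness, here is the brute-force iteration I'm describing, as a minimal Python sketch (the function name and the "Ec drops to 0 or below" exhaustion test are my own choices):

```python
def steps_until_empty(em, er, ea):
    """Iterate Ec(i) = Ec(i-1) - Ea + Er*Em starting from Ec(0) = Em.

    Returns the first t at which Ec drops to 0 or below, or None if
    replenishment outpaces the cost (Er*Em >= Ea) and E never runs out.
    """
    if er * em >= ea:            # net change per t is >= 0: never empties
        return None
    ec, t = em, 0
    while ec > 0:
        ec = ec - ea + er * em   # one t: pay the action, gain the replenish
        t += 1
    return t
```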
How can I turn this into a formula that tells me at what t I will run out of resources?
I can estimate it by hand with concrete values. Say Em = 1000, Er = 0.01, and Ea = 50:
1000 / 50 = 20, so by t = 20 I will have spent my full starting pool, and all I'll have left is what replenishment gained me: 1000 * 0.01 = 10 per t, times 20 t's = 200 more resources to spend. 200 / 50 = 4, so in 4 more t's I spend those while acquiring another 40. Carrying that on, I can estimate that between t = 24 and t = 25 I will run out of resources.
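Running the sketch above with those numbers agrees with the hand estimate (Ec lands on exactly 0, which is why I hover between 24 and 25):

```python
print(steps_until_empty(1000, 0.01, 50))  # -> 25, with Ec(25) = 0 exactly
```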
Estimating to within one t is acceptable, but doing it iteratively is a pain on my real datasets. I really don't remember how to turn a recurrence like this into a closed-form formula, so I turn to you guys.
Thanks!