I have a simple problem that I think I may be overcomplicating, but I'm not certain. I work in a manufacturing environment as a process engineer and I'm setting up a calculator to determine how long it will take to clear our backlog from a certain step in our process.
There are two stages. We'll call them $X$ and $Y$. Stage $X$ can process a certain number of products a day, and stage $Y$ can process a certain number of products a day, but not necessarily the same number as stage $X$.
When stage $X$ produces at a faster rate than stage $Y$, a backlog $B$ is created.
To clear the backlog as it stands at any particular moment will simply take $B/Y$ days. However, while that backlog is being cleared, more cases are being added to it at a rate of $X$ per day.
So to clear the current backlog plus the backlog that builds up while clearing it, it will take $\frac{B}{Y} + \frac{(B/Y) \cdot X}{Y}$ days.
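(For instance, with made-up numbers $B = 100$, $Y = 50$, and $X = 40$: the first pass takes $100/50 = 2$ days, during which $2 \times 40 = 80$ new cases arrive, and those take another $80/50 = 1.6$ days, and so on.)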
But... while you clear that additional backlog, the backlog is still growing. So we have ourselves an infinite series (I think). My question is: how can I simplify this into an equation that I can use programmatically (like in Excel or something)?
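If I write the pattern out, each successive round of cleanup should take $X/Y$ times as long as the round before it, so (assuming I've set this up correctly) the total time looks like a geometric series:

$$T \;=\; \frac{B}{Y} + \frac{B}{Y}\cdot\frac{X}{Y} + \frac{B}{Y}\cdot\left(\frac{X}{Y}\right)^{2} + \cdots \;=\; \frac{B}{Y}\sum_{n=0}^{\infty}\left(\frac{X}{Y}\right)^{n},$$

and my hunch is that the standard geometric-series sum would collapse this to $T = \frac{B}{Y - X}$, though I'd like confirmation that the setup is right.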
I would think that calculating the number of days required to clear the backlog would only produce a number less than infinity if $X$ is less than $Y$...
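In case it helps, here's a quick sanity check I put together (in Python rather than Excel; the function and variable names are my own, and the rates are made-up example numbers). It just sums the series term by term, the same way I described it above:

```python
# Quick sanity check of the series above. The function and variable names
# are my own, and the rates are made-up example numbers.
def series_time(backlog, x_rate, y_rate, tol=1e-9):
    """Sum the clearing-time series term by term until the terms are negligible."""
    if x_rate >= y_rate:
        return float("inf")        # terms never shrink: the backlog never clears
    total = 0.0
    term = backlog / y_rate        # first term: clear the current backlog B at rate Y
    while term > tol:
        total += term
        term *= x_rate / y_rate    # each new wave of backlog is X/Y times the last
    return total

print(series_time(100, 40, 50))    # ~10.0 days, matching B / (Y - X) = 100 / 10
```

With $B = 100$, $X = 40$, $Y = 50$ this converges to about $10$ days, which agrees with $B/(Y-X)$; and with $X \ge Y$ the terms never shrink, which matches my intuition above.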