Is there a way to get the branch and bound algorithm to converge to a solution "close" to an initial value?
One way I can think of is to add a "distance from initial value" term to the cost function. However, this turns the linear problem into a quadratic one.
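For concreteness, the penalized objective I have in mind would look something like the following, where $x^0$ is the current configuration, $c$ the original cost vector, and $\lambda$ a weight I would have to tune (this is just my own sketch of the idea, not a formulation I am committed to):

$$\min_x \; c^\top x \;+\; \lambda \sum_i \left(x_i - x_i^0\right)^2$$

The squared deviation term is what makes the otherwise linear objective quadratic.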
To give you a brief understanding of the problem I am working on:
Machines are to be allocated to various factories in certain packaging formats. The trouble is that whenever the required capacity increases, my model returns solutions in which far too many machines get relocated.
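To make the structure concrete, here is a minimal sketch of the kind of model I am solving; the data, names, and the PuLP formulation below are simplified placeholders, not my actual model:

```python
import pulp

# Made-up placeholder data; the real instance has many more machines,
# factories and packaging formats.
machines = ["M1", "M2", "M3"]
factories = ["F1", "F2"]
current = {"M1": "F1", "M2": "F1", "M3": "F2"}   # current allocation
required = {"F1": 2, "F2": 1}                    # machines each factory needs
cost = {(m, f): 1.0 for m in machines for f in factories}  # placeholder costs

prob = pulp.LpProblem("machine_allocation", pulp.LpMinimize)

# x[m][f] = 1 if machine m is placed at factory f
x = pulp.LpVariable.dicts("x", (machines, factories), cat="Binary")

# each machine is assigned to exactly one factory
for m in machines:
    prob += pulp.lpSum(x[m][f] for f in factories) == 1

# each factory receives at least the required number of machines
for f in factories:
    prob += pulp.lpSum(x[m][f] for m in machines) >= required[f]

# objective: pure allocation cost -- nothing here discourages moving
# machines away from their current factory
prob += pulp.lpSum(cost[m, f] * x[m][f] for m in machines for f in factories)

prob.solve()
```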
How can I steer the solver toward an optimal solution that stays as close as possible to the current machine configuration?