Warm Starting for Mixed Integer Linear Programs
Ted Ralphs, Menal Guzelsoy, Ashutosh Mahajan, Svetlana Oshkai
ISE Department, COR@L Lab, Lehigh University ([email protected])
INFORMS, Seattle, November 2007
Thanks: Work supported in part by the National Science Foundation
There is a wide range of applications in which we must repeatedly solve mixed integer linear programs.
If the instances are only slightly different, can we do better than starting from scratch?
In principle, warm starting techniques from LP can be generalized to do this job.
Can we make it work in practice?
Outline: Introduction, Algorithms, Computational Experiments
Applications
Iterative Algorithms
Bicriteria optimization algorithms
Primal heuristics (RINS)
Column generation algorithms for MILP
Dual decomposition algorithm for stochastic integer programs
Real-time Optimization
Airline Disaster Recovery
Stochastic Vehicle Routing
Combinatorial Auctions
Definitions / Implementation
Warm Starting Information
Many optimization algorithms can be viewed as iterative procedures for satisfying optimality conditions (based on duality).
These conditions provide a measure of "distance from optimality."
Warm starting information is additional input data that allows an algorithm to quickly get "close to optimality."
In mixed integer linear programming, the duality gap is the usual measure.
A starting partition can quickly reduce the gap.
What is a starting partition and where do we get one?
Optimal Partitions
Consider the implicit optimality conditions associated with a branch-and-bound algorithm.
Let P1, . . . , Ps be a set of polyhedra whose union contains the feasible set.
Let Bi be the optimal basis for the LP min{c⊤x | x ∈ Pi}.
Then the following is a valid lower bound:
L = min{cBi(Bi)−1b + γi | 1 ≤ i ≤ s},
where γi is a constant factor associated with the nonbasic variables fixed at nonzero bounds.
A similar function yields an upper bound.
A partition for which these lower and upper bounds are equal is called an optimal partition.
Bounding Functions
The function
L(d) = min{cBi(Bi)−1d + γi | 1 ≤ i ≤ s}
provides a valid lower bound as a function of the right-hand side.
The corresponding upper bounding function is
U(c) = min{cBi(Bi)−1b + βi | 1 ≤ i ≤ s, x̂i ∈ Π}
These functions can be used for local sensitivity analysis, just as one would do in linear programming.
For changes in the right-hand side, the lower bound remains valid.
For changes in the objective function, the upper bound remains valid.
One can also make other modifications, such as adding variables or constraints.
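The lower bounding function above can be evaluated cheaply once the basis information is stored. A minimal sketch in Python, assuming the rows yi = cBi(Bi)−1 and constants γi have already been extracted from each terminal node (the toy numbers below are hypothetical, not the output of any real solver):

```python
# Sketch: evaluating the lower-bounding function
#   L(d) = min_i { c_{B^i} (B^i)^{-1} d + gamma_i }
# over a fixed partition. Each element of `partition` is a pair
# (y_i, gamma_i), where y_i = c_{B^i} (B^i)^{-1} is a precomputed row.

def lower_bound(partition, d):
    """Return L(d) for right-hand side vector d."""
    return min(sum(yk * dk for yk, dk in zip(y, d)) + gamma
               for y, gamma in partition)

# Hypothetical data: two terminal nodes of a search tree.
partition = [([1.0, 0.5], 0.0), ([0.0, 2.0], 1.0)]

print(lower_bound(partition, [4.0, 3.0]))  # node 1 gives 5.5, node 2 gives 7.0
print(lower_bound(partition, [4.0, 1.0]))  # node 1 gives 4.5, node 2 gives 3.0
```

Because only the right-hand side d varies, the stored bases stay valid and each evaluation is a handful of dot products, which is what makes this usable for local sensitivity analysis.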
Data Structures
To allow resolving from a warm start, we have a data structure for storing warm starts.
A warm start consists of a snapshot of the search tree, with node descriptions including:
lists of active cuts and variables,
branching information,
warm start information, and
current status (candidate, fathomed, etc.).
The tree is stored in a compact form by storing the node descriptions as differences from the parent.
Other auxiliary information is also stored, such as the current incumbent.
A warm start can be saved at any time and then reloaded later.
The warm starts can also be written to and read from disk.
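The difference-from-parent storage scheme can be sketched as follows. This is an illustrative model only, assuming each node description is a flat mapping of fields; the real implementation stores richer per-node data:

```python
# Sketch of diff-based node storage for a warm start. Each node keeps
# only the fields that differ from its parent; the full description is
# reconstructed on demand by walking up to the root.

class Node:
    def __init__(self, parent=None, **diff):
        self.parent = parent
        self.diff = diff  # fields changed relative to the parent

    def full_description(self):
        """Reconstruct the complete node description."""
        desc = {} if self.parent is None else self.parent.full_description()
        desc.update(self.diff)
        return desc

# The root stores a full description; children store only their changes.
root = Node(status="candidate", cuts=[], branching=None)
child = Node(root, status="fathomed", branching=("x3", "<=", 0))

# The child inherits cuts=[] from the root and overrides the rest.
print(child.full_description())
```

Storing diffs keeps the snapshot compact, since sibling nodes typically share most of their cut and variable lists with the parent.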
Warm Starting Procedure
After modifying parameters
If only parameters have been modified, then the candidate list is recreated and the algorithm proceeds as if it had never stopped.
This allows parameters to be tuned as the algorithm progresses, if desired.
After modifying problem data
After modification, all leaf nodes must be modified appropriately and added to the candidate list.
After constructing the candidate list, we can continue the algorithm as before.
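The resume step after a data modification can be sketched as below. This is a hypothetical simplification in which the candidate list is a best-first priority queue and `update_bound` stands in for reprocessing a leaf under the new problem data:

```python
# Sketch of rebuilding the candidate list from the stored leaves after
# the problem data change, then resuming best-first search.
import heapq

def rebuild_candidates(leaves, update_bound):
    """Return a heap of (bound, node) pairs for all still-open leaves."""
    heap = []
    for node in leaves:
        bound = update_bound(node)  # re-evaluate the node under new data
        if bound is not None:       # None means the node can be pruned
            heapq.heappush(heap, (bound, node))
    return heap

# Toy usage: leaves are names, new bounds come from a lookup table,
# and node "n2" is pruned outright.
new_bounds = {"n1": 7.0, "n2": None, "n3": 4.5}
heap = rebuild_candidates(["n1", "n2", "n3"], new_bounds.get)
best = heapq.heappop(heap)  # search resumes from the best-bound node
print(best)
```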
Pruning the Warm Start
There are many opportunities for improving the basic scheme.
For instance, it may not be a good idea to start the warm start from the exact tree produced during a previous solve.
Any subtree will do.
Various ad hoc procedures can be used to prune the warm start to produce a smaller tree that may be more effective:
First p% of the nodes produced.
All nodes above level p ∗ max, 0 ≤ p ≤ 1.
Top k levels.
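The three pruning rules can be sketched as simple filters. This assumes, purely for illustration, that each node records its creation order (list position) and its depth in the tree (`level`):

```python
# Sketch of the three ad hoc warm-start pruning rules.

def first_fraction(nodes, p):
    """Keep the first p (as a fraction) of the nodes produced,
    with `nodes` listed in creation order."""
    return nodes[: int(len(nodes) * p)]

def above_level(nodes, p, max_level):
    """Keep all nodes at depth <= p * max_level, 0 <= p <= 1."""
    return [n for n in nodes if n["level"] <= p * max_level]

def top_levels(nodes, k):
    """Keep only the top k levels of the tree."""
    return [n for n in nodes if n["level"] < k]

# Toy tree: five nodes at depths 0, 1, 1, 2, 3.
nodes = [{"id": i, "level": lvl} for i, lvl in enumerate([0, 1, 1, 2, 3])]
print(len(first_fraction(nodes, 0.5)))  # 2 nodes
print(len(above_level(nodes, 0.5, 3)))  # depths <= 1.5: 3 nodes
print(len(top_levels(nodes, 2)))        # depths 0 and 1: 3 nodes
```

The "node level ratio" parameter in the experiments that follow corresponds to the fraction p in rules of this kind.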
SYMPHONY: Support for Warm Starting
Currently supported
Change to objective function (no reduced cost fixing during generation of warm start).
Change to right-hand side (cuts are discarded when resolving).
Changes to variable bounds.
Addition of columns.
Coming soon
Addition of constraints (easy).
Changes to right-hand side without discarding cuts (not so easy).
Changes to objective function with reduced cost fixing (not so easy).
Basic Sensitivity Analysis
SYMPHONY will calculate bounds after changing the objective or right-hand side vectors.
Minor improvements in running time and optimality gap.
Not unexpected, as only upper bounds are affected.
Combinatorial Auctions / Primal Heuristics
Results: Particular instances
[Figures: objective value of feasible solutions over time (in seconds) for instances mkc, sp98ar, aflow30a, and swath, comparing no warm start (no-ws) against warm starts pruned at node level ratios 0.1, 0.2, 0.5, 0.7, and 1.0.]
Conclusions and Future Work
Combinatorial Auctions
For combinatorial auctions, using warm starting sped up computations consistently.
It was necessary to start from scratch periodically in order to avoid a loss of efficiency.
RINS
Using a warm start improved the time spent by a factor of 2-3 for medium and (some) large instances.
Overall impact was low, however.
For very large instances, carrying over a single warm-start environment does not seem to be a good idea.
Starting from scratch if several calls to the heuristic are unsuccessful might be a good idea.
Not having cuts and reduced cost fixing is a handicap.
In both cases, it was difficult to know how to prune the warm start intelligently.
The bottom line: when solving modified instances of these classes of MILPs, warm starting does seem to be a good idea.