8/11/2019 WI4_12
1/33
Discrete (and Continuous) Optimization
WI4 131
Kees Roos
Technische Universiteit Delft
Faculteit Electrotechniek, Wiskunde en Informatica
Afdeling Informatie, Systemen en Algoritmiek
e-mail: [email protected], URL: http://www.isa.ewi.tudelft.nl/~roos
November-December, A.D. 2004
Course Schedule
1. Formulations (18 pages)
2. Optimality, Relaxation, and Bounds (10 pages)
3. Well-solved Problems (13 pages)
4. Matching and Assignments (10 pages)
5. Dynamic Programming (11 pages)
6. Complexity and Problem Reduction (8 pages)
7. Branch and Bound (17 pages)
8. Cutting Plane Algorithms (21 pages)
9. Strong Valid Inequalities (22 pages)
10. Lagrangian Duality (14 pages)
11. Column Generation Algorithms (16 pages)
12. Heuristic Algorithms (15 pages)
13. From Theory to Solutions (20 pages)
Optimization Group 1
Chapter 12
Heuristic Algorithms
Introduction
Many practical problems are NP-hard. In such cases one may choose (or even be forced) to use a heuristic or approximation algorithm: a smart method that hopefully finds a good feasible solution quickly.

In designing a heuristic, various questions arise:

Should one accept any feasible solution, or should one ask a posteriori how far it is from optimal?

Can one guarantee a priori that the heuristic produces a solution within ε (or ε%) of optimal?

Can one guarantee a priori that, for the class of problems considered, the heuristic on average produces a solution within ε (or ε%) of optimal?
Greedy and Local Search Revisited
We suppose that the problem can be written as a COP in the form:

min_{S⊆N} { c(S) : v(S) ≥ k }.

Consider for example the 0-1 knapsack problem:

z = max { Σ_{j=1}^n c_j x_j : Σ_{j=1}^n a_j x_j ≤ b, x ∈ {0, 1}^n }.

Here x is the indicator vector of the set S. Thus, defining

c(S) = Σ_{j∈S} c_j,   v(S) = Σ_{j∈S} a_j,   k = b,

the 0-1 knapsack problem gets the above form. Also the uncapacitated facility location problem

min { Σ_{i∈M} Σ_{j∈N} c_ij x_ij + Σ_{j∈N} f_j y_j : Σ_{j=1}^n x_ij = 1 (i ∈ M), Σ_{i∈M} x_ij ≤ |M| y_j, x_ij ∈ [0, 1], y_j ∈ {0, 1} }

fits in this model if we take

c(S) = Σ_{i∈M} min_{j∈S} c_ij + Σ_{j∈S} f_j,   v(S) = |S|,   k = 1.
Greedy Heuristic
min_{S⊆N} { c(S) : v(S) ≥ k }.

Step 1: Set S_0 = ∅ and t = 1.
Step 2: Set j_t = argmin_{j∉S_{t−1}} [ (c(S_{t−1} ∪ {j}) − c(S_{t−1})) / (v(S_{t−1} ∪ {j}) − v(S_{t−1})) ].
Step 3: If S_{t−1} is feasible, and the cost does not decrease when passing to S_t = S_{t−1} ∪ {j_t}: stop with S^G = S_{t−1}.
Step 4: Otherwise, set S_t = S_{t−1} ∪ {j_t}. If S_t is feasible and the cost function is nondecreasing, or t = n: stop with S^G = S_t.
Step 5: If t = n, no feasible solution has been found; stop.
Step 6: Set t ← t + 1, and return to Step 2.
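The steps above can be sketched in Python for the UFL special case (where v(S) = |S| and k = 1, so every nonempty S is feasible). The data is the 6-client, 4-depot instance of the next example, with 0-based depot indices; the function names and the c(∅) = 0 convention for the first step are my own, not the course's.

```python
# Greedy heuristic for uncapacitated facility location (UFL), a sketch.
# C[i][j] = cost of serving client i from depot j; F[j] = fixed cost of depot j.
C = [[6, 2, 3, 4],
     [1, 9, 4, 11],
     [15, 2, 6, 3],
     [9, 11, 4, 8],
     [7, 23, 2, 9],
     [4, 3, 1, 5]]
F = [21, 16, 11, 24]

def cost(S):
    """c(S) = sum_i min_{j in S} c_ij + sum_{j in S} f_j; 0 for S = {} by convention."""
    if not S:
        return 0
    return sum(min(row[j] for j in S) for row in C) + sum(F[j] for j in S)

def greedy_ufl():
    """Greedy heuristic with v(S) = |S|, k = 1: any nonempty S is feasible."""
    S = set()
    while True:
        # Step 2: every candidate increases v(S) by 1, so the ratio reduces
        # to the plain cost increase c(S + {j}) - c(S).
        j = min((j for j in range(len(F)) if j not in S),
                key=lambda j: cost(S | {j}) - cost(S))
        # Step 3: stop if S is already feasible and adding j does not help.
        if S and cost(S | {j}) >= cost(S):
            return S, cost(S)
        S = S | {j}                      # Step 4
        if len(S) == len(F):             # all depots open: nothing left to add
            return S, cost(S)

S, z = greedy_ufl()
print(S, z)  # opens depot 2 (depot 3 in the slides' 1-based numbering) at cost 31
```

This reproduces the outcome of the worked example that follows.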
Example: Uncapacitated Facility Location
Consider the UFL instance with m = 6 clients and n = 4 depots, and costs as shown below:
(c_ij) =
 6   2   3   4
 1   9   4  11
15   2   6   3
 9  11   4   8
 7  23   2   9
 4   3   1   5

and (f_j) = (21, 16, 11, 24).
Greedy solution:

Step 1: Set S_0 = ∅ and t = 1. S_0 is infeasible.
Step 2: We compute j_1 = argmin_j [ c({j}) / v({j}) ]. One has c({1}) = (6 + 1 + 15 + 9 + 7 + 4) + 21 = 63, c({2}) = 66, c({3}) = 31, c({4}) = 64. So j_1 = 3, S_1 = {3} and c(S_1) = 31. S_1 is feasible.
Step 3: Since S_1 is feasible, we check if the cost decreases when passing to S_2 = S_1 ∪ {j} for some j ∉ S_1. One has c({3, 1}) − c({3}) = 18, c({3, 2}) − c({3}) = 11, and c({3, 4}) − c({3}) = 21. So the cost is nondecreasing: hence we stop with S^G = S_1 = {3}.

In many cases the heuristic must be adapted to the problem structure. We give an example for the STSP.
Example: Symmetric TSP
      1   2   3   4   5   6
 1    -   9   2   8  12  11
 2    9   -   7  19  10  32
 3    2   7   -  29  18   6
 4    8  19  29   -  24   3
 5   12  10  18  24   -  19
 6   11  32   6   3  19   -

Distance matrix.
Greedy Solution: The pure greedy heuristic has been applied to this instance in Chapter 2. Here we use a nearest neighborhood insertion heuristic: starting from an arbitrary node we subsequently build paths P_t containing that node, by inserting the node nearest to the current path, until we obtain a tour (this happens after n − 1 insertions).

Let us start at node 1. The nearest neighbor of node 1 is 3, yielding P_1 = 1-3. The nearest neighbor of the node set {1, 3} is 6, with distance 6 to 3, yielding the three possible paths shown in the table. We use the cheapest insertion: P_2 = 1-3-6. The nearest neighbor of the node set {1, 3, 6} is 4, with distance 3 to node 6. Possible insertions are shown in the table. Now node 2 is nearest to P_3, and the possible insertions are as indicated. We use P_4 = 2-1-3-6-4, which has length 20. Finally, the last node (node 5) must be connected to this path so as to obtain a tour. This can be done in 5 ways:
[Figure: the path P_4 = 2-1-3-6-4 with the five possible ways to connect node 5.]

node  inserted path   length  new path
1     1                0      P_0
3     1-3              2      P_1
6     6-1-3           13
6     1-6-3           17
6     1-3-6            8      P_2
4     4-1-3-6         16
4     1-4-3-6         43
4     1-3-4-6         34
4     1-3-6-4         11      P_3
2     2-1-3-6-4       20      P_4
2     1-2-3-6-4       25
2     1-3-2-6-4       44
2     1-3-6-2-4       59
2     1-3-6-4-2       30

possible insertions of node 5     increment
1-3-6-4-2-5-1     19 + 10 + 12 - 9 = 32
1-3-6-4-5-2-1     10 + 24 = 34
1-3-6-5-4-2-1     19 + 19 + 24 - 3 = 59
1-3-5-6-4-2-1     19 + 18 + 19 - 6 = 50
1-5-3-6-4-2-1     19 + 12 + 18 - 2 = 47
The shortest tour results when inserting 5 between the nodes 2 and 1 in P_4 (and adding the arc {4, 2}): 2-5-1-3-6-4-2, and the length of this tour is 20 + 32 = 52.
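The insertion procedure above can be sketched as follows (my own minimal re-implementation, not the course's code; with ties broken differently it may return the example's tour reversed, but the resulting length matches).

```python
# Cheapest insertion of the nearest node; cities are numbered 1..6 as above.
PAIRS = {(1, 2): 9, (1, 3): 2, (1, 4): 8, (1, 5): 12, (1, 6): 11,
         (2, 3): 7, (2, 4): 19, (2, 5): 10, (2, 6): 32,
         (3, 4): 29, (3, 5): 18, (3, 6): 6,
         (4, 5): 24, (4, 6): 3, (5, 6): 19}
D = {}
for (a, b), d in PAIRS.items():
    D[a, b] = D[b, a] = d

def path_len(p):
    return sum(D[p[i], p[i + 1]] for i in range(len(p) - 1))

def nearest_insertion(start):
    nodes = {1, 2, 3, 4, 5, 6}
    path, rest = [start], nodes - {start}
    while len(rest) > 1:
        # the node nearest to the current path ...
        v = min(rest, key=lambda v: min(D[v, u] for u in path))
        # ... is inserted at the cheapest position (both ends or any interior slot)
        path = min((path[:i] + [v] + path[i:] for i in range(len(path) + 1)),
                   key=path_len)
        rest.discard(v)
    # the last node is inserted so that the resulting closed tour is shortest
    v = rest.pop()
    tour = min((path[:i] + [v] + path[i:] for i in range(len(path) + 1)),
               key=lambda p: path_len(p) + D[p[-1], p[0]])
    return tour, path_len(tour) + D[tour[-1], tour[0]]

tour, length = nearest_insertion(1)
print(tour, length)  # a tour of length 52, as in the example
```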
Other Variants of Greedy Insertion for Symmetric TSP
The choice of the node to insert in the current path can be made in many different ways:
nearest node (as done in the example);
random node;
farthest node.
When solving the same STSP instance with cheapest insertion of the farthest node we obtain (starting at node 1 again):

P_1 = 1-5 (length 12);
P_2 = 4-1-5 (length 20);
P_3 = 4-1-3-5 (length 28);
P_4 = 4-1-3-2-5 (length 27);
tour 6-4-1-3-2-5-6 (length 49).

The tour length is shorter than when inserting the nearest node! Surprising, but this often happens. Can you understand why?
Local Search Heuristic
Local search can be more conveniently discussed when the problem has the form

min_{S⊆N} { c(S) : g(S) = 0 },

where g(S) ≥ 0 represents a measure for the infeasibility of S. E.g., the constraint v(S) ≥ k can be represented by using g(S) = (k − v(S))⁺.

For a local search heuristic we need:

a solution S ⊆ N;
a local neighborhood Q(S) for each solution S ⊆ N;
a goal function f(S), which can be either c(S) when S is feasible and infinite otherwise, or a composite function of the form c(S) + αg(S) (α > 0).

Local Search Heuristic: Choose an initial solution S. Search for a solution S′ ∈ Q(S) that minimizes f over Q(S). If f(S′) = f(S), stop: S^H = S is locally optimal. Otherwise, set S = S′ and repeat.
Local Search Heuristic (cont.)
[Figure: a cloud of solutions (marked x) with neighborhoods Q(S_1), Q(S_2), ..., Q(S_k), illustrating a local search trajectory moving from neighborhood to neighborhood.]
Appropriate choice of the neighborhoods depends on the problem structure. A simple choice is just to add or remove one element to or from S. This neighborhood has O(n) elements. Finding the best solution in the neighborhood can thus be done in O(n) time.

If all feasible sets have the same size, a useful neighborhood is obtained by replacing one element of S by an element not in S. This requires O(n²) time. In the case of the STSP this leads to the well-known 2-exchange heuristic.
2-Exchange Heuristic for STSP
      1   2   3   4   5   6
 1    -   9   2   8  12  11
 2    9   -   7  19  10  32
 3    2   7   -  29  18   6
 4    8  19  29   -  24   3
 5   12  10  18  24   -  19
 6   11  32   6   3  19   -
With the greedy insertion heuristic we found the tour 6-4-1-3-2-5-6 of length 49. A 2-exchange removes two (nonadjacent, why?) edges. The two resulting pieces are connected by two other edges (this can be done in only one way, why?!). If this yields a better tour, we accept it and repeat, until we find a so-called 2-optimal tour. We have a 6-city problem, so a tour has 6 edges. There are 6 · 3/2 = 9 possible 2-exchanges. For each 2-exchange we give the two new edges and the increase of the tour length.
[Figure: the current tour 6-4-1-3-2-5-6, length 49.]
no.  2 deleted arcs   2 new arcs     increment
1    {1,3}, {2,5}     {1,2}, {3,5}   27 - 12 = 15
2    {1,3}, {5,6}     {1,5}, {3,6}   18 - 21 = -3
3    {1,3}, {6,4}     {1,6}, {3,4}   40 - 5 = 35
4    {3,2}, {5,6}     {3,5}, {2,6}   50 - 26 = 24
5    {3,2}, {6,4}     {3,6}, {2,4}   25 - 10 = 15
6    {3,2}, {4,1}     {3,4}, {2,1}   38 - 15 = 23
7    {2,5}, {6,4}     {2,6}, {5,4}   56 - 13 = 43
8    {2,5}, {4,1}     {2,4}, {5,1}   31 - 18 = 13
9    {5,6}, {4,1}     {5,4}, {6,1}   35 - 27 = 8
[Figure: the new tour 1-5-2-3-6-4-1, length 46.]
The second exchange gives an improvement: it decreases the length of the tour to z_L = 46. The tour is 1-5-2-3-6-4-1. One may verify that this tour is 2-optimal.
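A best-improvement 2-exchange loop can be sketched as follows (my own minimal implementation for the 6-city instance; reversing the segment between the two removed edges is the unique reconnection in the symmetric case).

```python
# 2-exchange (2-opt) heuristic on the 6-city instance above.
PAIRS = {(1, 2): 9, (1, 3): 2, (1, 4): 8, (1, 5): 12, (1, 6): 11,
         (2, 3): 7, (2, 4): 19, (2, 5): 10, (2, 6): 32,
         (3, 4): 29, (3, 5): 18, (3, 6): 6,
         (4, 5): 24, (4, 6): 3, (5, 6): 19}
D = {}
for (a, b), d in PAIRS.items():
    D[a, b] = D[b, a] = d

def tour_len(tour):
    return sum(D[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour):
    """Apply best-improvement 2-exchanges until the tour is 2-optimal."""
    tour = list(tour)
    while True:
        n = len(tour)
        best_delta, best = 0, None
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue          # these two edges are adjacent in the cycle
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # replace edges {a,b}, {c,d} by {a,c}, {b,d}
                delta = D[a, c] + D[b, d] - D[a, b] - D[c, d]
                if delta < best_delta:
                    best_delta, best = delta, (i, j)
        if best is None:
            return tour               # no improving exchange: 2-optimal
        i, j = best
        tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])

tour = two_opt([6, 4, 1, 3, 2, 5])
print(tour, tour_len(tour))  # the length drops from 49 to 46
```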
Improved Local Search Heuristics
How do we escape from a local minimum, and thus potentially do better than a local search
heuristic? This is the question addressed by two heuristics that we first discuss briefly:
Tabu search
Simulated annealing
After this we deal with
Genetic algorithms
Rather than working with individual solutions, genetic algorithms work with a finite population (set of solutions) S_1, ..., S_k, and the population evolves (changes somewhat randomly) from one generation (iteration) to the next.
Finally we discuss two special purpose heuristics for the STSP problem and some heuristics
for MIPs.
Tabu Search
In order to escape from local minima one has to accept every now and then a solution with a worse value than the incumbent. Since the (old and better) incumbent may belong to the neighborhood of the new incumbent, it may happen that cycling occurs, i.e., that the algorithm returns to the same solution every two or three steps: S_0 → S_1 → S_0 → S_1 → .... To avoid cycling, certain solutions or moves are forbidden, or tabu. Comparing the new solution with all previous incumbents would require much memory space and may be very time consuming. Instead, a tabu list of recent solutions, or solution modifications, is kept. A basic version of the algorithm is:

Step 1: Initialize a tabu list.
Step 2: Get an initial solution S.
Step 3: While the stopping criterion is not satisfied:
3.1: Choose a subset Q′(S) ⊆ Q(S) of non-tabu solutions.
3.2: Let S′ = argmin { f(T) : T ∈ Q′(S) }.
3.3: Replace S by S′.
Step 4: On termination, the best solution found is the heuristic solution.
Tabu Search (cont.)
The parameters specific to tabu search are:

(i) The choice of Q′(S). If Q(S) is small, one takes the whole neighborhood. Otherwise, Q′(S) can be a fixed number of neighbors of S, chosen randomly or by some heuristic rule.
(ii) The tabu list consists of a small number t of most recent solutions (or modifications). If t = 1 or t = 2, it is not surprising that cycling is still common. The magic value t = 7 is often a good choice.
(iii) The stopping rule is often just a fixed number of iterations, or a certain number of iterations without any improvement of the goal value of the best solution found.
Tabu Search (cont.)
If the neighborhood consists of single element switches:

Q(S) = { T ⊆ N : |T Δ S| = 1 },  S ⊆ N,

then the tabu list might be a list of the last t elements {i_1, ..., i_t} added to the incumbent and a list of the last t elements {j_1, ..., j_t} removed from the incumbent. A neighbor T is then tabu if T = S \ {i_q} or if T = S ∪ {j_q} for some q = 1, ..., t. So, in forming the new incumbent, one is not allowed to modify the current incumbent S by removing some i_q or by adding some j_q.

When implementing tabu search, the performance can be improved by using common sense. E.g., there is no justification to make a solution tabu if it is the best solution found so far. In other words, tabu search can be viewed as a search strategy that tries to take advantage of the history of the search and the problem structure intelligently.
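As an illustration, here is a minimal tabu-search sketch (my own code and parameter choices, not the course's) for the UFL instance used earlier, with single-element add/remove moves and a tabu list of the last t moved elements. Note how the search is forced to accept worse solutions and wander, while the best incumbent is remembered.

```python
from collections import deque

C = [[6, 2, 3, 4],
     [1, 9, 4, 11],
     [15, 2, 6, 3],
     [9, 11, 4, 8],
     [7, 23, 2, 9],
     [4, 3, 1, 5]]
F = [21, 16, 11, 24]

def f(S):
    """Goal value: assignment plus fixed cost; the infeasible empty set gets +infinity."""
    if not S:
        return float("inf")
    return sum(min(row[j] for j in S) for row in C) + sum(F[j] for j in S)

def tabu_search(iters=20, t=3):
    S = frozenset({0})                  # arbitrary initial solution: open depot 0
    best = S
    tabu = deque(maxlen=t)              # the last t moved elements are tabu
    for _ in range(iters):
        # neighbors: add or remove one element whose move is not tabu
        moves = [(j, S ^ {j}) for j in range(len(F)) if j not in tabu]
        j, S = min(moves, key=lambda m: f(m[1]))   # may be worse than the incumbent!
        tabu.append(j)
        if f(S) < f(best):
            best = S
    return best, f(best)

S, value = tabu_search()
print(set(S), value)  # best solution found: {2} (depot 3), cost 31
```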
Simulated Annealing
Simulated annealing is less direct, and less "intelligent". The basic idea is to choose a neighbor randomly. The neighbor replaces the incumbent with probability 1 if it is better, and with some (changing) probability in (0, 1) if it has a worse value.

The probability of accepting a worse solution is taken proportional to the difference in goal values. So, if the number of iterations is large enough, one can escape from every local minimum. On the other hand, to guarantee convergence of the algorithm, the probability of accepting worse solutions decreases over time. A more formal description is as follows:

Step 1: Get an initial solution S.
Step 2: Get an initial temperature T and a cooling ratio r (0 < r < 1).
Step 3: While not yet frozen, repeat:
3.1: Do the following L times: pick S′ ∈ Q(S) randomly; if Δ = f(S′) − f(S) ≤ 0, replace S by S′; if Δ > 0, replace S by S′ with probability e^(−Δ/T).
3.2: Set T ← rT. (Reduce the temperature.)
Step 4: Return the best solution found; this is the heuristic solution.
Simulated Annealing (cont.)
Note that the probability of accepting worse solutions decreases in time, because the temperature T decreases in time. Also, the probability decreases if Δ increases, i.e., if the quality of the solution becomes worse.

Just as for other local search heuristics, one has to define an initial solution, a neighborhood for each solution, and the value f(S) of a solution. The parameters specific to SA are:

(i) The initial temperature T.
(ii) The cooling ratio r.
(iii) The loop length L.
(iv) The definition of frozen, i.e., a stopping rule.
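A minimal sketch of the scheme, assuming a 2-exchange neighborhood on the 6-city STSP instance from the earlier examples (the parameter values, seeding, and "frozen" threshold are my own choices):

```python
import math
import random

PAIRS = {(1, 2): 9, (1, 3): 2, (1, 4): 8, (1, 5): 12, (1, 6): 11,
         (2, 3): 7, (2, 4): 19, (2, 5): 10, (2, 6): 32,
         (3, 4): 29, (3, 5): 18, (3, 6): 6,
         (4, 5): 24, (4, 6): 3, (5, 6): 19}
D = {}
for (a, b), d in PAIRS.items():
    D[a, b] = D[b, a] = d

def tour_len(t):
    return sum(D[t[i], t[(i + 1) % len(t)]] for i in range(len(t)))

def simulated_annealing(tour, T=30.0, r=0.9, L=20, frozen=0.1, seed=1):
    rng = random.Random(seed)
    tour = list(tour)
    best = tour
    while T > frozen:                  # "frozen" = temperature below a threshold
        for _ in range(L):
            i, j = sorted(rng.sample(range(len(tour)), 2))
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # random 2-exchange
            delta = tour_len(cand) - tour_len(tour)
            # always accept improvements; accept a worse tour with prob e^(-delta/T)
            if delta <= 0 or rng.random() < math.exp(-delta / T):
                tour = cand
                if tour_len(tour) < tour_len(best):
                    best = tour
        T *= r                         # cooling step: T <- rT
    return best

start = [1, 2, 3, 4, 5, 6]
best = simulated_annealing(start)
print(best, tour_len(best))
```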
Genetic Algorithms
Rather than working with individual solutions, genetic algorithms work with a finite population (set of solutions) S_1, ..., S_k, and the population evolves (changes somewhat randomly) from one generation (iteration) to the next. An iteration consists of the following steps:

(i) Evaluation. The fitness of the individuals is evaluated.
(ii) Parent Selection. Certain pairs of solutions (parents) are selected based on their fitness.
(iii) Crossover. Each pair of parents combines to produce one or two new solutions (offspring).
(iv) Mutation. Some of the offspring solutions are randomly modified.
(v) Population Selection. Based on their fitness, a new population is selected, replacing some or all of the original population by an identical number of offspring solutions.
Each of the five steps is discussed in more detail in the book.
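A toy sketch of the five steps, on a small 0-1 knapsack instance of my own (with fitness 0 for infeasible individuals, truncation selection, one-point crossover, bit-flip mutation, and elitist replacement; all of these design choices are illustrative, not the book's):

```python
import random

# toy 0-1 knapsack instance of my own: values c, weights a, capacity b
c, a, b = [10, 7, 6, 4], [5, 4, 3, 2], 8

def fitness(x):
    """(i) Evaluation: feasible value; infeasible individuals get fitness 0."""
    weight = sum(ai * xi for ai, xi in zip(a, x))
    return sum(ci * xi for ci, xi in zip(c, x)) if weight <= b else 0

def evolve(pop_size=8, gens=30, pmut=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in c] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]              # (ii) parent selection: best half
        kids = []
        for p, q in zip(parents[::2], parents[1::2]):
            k = rng.randrange(1, len(c))           # (iii) one-point crossover
            kids += [p[:k] + q[k:], q[:k] + p[k:]]
        for kid in kids:                           # (iv) mutation: random bit flips
            for i in range(len(kid)):
                if rng.random() < pmut:
                    kid[i] = 1 - kid[i]
        pop = pop[:pop_size - len(kids)] + kids    # (v) elitist population selection
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Because the best individuals are always carried over in step (v), the best fitness in the population never decreases from one generation to the next.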
Worst-case Analysis of (some) Heuristics
Integer Knapsack Heuristic
We consider the integer knapsack problem

z = max { Σ_{j=1}^n c_j x_j : Σ_{j=1}^n a_j x_j ≤ b, x ∈ Z₊^n },

where a_j ∈ Z₊ for all j and b ∈ Z₊. We suppose that the variables are ordered so that c_j/a_j is non-increasing and, moreover, that a_j ≤ b for all j. We consider the following simple heuristic: take x_1^H = ⌊b/a_1⌋ and x_j^H = 0 for j ≥ 2, yielding the value z^H = c_1 ⌊b/a_1⌋.

Theorem 1. z^H ≥ (1/2) z.

Proof: The solution of the linear relaxation is x_1 = b/a_1, and x_j = 0 for j ≥ 2. This provides the upper bound z_LP = c_1 b/a_1 ≥ z. As a_1 ≤ b, we have ⌊b/a_1⌋ ≥ 1. Setting b/a_1 = ⌊b/a_1⌋ + f, we have 0 ≤ f < 1 ≤ ⌊b/a_1⌋, so that

z ≤ z_LP = c_1 (⌊b/a_1⌋ + f) ≤ 2 c_1 ⌊b/a_1⌋ = 2 z^H.
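The guarantee can be checked numerically; the instance below is my own (it satisfies the ordering and a_j ≤ b assumptions), and the brute-force solver is for verification only.

```python
from itertools import product

def knapsack_heuristic(c, a, b):
    """z^H = c_1 * floor(b / a_1); assumes c_j/a_j non-increasing and a_j <= b."""
    return c[0] * (b // a[0])

def knapsack_exact(c, a, b):
    """Brute force over all bounded solutions, for verification only."""
    best = 0
    for x in product(*(range(b // aj + 1) for aj in a)):
        if sum(aj * xj for aj, xj in zip(a, x)) <= b:
            best = max(best, sum(cj * xj for cj, xj in zip(c, x)))
    return best

# my own toy instance: c/a = 1.4, 1.33..., 1.0 (non-increasing), all a_j <= b
c, a, b = [7, 4, 3], [5, 3, 3], 9
zH, z = knapsack_heuristic(c, a, b), knapsack_exact(c, a, b)
print(zH, z)  # 7 12 -- and indeed 2 * 7 >= 12, as Theorem 1 guarantees
```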
Euclidean STSP

We say that an STSP in a graph G = (V, E) is Euclidean if for any three edges e, f and g forming a triangle, one has the so-called triangle inequality: c_e ≤ c_f + c_g.

Proposition 1. Let G be a complete graph whose edge lengths satisfy the triangle inequality, and let G contain a subgraph H = (V, E′) which is Eulerian. Then G contains a Hamilton cycle of length at most c(H) = Σ_{e∈E′} c_e.

Proof: Note that any Eulerian circuit in H passes exactly once through every edge in E′, and hence has length Σ_{e∈E′} c_e. Any such circuit C passes through all the nodes in V, possibly more than once through some or all of them. Using C we construct a Hamilton circuit as follows. We choose a node v_1 ∈ V, and starting at that node we walk along the edges of C (in one of the two possible directions). The first new node that we encounter after leaving v_1 is called v_2. After leaving v_2, the first node different from v_1 and v_2 is called v_3, and so on. Proceeding in this way we get a sequence v_1, v_2, ..., v_n containing all the nodes in V. We claim that the tour T: v_1 → v_2 → ... → v_n → v_1 has length at most Σ_{e∈E′} c_e. For this, two observations are necessary. First, G contains the edges {v_i, v_{i+1}} for i = 1 to i = n − 1 and also the edge {v_n, v_1}, since G is a complete graph. So T is indeed a tour. Second, the nodes on T partition the Euler circuit C into n subsequent edge-disjoint pieces. Each edge on the tour T is a shortcut for the corresponding piece, due to the triangle inequality. Thus the length of the tour is at most equal to the length of C.
Geometric proof
We consider the piece of the Eulerian tour between the nodes v_i and v_{i+1} of the Hamilton tour. It will be convenient to call the nodes on the corresponding piece of the Eulerian tour 1, 2, ..., k. So 1 ≡ v_i and k ≡ v_{i+1}. The figure depicts the situation for k = 6; the red edges are part of the Eulerian tour.

[Figure: nodes 1, ..., 6 on a piece of the Eulerian tour, with the shortcut edge 1-6.]

Using the triangle inequality we show that the shortcut 1-6 is shorter than the path 1-2-3-4-5-6:

c_16 ≤ c_15 + c_56
     ≤ c_14 + c_45 + c_56
     ≤ c_13 + c_34 + c_45 + c_56
     ≤ c_12 + c_23 + c_34 + c_45 + c_56.
The Tree Heuristic for Euclidean STSP
Step 1: Find a minimum-length spanning tree T, with edge set E_T and length z_T = Σ_{e∈E_T} c_e.
Step 2: Double each edge in T to form a connected Eulerian graph.
Step 3: Using the previous proposition, taking any Eulerian tour, construct a Hamilton circuit of length z_H.

Proposition 2. z_H ≤ 2z.

Proof: Since every tour consists of a spanning tree plus an edge, we have the lower bound z_T ≤ z. The length of the Eulerian tour is 2z_T, by construction. The construction of the Hamilton circuit guarantees that z_H ≤ 2z_T. Thus we have

z_H / 2 ≤ z_T ≤ z ≤ z_H.
Example of the Tree Heuristic for Euclidean STSP
Below the distances are given between the capitals of 11 of the 12 provinces in the Netherlands.
                  1    2    3    4    5    6    7    8    9   10   11
 1. Amsterdam     0   92  162   57  184   87  132  207  175   40  103
 2. Arnhem       92    0  132  116  157   63  154  151  200   59   66
 3. Assen       162  132    0  214   25  195   68  283  315  159   69
 4. Den Haag     57  116  214    0  236  104  182  162  124   61  151
 5. Groningen   184  157   25  236    0  220   58  308  340  184   94
 6. Den Bosch    87   63  195  104  220    0  215  123  141   53  129
 7. Leeuwarden  132  154   68  182   58  215    0  305  306  162   91
 8. Maastricht  207  151  283  162  308  123  305    0  242  176  217
 9. Middelburg  175  200  315  124  340  141  306  242    0  156  246
10. Utrecht      40   59  159   61  184   53  162  176  156    0   90
11. Zwolle      103   66   69  151   94  129   91  217  246   90    0
We want to find a minimum length Hamilton circuit along these cities.
We first find a minimal weight spanning tree. The minimal weight tree is shown in the graph. It has weight 25 + 40 + 53 + 57 + 58 + 59 + 66 + 69 + 123 + 124 = 674.

[Figure: the minimum weight spanning tree on the 11 cities.]

By doubling each edge in the tree the graph becomes Eulerian. An Eulerian tour is (e.g.) 1-4-9-4-1-10-6-8-6-10-2-11-3-5-7-5-3-11-2-10-1, which has length 1348. From this we obtain the Hamilton circuit 1-4-9-10-6-8-2-11-3-5-7-1, with length 57 + 124 + 156 + 53 + 123 + 151 + 66 + 69 + 25 + 58 + 132 = 1014.

N.B. When applying 2-exchanges to the above tour the length reduces to 1008. The farthest neighbor insertion heuristic, when started at node 10, yields a tour of length 1032. When applying 2-exchanges this length reduces to 990. This is optimal: 1-10-4-9-6-8-2-11-3-5-7-1. For the nearest neighborhood insertion heuristic, starting at node 8, the length is 1045, which can be reduced to 1032 by 2-exchanges.
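The tree heuristic can be sketched in Python on this 11-city instance (my own implementation; a preorder walk of the MST corresponds to shortcutting a doubled-tree Eulerian tour, so its length is at most 2z_T given the triangle inequality).

```python
# road distances between the 11 provincial capitals (0-based: 0 = Amsterdam, ...)
D = [
    [0, 92, 162, 57, 184, 87, 132, 207, 175, 40, 103],
    [92, 0, 132, 116, 157, 63, 154, 151, 200, 59, 66],
    [162, 132, 0, 214, 25, 195, 68, 283, 315, 159, 69],
    [57, 116, 214, 0, 236, 104, 182, 162, 124, 61, 151],
    [184, 157, 25, 236, 0, 220, 58, 308, 340, 184, 94],
    [87, 63, 195, 104, 220, 0, 215, 123, 141, 53, 129],
    [132, 154, 68, 182, 58, 215, 0, 305, 306, 162, 91],
    [207, 151, 283, 162, 308, 123, 305, 0, 242, 176, 217],
    [175, 200, 315, 124, 340, 141, 306, 242, 0, 156, 246],
    [40, 59, 159, 61, 184, 53, 162, 176, 156, 0, 90],
    [103, 66, 69, 151, 94, 129, 91, 217, 246, 90, 0],
]

def prim_mst(D):
    """Minimum spanning tree by Prim's algorithm; returns (weight, adjacency lists)."""
    n = len(D)
    in_tree, adj, weight = {0}, {v: [] for v in range(n)}, 0
    while len(in_tree) < n:
        u, v = min(((u, v) for u in in_tree for v in range(n) if v not in in_tree),
                   key=lambda e: D[e[0]][e[1]])
        adj[u].append(v); adj[v].append(u)
        in_tree.add(v); weight += D[u][v]
    return weight, adj

def tree_heuristic(D):
    """Double-tree heuristic: a preorder walk of the MST is the shortcut tour."""
    weight, adj = prim_mst(D)
    tour, seen, stack = [], set(), [0]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v); tour.append(v)
            stack.extend(adj[v])
    length = sum(D[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))
    return weight, tour, length

zT, tour, length = tree_heuristic(D)
print(zT, tour, length)  # the MST has weight 674, so the tour is at most 1348
```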
The Tree/Matching Heuristic for Euclidean STSP
Step 1: Find a minimum-length spanning tree T, with edge set E_T and length z_T = Σ_{e∈E_T} c_e.
Step 2: Let U be the set of odd nodes in (V, E_T). Find a perfect matching M of minimum length z_M in the subgraph G′ = (U, E′) of G induced by the subset U, so E′ contains all edges in G with both end points in U. By construction, (V, E_T ∪ M) is an Eulerian graph.
Step 3: From any Eulerian tour in (V, E_T ∪ M), construct a Hamilton circuit of length z_C.

Proposition 3 (Christofides, 1976). z_C ≤ (3/2) z.

Proof: As above, z_T ≤ z. Let the nodes be ordered such that 1 → 2 → ... → n → 1 is an optimal tour (of length z). Let j_1 < ... < j_{2k} be the nodes of U, and let e_i = {j_i, j_{i+1}}, with j_{2k+1} = j_1. Due to the triangle inequality, one has Σ_{i=1}^{2k} c_{e_i} ≤ z. The sets M_1 = {e_i : i odd} and M_2 = {e_i : i even} are matchings in G′ = (U, E′). Hence z_M ≤ z_{M_1} and z_M ≤ z_{M_2}. Consequently, since z_{M_1} + z_{M_2} = Σ_{i=1}^{2k} c_{e_i} ≤ z, we get 2z_M ≤ z. Hence either z_{M_1} ≤ (1/2)z, or z_{M_2} ≤ (1/2)z. Suppose z_{M_1} ≤ (1/2)z. Then (V, E_T ∪ M_1) is Eulerian, and has weight ≤ (3/2)z. As we have seen before, we can then construct a Hamiltonian circuit of length z_C ≤ (3/2)z.
Geometric proof
We are given an optimal Hamilton circuit C: 1 → 2 → ... → n → 1 of length z in the graph G = (V, E), and the set U of nodes having odd degree in a minimum weight spanning tree T of G. The nodes in U partition C into pieces; we connect these nodes by the edges that are shortcuts of these pieces, as indicated in the figure below. These edges form a circuit with an even number of edges, because the number of nodes in U is even. We alternately assign the edges to the sets M_1 (red) and M_2 (green). Then M_1 and M_2 are matchings in the subgraph of G induced by U.

[Figure: circuit through the odd-degree nodes u_1, ..., u_6, with its edges alternately assigned to M_1 and M_2.]

Obviously, due to the triangle inequality, z_{M_1} + z_{M_2} ≤ z. Hence either z_{M_1} ≤ (1/2)z, or z_{M_2} ≤ (1/2)z. By adding the shortest matching to T we get an Eulerian graph whose Eulerian circuits have length at most z_T + (1/2)z ≤ (3/2)z.
Example of the Tree/Matching Heuristic for Euclidean STSP
We use the minimal weight spanning tree that we found earlier.

[Figure: the minimum weight spanning tree on the 11 cities.]

The odd degree nodes in the tree form the set {7, 8, 9, 10}. The induced subgraph on these nodes is depicted below:

[Figure: the induced subgraph on {7, 8, 9, 10} with its edge lengths.]

The minimal weight matching consists of the arcs {8, 9} and {7, 10}. Adding these arcs to the tree, the graph becomes Eulerian. An Eulerian tour is (e.g.) 1-4-9-8-6-10-2-11-3-5-7-10-1. From this we obtain the Hamilton circuit 1-4-9-8-6-10-2-11-3-5-7-1, with length 57 + 124 + 242 + 123 + 53 + 59 + 66 + 69 + 25 + 58 + 132 = 1008. This route is 2-optimal. Note that we know that a minimum length tour cannot be shorter than (2/3) · 1008 = 672.
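The whole tree/matching heuristic fits in a short sketch (my own implementation; since U has only four nodes here, a brute-force minimum perfect matching suffices, and Hierholzer's algorithm produces the Eulerian tour). A different Eulerian tour than the one above may be traced, so the resulting length is only guaranteed to be at most z_T + z_M = 674 + 404 = 1078.

```python
D = [
    [0, 92, 162, 57, 184, 87, 132, 207, 175, 40, 103],
    [92, 0, 132, 116, 157, 63, 154, 151, 200, 59, 66],
    [162, 132, 0, 214, 25, 195, 68, 283, 315, 159, 69],
    [57, 116, 214, 0, 236, 104, 182, 162, 124, 61, 151],
    [184, 157, 25, 236, 0, 220, 58, 308, 340, 184, 94],
    [87, 63, 195, 104, 220, 0, 215, 123, 141, 53, 129],
    [132, 154, 68, 182, 58, 215, 0, 305, 306, 162, 91],
    [207, 151, 283, 162, 308, 123, 305, 0, 242, 176, 217],
    [175, 200, 315, 124, 340, 141, 306, 242, 0, 156, 246],
    [40, 59, 159, 61, 184, 53, 162, 176, 156, 0, 90],
    [103, 66, 69, 151, 94, 129, 91, 217, 246, 90, 0],
]

def prim_mst(D):
    """Minimum spanning tree by Prim's algorithm; returns (weight, adjacency lists)."""
    n = len(D)
    in_tree, adj, weight = {0}, {v: [] for v in range(n)}, 0
    while len(in_tree) < n:
        u, v = min(((u, v) for u in in_tree for v in range(n) if v not in in_tree),
                   key=lambda e: D[e[0]][e[1]])
        adj[u].append(v); adj[v].append(u)
        in_tree.add(v); weight += D[u][v]
    return weight, adj

def min_matching(U, D):
    """Brute-force minimum perfect matching on a small node set U (|U| even)."""
    if not U:
        return [], 0
    u, best, best_w = U[0], None, float("inf")
    for v in U[1:]:
        pairs, w = min_matching([x for x in U if x not in (u, v)], D)
        if w + D[u][v] < best_w:
            best, best_w = [(u, v)] + pairs, w + D[u][v]
    return best, best_w

def euler_circuit(adj, start=0):
    """Hierholzer's algorithm on an adjacency-list multigraph (works on a copy)."""
    adj = {v: list(ns) for v, ns in adj.items()}
    stack, circuit = [start], []
    while stack:
        v = stack[-1]
        if adj[v]:
            u = adj[v].pop()
            adj[u].remove(v)   # remove one copy of the reverse edge
            stack.append(u)
        else:
            circuit.append(stack.pop())
    return circuit

def tree_matching(D):
    n = len(D)
    zT, adj = prim_mst(D)
    U = sorted(v for v in range(n) if len(adj[v]) % 2 == 1)   # odd-degree nodes
    M, zM = min_matching(U, D)
    for u, v in M:
        adj[u].append(v); adj[v].append(u)   # now every degree is even
    tour, seen = [], set()
    for v in euler_circuit(adj):             # shortcut repeated nodes
        if v not in seen:
            seen.add(v); tour.append(v)
    length = sum(D[tour[i]][tour[(i + 1) % n]] for i in range(n))
    return zT, zM, tour, length

zT, zM, tour, length = tree_matching(D)
print(zT, zM, length)  # z_T = 674 and z_M = 404, as in the example
```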
MIP-based Heuristics
Dive-and-Fix
Aim: quickly find a feasible solution, which gives a tight lower bound in the search tree.

Suppose we have a mixed 0-1 problem in variables x_i ∈ R and y_j ∈ {0, 1}. Given a solution (x̄, ȳ) of the linear relaxation, let

F = { j : ȳ_j ∉ {0, 1} }.

Initialization: Take the solution (x̄, ȳ) of the linear relaxation at some node in the search tree.

Basic Iteration: As long as F ≠ ∅, do the following. Let i = argmin_{j∈F} min{ ȳ_j, 1 − ȳ_j } (find the fractional variable closest to integer). If ȳ_i < 1/2, fix y_i = 0; otherwise fix y_i = 1. Resolve the linear relaxation with this variable fixed. If it is infeasible, the heuristic fails; otherwise, let (x̄, ȳ) be its solution, update F, and repeat.
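The LP re-solves require an LP solver, so the sketch below (with hypothetical data and names of my own) only shows the selection-and-rounding step of the basic iteration.

```python
def dive_and_fix_step(y_bar, fixed):
    """One basic iteration of dive-and-fix, without the LP re-solve: pick the
    fractional binary variable closest to an integer and round (fix) it."""
    F = [j for j in y_bar if j not in fixed and y_bar[j] not in (0.0, 1.0)]
    if not F:
        return None                 # all binary variables are integral: done
    i = min(F, key=lambda j: min(y_bar[j], 1 - y_bar[j]))
    fixed[i] = 0 if y_bar[i] < 0.5 else 1
    # in the full heuristic one would now re-solve the LP with y_i fixed
    return i

# hypothetical LP-relaxation values for three binary variables (made-up data)
y_bar = {1: 0.3, 2: 0.55, 3: 0.9}
fixed = {}
i = dive_and_fix_step(y_bar, fixed)
print(i, fixed)  # 3 {3: 1} -- y_3 = 0.9 is closest to an integer, so it is fixed at 1
```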
We discuss two other heuristics for mixed integer problems that are hard to solve. For simplicity we describe the heuristics for an IP of the following form:

z = max { c_1^T x_1 + c_2^T x_2 : A_1 x_1 + A_2 x_2 = b, x_1 ∈ Z₊^{n_1}, x_2 ∈ Z₊^{n_2} }.

It is supposed that the variables x_{1j} for j ∈ N_1 are more important than the variables x_{2j} for j ∈ N_2, where |N_i| = n_i for i = 1, 2.

The idea is to solve two (or more) easier LO problems or MIPs. The first one allows us to fix or limit the range of the more important x_1 variables, whereas the second allows us to choose good values for the x_2 variables.
Relax-and-Fix
z = max { c_1^T x_1 + c_2^T x_2 : A_1 x_1 + A_2 x_2 = b, x_1 ∈ Z₊^{n_1}, x_2 ∈ Z₊^{n_2} }.

Relax: Solve the relaxation

z̄ = max { c_1^T x_1 + c_2^T x_2 : A_1 x_1 + A_2 x_2 = b, x_1 ∈ Z₊^{n_1}, x_2 ∈ R₊^{n_2} }.

Let (x̄_1, x̄_2) be a solution of this problem.

Fix: Fix the (important) x_1 variables to their value in x̄_1 and solve the restriction

z̲ = max { c_1^T x_1 + c_2^T x_2 : A_2 x_2 = b − A_1 x̄_1, x_1 = x̄_1, x_2 ∈ Z₊^{n_2} }.

If this problem is infeasible, the heuristic fails. Otherwise, let (x̄_1, x̃_2) be a solution of this problem.

Heuristic: Use as heuristic solution x^H = (x̄_1, x̃_2), whose value satisfies z̲ = c^T x^H ≤ z ≤ z̄.
Cut-and-Fix
z = max { c_1^T x_1 + c_2^T x_2 : A_1 x_1 + A_2 x_2 = b, x_1 ∈ Z₊^{n_1}, x_2 ∈ Z₊^{n_2} }.

We assume that a strong cutting plane algorithm is available, which generates strong cuts C_1 x_1 + C_2 x_2 ≤ d, so that after adding these cuts at least some of the integer variables take values close to integer or to their optimal values.

Cut: Using a strong cutting plane algorithm, find a solution of the tight linear relaxation

z̄ = max { c_1^T x_1 + c_2^T x_2 : A_1 x_1 + A_2 x_2 = b, C_1 x_1 + C_2 x_2 ≤ d, x_1 ∈ R₊^{n_1}, x_2 ∈ R₊^{n_2} }.

Let (x̄_1, x̄_2) be a solution of this problem. N.B. The cuts are generated in the course of the algorithm solving the linear relaxation of the given problem.

Fix or Bound: Choose ε. For j ∈ N_1, set ℓ_j = ⌊x̄_{1j} + ε⌋ and u_j = ⌈x̄_{1j} − ε⌉. Solve the restriction

z̲ = max { c_1^T x_1 + c_2^T x_2 : A_1 x_1 + A_2 x_2 = b, C_1 x_1 + C_2 x_2 ≤ d, ℓ ≤ x_1 ≤ u, x_1 ∈ Z₊^{n_1}, x_2 ∈ Z₊^{n_2} }.

If this problem is infeasible, the heuristic fails: one may try again with ε decreased. Otherwise, let (x̃_1, x̃_2) be a solution of this problem.

Heuristic: Use as heuristic solution x^H = (x̃_1, x̃_2), whose value satisfies z̲ = c^T x^H ≤ z ≤ z̄.

Observe that if ε is small and positive, x_1 variables taking values within ε of integer in the linear relaxation are fixed in the restricted problem, while the others are forced to either the value ⌊x̄_{1j}⌋ or the value ⌈x̄_{1j}⌉. On the other hand, if ε is negative, all the x_1 variables can still take two values in the restricted problem.
More Courses on Optimization
Code       Name                                             Lecturer
WI3 031    Niet-Lineaire Optimalisering                     C. Roos
WI4 051TU  Introduction to OR                               H. van Maaren
WI4 060    Optimization and Engineering                     C. Roos
WI4 062TU  Transportation, Routing and Scheduling Problems  C. Roos
WI4 063TU  Network Models and Algorithms                    J.B.M. Melissen
WI4 064    Discrete Optimization                            C. Roos
WI4 087TU  Optimization, Models and Algorithms              H. van Maaren
WI4 131    Discrete and Continuous Optimization             G.J. Olsder/C. Roos
IN4 082    Local (Heuristic) Search Methods                 H. van Maaren
IN4 077    Computational Logic and Satisfiability           H. van Maaren/C. Witteveen
IN4 081    Randomized Algorithms                            H. van Maaren