Unifying Local and Exhaustive Search
John Hooker, Carnegie Mellon University, September 2005

Transcript
Page 1: Unifying Local and Exhaustive Search

Unifying Local and Exhaustive Search

John Hooker

Carnegie Mellon University

September 2005

Page 2: Unifying Local and Exhaustive Search

Exhaustive vs. Local Search

• They are generally regarded as very different.
• Exhaustive methods examine every possible solution, at least implicitly.
  – Branch and bound, Benders decomposition.
• Local search methods typically examine only a portion of the solution space.
  – Simulated annealing, tabu search, genetic algorithms, GRASP (greedy randomized adaptive search procedure).

Page 3: Unifying Local and Exhaustive Search

Exhaustive vs. Local Search

• However, exhaustive and local search are often closely related.
• “Heuristic algorithm” = “search algorithm”
  – “Heuristic” is from the Greek εὑρίσκειν (to search, to find).
• Two classes of exhaustive search methods are very similar to corresponding local search methods:
  – Branching methods.
  – Nogood-based search.

Page 4: Unifying Local and Exhaustive Search

Type of search: Branching
  – Exhaustive search examples: Branch and bound; DPL for SAT
  – Local search examples: Simulated annealing; GRASP

Type of search: Nogood-based
  – Exhaustive search examples: Benders decomposition; DPL with clause learning; Partial-order dynamic backtracking
  – Local search examples: Tabu search

Page 5: Unifying Local and Exhaustive Search

Why Unify Exhaustive & Local Search?

• Encourages design of algorithms that have several exhaustive and inexhaustive options.
  – Can move from exhaustive to inexhaustive options as problem size increases.
• Suggests how techniques used in exhaustive search can carry over to local search.
  – And vice versa.

Page 6: Unifying Local and Exhaustive Search

Why Unify Exhaustive & Local Search?

• We will use an example (traveling salesman problem with time windows) to show:
  – Exhaustive branching can suggest a generalization of a local search method (GRASP).
    • The bounding mechanism in branch and bound also carries over to generalized GRASP.
  – Exhaustive nogood-based search can suggest a generalization of a local search method (tabu search).

Page 7: Unifying Local and Exhaustive Search

Outline

• Branching search.
  – Generic algorithm (exhaustive & inexhaustive)
  – Exhaustive example: branch and bound
  – Inexhaustive examples: simulated annealing, GRASP
  – Solving TSP with time windows using exhaustive branching & generalized GRASP
• Nogood-based search.
  – Generic algorithm (exhaustive & inexhaustive)
  – Exhaustive example: Benders decomposition
  – Inexhaustive example: tabu search
  – Solving TSP with time windows using exhaustive nogood-based search and generalized tabu search

Page 8: Unifying Local and Exhaustive Search

Branching Search

• Each node of the branching tree corresponds to a restriction P of the original problem.
  – Restriction = constraints are added.
• Branch by generating restrictions of P.
  – Add a new leaf node for each restriction.
• Keep branching until the problem is “easy” to solve.
• Notation:
  – feas(P) = feasible set of P
  – relax(P) = a relaxation of P

Page 9: Unifying Local and Exhaustive Search

Branching Search Algorithm

• Repeat while leaf nodes remain:
  – Select a problem P at a leaf node.
  – If P is “easy” to solve then
    • If the solution of P is better than the previous best solution, save it.
    • Remove P from tree.
  – Else
    • If the optimal value of relax(P) is better than the previous best solution, then
      – If the solution of relax(P) is feasible for P, then P is “easy”; save the solution and remove P from tree.
      – Else branch.
    • Else
      – Remove P from tree.
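The loop above can be sketched in code. This is a minimal illustration, not from the talk; the problem representation and all helper functions (`is_easy`, `solve`, `solve_relaxation`, `feasible`, `branch`, `better`) are hypothetical placeholders supplied by the caller.

```python
def branching_search(root, is_easy, solve, solve_relaxation,
                     feasible, branch, better):
    """Generic branching search: keep a list of leaf problems, solve the
    easy ones, and bound-and-branch on the hard ones."""
    leaves = [root]
    best = None
    while leaves:
        P = leaves.pop()                    # select a problem at a leaf node
        if is_easy(P):
            x = solve(P)
            if x is not None and better(x, best):
                best = x                    # save improved solution
            continue                        # remove P from tree
        x = solve_relaxation(P)             # bounding step
        if x is None or not better(x, best):
            continue                        # prune: relaxation is no better
        if feasible(x, P):
            best = x                        # relaxation solved P exactly
            continue
        leaves.extend(branch(P))            # generate restrictions of P
    return best
```

For instance, instantiated with partial 0/1 assignments as restrictions and the cost of the fixed variables as a relaxation bound, the same loop performs a small branch and bound.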

Page 10: Unifying Local and Exhaustive Search

Branching Search Algorithm

• To branch:
  – If the set of restrictions P1, …, Pk of P generated so far is complete, then
    • Remove P from tree.
  – Else
    • Generate new restrictions Pk+1, …, Pm and leaf nodes for them.

Page 11: Unifying Local and Exhaustive Search

Branching Search Algorithm

• To branch:
  – If the set of restrictions P1, …, Pk of P generated so far is complete, then
    • Remove P from tree.
  – Else
    • Generate new restrictions Pk+1, …, Pm and leaf nodes for them.
• Exhaustive vs. heuristic algorithm:
  – In exhaustive search, “complete” = exhaustive:  feas(P1) ∪ … ∪ feas(Pk) ⊇ feas(P)
  – In a heuristic algorithm, “complete” ≠ exhaustive.

Page 12: Unifying Local and Exhaustive Search

[Search-tree diagram: original problem at the root, current leaf node P, previously removed nodes.]

Exhaustive search: Branch and bound. Every restriction P is initially too “hard” to solve. So, solve the LP relaxation. If the LP solution is feasible for P, then P is “easy.”

Page 13: Unifying Local and Exhaustive Search

[Search-tree diagram: original problem at the root; current leaf node P has been split into restrictions P1, P2, …, Pk; previously removed nodes shown elsewhere in the tree.]

Exhaustive search: Branch and bound. Every restriction P is initially too “hard” to solve. So, solve the LP relaxation. If the LP solution is feasible for P, then P is “easy.”

Page 14: Unifying Local and Exhaustive Search

[Search-tree diagram as before: P split into restrictions P1, P2, …, Pk.]

Create more branches if the value of relax(P) is better than the previous solution and P1, …, Pk are not exhaustive.

Exhaustive search: Branch and bound. Every restriction P is initially too “hard” to solve. So, solve the LP relaxation. If the LP solution is feasible for P, then P is “easy.”

Page 15: Unifying Local and Exhaustive Search

Algorithm: Branch and bound (exhaustive)
  – Restriction P: created by splitting a variable domain.
  – P easy enough to solve? Never; always solve relax(P).
  – relax(P): LP relaxation.

Algorithm: Simulated annealing (inexhaustive) — entries filled in on a later slide.
Algorithm: GRASP (inexhaustive), greedy phase and local search phase — entries filled in on a later slide.

Page 16: Unifying Local and Exhaustive Search

Original problem

[Search-tree diagram: original problem at the root; restrictions P1, P2, …, Pk below, with solution = x; current leaf node was generated because {P1, …, Pk} is not complete; previously removed nodes.]

Heuristic algorithm: Simulated annealing. The search tree has 2 levels. Second-level problems are always “easy” to solve by searching the neighborhood of the previous solution.

Page 17: Unifying Local and Exhaustive Search

Original problem

[Search-tree diagram as before: two levels; current leaf generated because {P1, …, Pk} is not complete.]

feas(P) = neighborhood of x.
Randomly select y ∈ feas(P).
Solution of P = y if y is better than x; otherwise, y with probability p, x with probability 1 − p.

Heuristic algorithm: Simulated annealing. The search tree has 2 levels. Second-level problems are always “easy” to solve by searching the neighborhood of the previous solution.
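The acceptance rule above is the heart of simulated annealing. Here is a generic sketch of that two-level view, not from the talk: each restriction is the neighborhood of the current solution, is always “easy,” and is solved by examining one random neighbor. The Metropolis-style acceptance probability exp((cost(x) − cost(y)) / temp) is an assumption standing in for the slide's unspecified p.

```python
import math
import random

def simulated_annealing(x0, neighbors, cost, temp=1.0, cooling=0.95,
                        iters=1000, seed=0):
    """Two-level branching view of SA: feas(P) = neighborhood of x,
    solve P by picking a random y in feas(P)."""
    rng = random.Random(seed)
    x = best = x0
    for _ in range(iters):
        y = rng.choice(neighbors(x))       # randomly select y in feas(P)
        if cost(y) <= cost(x):
            x = y                          # better (or equal): accept
        elif rng.random() < math.exp((cost(x) - cost(y)) / temp):
            x = y                          # worse: accept with probability p
        if cost(x) < cost(best):
            best = x                       # remember incumbent
        temp *= cooling                    # cooling shrinks p over time
    return best
```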

Page 18: Unifying Local and Exhaustive Search

Algorithm: Branch and bound (exhaustive)
  – Restriction P: created by splitting a variable domain.
  – P easy enough to solve? Never; always solve relax(P).
  – relax(P): LP relaxation.

Algorithm: Simulated annealing (inexhaustive)
  – Restriction P: created by defining neighborhood of previous solution.
  – P easy enough to solve? Always; examine a random element of the neighborhood.
  – relax(P): not used.

Algorithm: GRASP (inexhaustive), greedy phase and local search phase — entries filled in on a later slide.

Page 19: Unifying Local and Exhaustive Search

Heuristic algorithm: GRASP (greedy randomized adaptive search procedure)

[Branching diagram: from the original problem, greedy assignments x1 = v1, x2 = v2, x3 = v3 create a path of “branches” down to leaf P.]

Greedy phase: select randomized greedy values until all variables are fixed. The resulting leaf is “easy” to solve; solution = v. Interior nodes are “hard” to solve: relax(P) contains no constraints.

Page 20: Unifying Local and Exhaustive Search

Original problem

Heuristic algorithm: GRASP

[Branching diagram as before, with a second level of restrictions P1, P2, …, Pk below; feas(P2) = neighborhood of v.]

Local search phase: feas(P2) = neighborhood of v.

Greedy phase: select randomized greedy values until all variables are fixed. The resulting leaf is “easy” to solve; solution = v. Interior nodes are “hard” to solve: relax(P) contains no constraints.

Page 21: Unifying Local and Exhaustive Search

Original problem

PP1 P2Pk

....

Stop local search when “complete” set

of neighborhoods have been searched. Process now starts

over with new greedy phase.

feas(P2) = neighborhood of v

Heuristic algorithm:GRASP

Greedy randomized adaptive search procedure

Local search phase

x1 = v1

x2 = v2

x3 = v3

x3 = v3

Greedy phase: select

randomized greedy values

until all variables fixed.

“Easy” to solve. Solution = v.

“Hard” to solve.relax(P) contains no

constraints

P

Page 22: Unifying Local and Exhaustive Search

Algorithm: Branch and bound (exhaustive)
  – Restriction P: created by splitting a variable domain.
  – P easy enough to solve? Never; always solve relax(P).
  – relax(P): LP relaxation.

Algorithm: Simulated annealing (inexhaustive)
  – Restriction P: created by defining neighborhood of previous solution.
  – P easy enough to solve? Always; examine a random element of the neighborhood.
  – relax(P): not used.

Algorithm: GRASP (inexhaustive)
  – Greedy phase:
    • Restriction P: created by assigning the next variable a randomized greedy value.
    • P easy enough to solve? Not until all variables are assigned values.
    • relax(P): no constraints, so its solution is always infeasible and branching is necessary.
  – Local search phase:
    • Restriction P: created by defining neighborhood of previous solution.
    • P easy enough to solve? Always; search the neighborhood.
    • relax(P): not used.

Page 23: Unifying Local and Exhaustive Search

An Example: TSP with Time Windows

• A salesman must visit several cities.
• Find a minimum-length tour that visits each city exactly once and returns to home base.
• Each city must be visited within a time window.

Page 24: Unifying Local and Exhaustive Search

An Example: TSP with Time Windows

[Network diagram: five cities, with A the home base and customers B, C, D, E. Edges are labeled with travel times (5, 8, 7, 5, 7, 4, 3, 5, 6, 6), and each customer has a time window: [20,35], [15,25], [10,30], [25,35].]

Page 25: Unifying Local and Exhaustive Search

Relaxation of TSP

Suppose that customers x0, x1, …, xk have been visited so far. Let tij = travel time from customer i to j. Then the total travel time T of the completed route is bounded below by

  T  ≥  T_xk  +  Σ_{j ∉ {x0, …, xk}} min_{i ≠ j} {tij}  +  min_{j ∉ {x0, …, xk}} {t_{j,x0}}

where T_xk is the earliest time the vehicle can leave customer xk, min_{i ≠ j} {tij} is the minimum time from customer j’s predecessor to j, and min_{j} {t_{j,x0}} is the minimum time from the last customer back to home base.

Page 26: Unifying Local and Exhaustive Search

  T  ≥  T_xk  +  Σ_{j ∉ {x0, …, xk}} min_{i ≠ j} {tij}  +  min_{j ∉ {x0, …, xk}} {t_{j,x0}}

(earliest time the vehicle can leave customer xk, plus the minimum time from each remaining customer j’s predecessor to j, plus the minimum time from the last customer back to home base)

[Diagram: partial route x0 → x1 → x2 → … → xk, with an unvisited customer j still to be reached.]
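The bound above is cheap to compute. Here is a sketch of it on an explicit travel-time matrix; the matrix in the test is hypothetical, not the talk's five-city instance, and waiting time is ignored, so the result remains a valid lower bound.

```python
def tsptw_bound(t, route, depart_time):
    """Lower bound on total travel time of a completed route.
    t[i][j] = travel time i -> j; route = visited customers with
    route[0] the home base; depart_time = earliest time the vehicle
    can leave the last visited customer."""
    n = len(t)
    home = route[0]
    unvisited = [j for j in range(n) if j not in route]
    if not unvisited:                       # route complete: just go home
        return depart_time + t[route[-1]][home]
    # cheapest incoming arc for each unvisited customer j
    into = sum(min(t[i][j] for i in range(n) if i != j) for j in unvisited)
    # cheapest return arc from some unvisited customer to home base
    back = min(t[j][home] for j in unvisited)
    return depart_time + into + back
```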

Page 27: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[Tree root: partial route A…A — the sequence of customers visited.]

Page 28: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[Branch from A…A on the first customer visited: AB…A, AC…A, AD…A, AE…A.]

Page 29: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[From AD…A, branch again: ADB…A, ADC…A, ADE…A.]

Page 30: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[Extend ADC…A to the complete tour ADCBEA: feasible, value = 36.]

Page 31: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[A second complete tour ADCEBA: feasible, value = 34.]

Page 32: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[ADB…A: relaxation value = 36, no better than the incumbent 34 — prune.]

Page 33: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[ADE…A: relaxation value = 40 — prune.]

Page 34: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[AC…A: relaxation value = 31, better than the incumbent — continue in this fashion.]

Page 35: Unifying Local and Exhaustive Search

Exhaustive Branch-and-Bound

[Completed search tree: the optimal solution is ADCEBA, value = 34.]
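The trace above — extend the partial route one customer at a time, bound, and prune — can be written as a compact depth-first branch and bound. This is a sketch; the instance in the test is made up for illustration, not the five-city example of the slides, and the bound is the relaxation described earlier.

```python
def solve_tsptw(t, windows, home=0):
    """Branch and bound for TSP with time windows.
    t[i][j] = travel time; windows[j] = (earliest, latest) service times."""
    n = len(t)
    best = {"value": float("inf"), "route": None}

    def bound(route, time):
        # relaxation: cheapest way into each unvisited customer + return
        unvisited = [j for j in range(n) if j not in route]
        if not unvisited:
            return time + t[route[-1]][home]
        into = sum(min(t[i][j] for i in range(n) if i != j)
                   for j in unvisited)
        back = min(t[j][home] for j in unvisited)
        return time + into + back

    def extend(route, time):
        if len(route) == n:                       # all customers visited
            total = time + t[route[-1]][home]
            if total < best["value"]:
                best["value"], best["route"] = total, route + [home]
            return
        for j in range(n):
            if j in route:
                continue
            arrive = time + t[route[-1]][j]
            lo, hi = windows[j]
            arrive = max(arrive, lo)              # wait for window to open
            if arrive > hi:
                continue                          # window missed: infeasible
            if bound(route + [j], arrive) >= best["value"]:
                continue                          # prune by relaxation
            extend(route + [j], arrive)

    extend([home], 0)
    return best["value"], best["route"]
```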

Page 36: Unifying Local and Exhaustive Search

Algorithm: Branch and bound for TSPTW (exhaustive)
  – Restriction P: created by fixing the next variable.
  – P easy enough to solve? Never; always solve relax(P).
  – relax(P): TSPTW relaxation.

Algorithm: Generalized GRASP for TSPTW (inexhaustive), greedy phase and local search phase — entries filled in on a later slide.

Page 37: Unifying Local and Exhaustive Search

Generalized GRASP

Basically, GRASP = greedy solution + local search. Begin with greedy assignments that can be viewed as creating “branches.”

[Tree root: partial route A…A — the sequence of customers visited.]

Page 38: Unifying Local and Exhaustive Search

Generalized GRASP

Greedy phase: visit the customer that can be served earliest from A.

[Node AD…A added below A…A.]

Page 39: Unifying Local and Exhaustive Search

Generalized GRASP

Greedy phase: next, visit the customer that can be served earliest from D.

[Node ADC…A added.]

Page 40: Unifying Local and Exhaustive Search

Generalized GRASP

Greedy phase: continue until all customers are visited.

[Complete tour ADCBEA: feasible, value = 34. This solution is feasible — save it.]

Page 41: Unifying Local and Exhaustive Search

Generalized GRASP

Local search phase: backtrack randomly.

Page 42: Unifying Local and Exhaustive Search

Generalized GRASP

Local search phase: delete the subtree already traversed.

Page 43: Unifying Local and Exhaustive Search

Generalized GRASP

Local search phase: randomly select a partial solution in the neighborhood of the current node (here ADE…A).

Page 44: Unifying Local and Exhaustive Search

Generalized GRASP

Greedy phase: complete the solution in greedy fashion — ADEBCA, infeasible.

Page 45: Unifying Local and Exhaustive Search

Generalized GRASP

Local search phase: randomly backtrack again.

Page 46: Unifying Local and Exhaustive Search

Generalized GRASP

Continue in a similar fashion (e.g. AB…A, ABD…A, ABDECA — infeasible).

An exhaustive search algorithm (branching search) suggests a generalization of a heuristic algorithm (GRASP).
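The pattern of the slides above — greedy construction, then random backtracking to an earlier node of the implicit branching tree, then greedy completion — can be sketched generically. Everything here is a hypothetical illustration: the caller supplies the randomized greedy builder and the objective.

```python
import random

def generalized_grasp(items, greedy_order, evaluate, rounds=20, seed=0):
    """Generalized GRASP sketch: greedy phase builds a full sequence;
    local search phase randomly backtracks to a prefix and rebuilds
    the rest greedily, keeping the best sequence seen."""
    rng = random.Random(seed)
    seq = greedy_order(list(items), rng)           # initial greedy phase
    best_val, best_seq = evaluate(seq), seq
    for _ in range(rounds):
        k = rng.randrange(1, len(seq))             # backtrack randomly
        prefix = seq[:k]                           # keep a partial solution
        rest = [i for i in items if i not in prefix]
        seq = prefix + greedy_order(rest, rng)     # greedy completion
        val = evaluate(seq)
        if val < best_val:
            best_val, best_seq = val, seq          # save improvement
    return best_val, best_seq
```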

Page 47: Unifying Local and Exhaustive Search

Generalized GRASP with relaxation

Greedy phase. Exhaustive search suggests an improvement on a heuristic algorithm: use relaxation bounds to reduce the search.

[Tree: A…A with child AD…A.]

Page 48: Unifying Local and Exhaustive Search

Generalized GRASP with relaxation

Greedy phase continues: node ADC…A added.

Page 49: Unifying Local and Exhaustive Search

Generalized GRASP with relaxation

Greedy phase reaches the complete tour ADCBEA: feasible, value = 34.

Page 50: Unifying Local and Exhaustive Search

Generalized GRASP with relaxation

Local search phase: backtrack randomly.

Page 51: Unifying Local and Exhaustive Search

Generalized GRASP

Local search phase: ADE…A has relaxation value = 40 — prune.

Page 52: Unifying Local and Exhaustive Search

Generalized GRASP

Local search phase: randomly backtrack again.

Page 53: Unifying Local and Exhaustive Search

Generalized GRASP

Local search phase: AB…A has relaxation value = 38 — prune.

Page 54: Unifying Local and Exhaustive Search

Algorithm: Branch and bound for TSPTW (exhaustive)
  – Restriction P: created by fixing the next variable.
  – P easy enough to solve? Never; always solve relax(P).
  – relax(P): TSPTW relaxation.

Algorithm: Generalized GRASP for TSPTW (inexhaustive)
  – Greedy phase:
    • Restriction P: created by assigning a greedy value to the next variable.
    • P easy enough to solve? Never; always solve relax(P).
    • relax(P): TSPTW relaxation.
  – Local search phase:
    • Restriction P: created by randomly backtracking to a previous node.
    • P easy enough to solve? Yes; randomly select a value for the next variable.
    • relax(P): not used.

Page 55: Unifying Local and Exhaustive Search

Nogood-Based Search

• Search is directed by nogoods.
  – Nogood = constraint that excludes solutions already examined (explicitly or implicitly).
• Next solution examined is a solution of the current nogood set.
  – Nogoods may be processed so that the nogood set is easy to solve.
• Search stops when the nogood set is complete in some sense.

Page 56: Unifying Local and Exhaustive Search

Nogood-Based Search Algorithm

• Let N be the set of nogoods, initially empty.
• Repeat while N is incomplete:
  – Select a restriction P of the original problem.
  – Select a solution x of relax(P) ∪ N.
  – If x is feasible for P then
    • If x is the best solution so far, keep it.
    • Add to N a nogood that excludes x and perhaps other solutions that are no better.
  – Else add to N a nogood that excludes x and perhaps other solutions that are infeasible.
  – Process nogoods in N.
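The loop above can be sketched in its simplest form, where each nogood excludes exactly one examined solution and the candidate set is small enough to enumerate. This is a toy illustration of the control structure, not the talk's method; real nogoods exclude many solutions at once, and the names are placeholders.

```python
def nogood_search(candidates, cost, feasible, max_iters=100):
    """Minimal nogood-based search: repeatedly take the best candidate
    not excluded by the nogood set N, record it if feasible and improving,
    then add a nogood excluding it. N is complete when nothing remains."""
    nogoods = set()
    best_val, best_x = float("inf"), None
    for _ in range(max_iters):
        remaining = [x for x in candidates if x not in nogoods]
        if not remaining:
            break                        # nogood set complete: search done
        x = min(remaining, key=cost)     # solve relax(P) plus nogood set N
        if feasible(x) and cost(x) < best_val:
            best_val, best_x = cost(x), x
        nogoods.add(x)                   # exclude x (and nothing stronger)
    return best_val, best_x
```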

Page 57: Unifying Local and Exhaustive Search

Nogood-Based Search Algorithm

• To process the nogood set N:
  – Infer new nogoods from existing ones.
    • Delete (redundant) nogoods if desired.
  – Goal: make it easy to find a feasible solution of N.

Page 58: Unifying Local and Exhaustive Search

Nogood-Based Search Algorithm

• To process the nogood set N:
  – Infer new nogoods from existing ones.
    • Delete (redundant) nogoods if desired.
  – Goal: make it easy to find a feasible solution of N.
• Exhaustive vs. heuristic algorithm:
  – In an exhaustive search, complete = infeasible.
  – In a heuristic algorithm, complete = large enough.

Page 59: Unifying Local and Exhaustive Search

Exhaustive search: Benders decomposition

Problem: minimize f(x) + cy subject to g(x) + Ay ≥ b.

[Flowchart:]
  – Start with N = {v ≥ −∞}.
  – Let (v*, x*) minimize v subject to N (master problem).
  – If the master problem is infeasible, stop: (x*, y*) is the solution.
  – Else let y* minimize cy subject to Ay ≥ b − g(x*) (subproblem).
  – Add nogoods v ≥ f(x) + u(b − g(x)) (Benders cut) and v < f(x*) + cy* to N, where u is the dual solution of the subproblem. Return to the master problem.

N = master problem constraints. relax(P) contains no constraints. Nogoods are Benders cuts; they are not processed. N is complete when infeasible.

Page 60: Unifying Local and Exhaustive Search

Exhaustive search: Benders decomposition (continued)

Select the optimal solution of N. Formally, the selected solution is (x*, y), where y is arbitrary.

[Same flowchart and notes as the previous slide.]

Page 61: Unifying Local and Exhaustive Search

Exhaustive search: Benders decomposition (continued)

The subproblem generates the nogoods (Benders cuts).

[Same flowchart and notes as the previous slide.]

Page 62: Unifying Local and Exhaustive Search

Algorithm: Benders decomposition (exhaustive)
  – Restriction P: same as original problem.
  – relax(P): no constraints.
  – Nogoods: Benders cuts, obtained by solving the subproblem.
  – Nogood processing: none.
  – Solution of relax(P) ∪ N: optimal solution of master problem + arbitrary values for subproblem variables.

Algorithm: Tabu search (inexhaustive) — entries filled in on a later slide.
Algorithm: Partial-order dynamic backtracking for TSPTW (exhaustive) — entries filled in on a later slide.
Algorithm: Partial-order dynamic backtracking for TSPTW (inexhaustive) — entries filled in on a later slide.

Page 63: Unifying Local and Exhaustive Search

Nogood-Based Search Algorithm

• Other forms of exhaustive nogood-based search:
  – Davis-Putnam-Loveland method with clause learning (for the propositional satisfiability problem).
  – Partial-order dynamic backtracking.

Page 64: Unifying Local and Exhaustive Search

Heuristic algorithm: Tabu search

[Flowchart:]
  – Start with N = ∅.
  – Let the feasible set of P be the neighborhood of x*.
  – Let x* be the best solution of P ∪ N.
  – If N is “complete,” stop.
  – Else add the nogood x ≠ x* to N, process N by removing old nogoods, and repeat.

The nogood set N is the tabu list. In each iteration, search the neighborhood of the current solution for the best solution not on the tabu list. N is “complete” when one has searched long enough.
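The flowchart above maps directly onto code: the tabu list is a bounded nogood set, each step takes the best neighbor not excluded by it, and old nogoods drop off as the list overflows. A generic sketch; the neighborhood and objective are caller-supplied placeholders.

```python
from collections import deque

def tabu_search(x0, neighbors, cost, tenure=5, iters=50):
    """Tabu search as nogood-based search: N is a rolling set of
    recently visited solutions (nogoods x != x*)."""
    tabu = deque(maxlen=tenure)        # nogood set N; oldest entries drop off
    x = best = x0
    for _ in range(iters):
        allowed = [y for y in neighbors(x) if y not in tabu]
        if not allowed:
            break                      # every neighbor is excluded by N
        x = min(allowed, key=cost)     # best solution of P union N
        tabu.append(x)                 # add nogood excluding x
        if cost(x) < cost(best):
            best = x                   # remember incumbent
    return best
```

Note that the move to `min(allowed, key=cost)` is accepted even when it is worse than the current solution; the tabu list is what prevents cycling straight back.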

Page 65: Unifying Local and Exhaustive Search

Heuristic algorithm: Tabu search (continued)

The neighborhood of the current solution x* is feas(relax(P)).

[Same flowchart and notes as the previous slide.]

Page 66: Unifying Local and Exhaustive Search

Heuristic algorithm: Tabu search (continued)

Solve P ∪ N by searching the neighborhood. Remove old nogoods from the tabu list. The neighborhood of the current solution x* is feas(relax(P)).

[Same flowchart and notes as the previous slide.]

Page 67: Unifying Local and Exhaustive Search

Algorithm: Benders decomposition (exhaustive)
  – Restriction P: same as original problem.
  – relax(P): no constraints.
  – Nogoods: Benders cuts, obtained by solving the subproblem.
  – Nogood processing: none.
  – Solution of relax(P) ∪ N: optimal solution of master problem + arbitrary values for subproblem variables.

Algorithm: Tabu search (inexhaustive)
  – Restriction P: created by defining neighborhood of previous solution.
  – relax(P): same as P.
  – Nogoods: tabu list.
  – Nogood processing: none.
  – Solution of relax(P) ∪ N: find the best solution in the neighborhood not on the tabu list.

Algorithm: Partial-order dynamic backtracking for TSPTW (exhaustive) — entries filled in on a later slide.
Algorithm: Partial-order dynamic backtracking for TSPTW (inexhaustive) — entries filled in on a later slide.

Page 68: Unifying Local and Exhaustive Search

An Example: TSP with Time Windows

[Network diagram repeated from earlier: five cities with A the home base and customers B, C, D, E; edges labeled with travel times (5, 8, 7, 5, 7, 4, 3, 5, 6, 6) and customer time windows [20,35], [15,25], [10,30], [25,35].]

Page 69: Unifying Local and Exhaustive Search

Exhaustive nogood-based search

Iter. | Nogoods N | Sol. of relax(P) ∪ N | Value | New nogoods
0     | —         | ADCBEA               | 36    | ADCB
1     | ADCB      |                      |       |

The new nogood ADCB excludes the current solution by excluding any solution that begins ADCB.

In this problem, P is the original problem, and relax(P) has no constraints, so relax(P) ∪ N = N. This is a special case of partial-order dynamic backtracking.

Page 70: Unifying Local and Exhaustive Search

Exhaustive nogood-based search

Iter. | Nogoods N | Sol. of relax(P) ∪ N | Value | New nogoods
0     | —         | ADCBEA               | 36    | ADCB
1     | ADCB      | ADCEBA               | 34    | ADCE
2     | ADC       |                      |       |

The current nogoods ADCB, ADCE rule out any solution beginning ADC. So process the nogood set by replacing ADCB, ADCE with their parallel resolvent ADC. This makes it possible to solve the nogood set with a greedy algorithm.

Greedy solution of current nogood set: go to the closest customer consistent with the nogoods.
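The parallel-resolution step just described can be sketched for prefix-style nogoods: whenever every possible next stop after some prefix is excluded, the child nogoods collapse into the prefix itself. Customers are single characters, as in the trace; this is an illustrative sketch, not the talk's implementation.

```python
def parallel_resolve(nogoods, customers):
    """Repeatedly replace {prefix + c : every feasible next stop c}
    with the prefix itself (parallel resolution on prefix nogoods)."""
    nogoods = set(nogoods)
    changed = True
    while changed:
        changed = False
        for p in {g[:-1] for g in nogoods if len(g) > 1}:
            nexts = [c for c in customers if c not in p]
            if all(p + c in nogoods for c in nexts):
                nogoods -= {p + c for c in nexts}   # drop the children
                nogoods.add(p)                      # keep the resolvent
                changed = True
                break
    return nogoods
```

For example, with customers A–E and routes starting at A, the only customers that can follow the prefix ADC are B and E, so the nogoods ADCB and ADCE resolve to ADC.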

Page 71: Unifying Local and Exhaustive Search

Exhaustive nogood-based search

Iter. | Nogoods N | Sol. of relax(P) ∪ N | Value      | New nogoods
0     | —         | ADCBEA               | 36         | ADCB
1     | ADCB      | ADCEBA               | 34         | ADCE
2     | ADC       | ADBEAC               | infeasible | ADB
3     | ADB, ADC  |                      |            |

Not only is ADBEAC infeasible, but we observe that no solution beginning ADB can be completed within the time windows.

Page 72: Unifying Local and Exhaustive Search

Exhaustive nogood-based search

Iter. | Nogoods N | Sol. of relax(P) ∪ N | Value      | New nogoods
0     | —         | ADCBEA               | 36         | ADCB
1     | ADCB      | ADCEBA               | 34         | ADCE
2     | ADC       | ADBEAC               | infeasible | ADB
3     | ADB, ADC  | ADEBCA               | infeasible | ADE
4     | AD        |                      |            |

Process the nogoods ADB, ADC, ADE to obtain the parallel resolvent AD.

Page 73: Unifying Local and Exhaustive Search

Exhaustive nogood-based search

Iter. | Nogoods N            | Sol. of relax(P) ∪ N | Value      | New nogoods
0     | —                    | ADCBEA               | 36         | ADCB
1     | ADCB                 | ADCEBA               | 34         | ADCE
2     | ADC                  | ADBEAC               | infeasible | ADB
3     | ADB, ADC             | ADEBCA               | infeasible | ADE
4     | AD                   | ACDBEA               | 38         | ACDB
5     | ACDB, AD             | ACDEBA               | 36         | ACDE
6     | ACD, AD              | ACBEDA               | infeasible | ACBE
7     | ACBE, ACD, AD        | ACBDEA               | 40         | ACBD
8     | ACB, ACD, AD         | ACEBDA               | infeasible | ACEB
9     | ACB, ACEB, ACD, AD   | ACEDBA               | 40         | ACED
10    | AC, AD               | ABDECA               | infeasible | ABD, ABC
11    | ABC, ABD, AC, AD     | ABEDCA               | infeasible | ABE
12    | AB, AC, AD           | AEBCDA               | infeasible | AEB, AEC
13    | AB, AC, AD, AEB, AEC | AEDBCA               | infeasible | AED
14    | A                    |                      |            |

At the end of the search, the processed nogood set rules out all solutions (i.e., is infeasible). Optimal solution: ADCEBA, value = 34.

Page 74: Unifying Local and Exhaustive Search

Algorithm: Benders decomposition (exhaustive)
  – Restriction P: same as original problem.
  – relax(P): no constraints.
  – Nogoods: Benders cuts, obtained by solving the subproblem.
  – Nogood processing: none.
  – Solution of relax(P) ∪ N: optimal solution of master problem + arbitrary values for subproblem variables.

Algorithm: Tabu search (inexhaustive)
  – Restriction P: created by defining neighborhood of previous solution.
  – relax(P): same as P.
  – Nogoods: tabu list.
  – Nogood processing: delete old nogoods.
  – Solution of relax(P) ∪ N: find the best solution in the neighborhood not on the tabu list.

Algorithm: Partial-order dynamic backtracking for TSPTW (exhaustive)
  – Restriction P: same as original problem.
  – relax(P): no constraints.
  – Nogoods: rule out last sequence tried.
  – Nogood processing: parallel resolution.
  – Solution of relax(P) ∪ N: greedy solution.

Algorithm: Partial-order dynamic backtracking for TSPTW (inexhaustive) — entries filled in on a later slide.

Page 75: Unifying Local and Exhaustive Search

Inexhaustive nogood-based search

Start as before. Remove old nogoods from the nogood set, so the method is inexhaustive. Generate stronger nogoods by ruling out subsequences other than those starting with A. This requires more intensive processing (full resolution), which is possible because the nogood set is small.

Iter. | Nogoods N | Sol. of relax(P) ∪ N | Value | New nogoods
0     | —         | ADCBEA               | 36    | ADCB
1     | ADCB      | ADCEBA               | 34    | ADCE
2     | ADC       |                      |       |

Page 76: Unifying Local and Exhaustive Search

Inexhaustive nogood-based search

Iter. | Nogoods N       | Sol. of relax(P) ∪ N | Value      | New nogoods
0     | —               | ADCBEA               | 36         | ADCB
1     | ADCB            | ADCEBA               | 34         | ADCE
2     | ADC             | ADBECA               | infeasible | ADB, EC
3     | ABEC, ADB, ADC  |                      |            |

Process the nogood set: list all subsequences beginning with A that are ruled out by the current nogoods. This requires a full resolution algorithm.

Page 77: Unifying Local and Exhaustive Search

Inexhaustive nogood-based search

Iter. | Nogoods N            | Sol. of relax(P) ∪ N | Value      | New nogoods
0     | —                    | ADCBEA               | 36         | ADCB
1     | ADCB                 | ADCEBA               | 34         | ADCE
2     | ADC                  | ADBECA               | infeasible | ADB, EC
3     | ABEC, ADB, ADC       | ADEBCA               | infeasible | ADE, EB
4     | ABEC, ACEB, AD, AEB  | ACDBEA               | 38         | ACDB

Continue in this fashion, but start dropping old nogoods. Adjust the length of the nogood list to avoid cycling, as in tabu search. The stopping point is arbitrary.

So exhaustive nogood-based search suggests a more sophisticated variation of tabu search.

Page 78: Unifying Local and Exhaustive Search

Algorithm: Benders decomposition (exhaustive)
  – Restriction P: same as original problem.
  – relax(P): no constraints.
  – Nogoods: Benders cuts, obtained by solving the subproblem.
  – Nogood processing: none.
  – Solution of relax(P) ∪ N: optimal solution of master problem + arbitrary values for subproblem variables.

Algorithm: Tabu search (inexhaustive)
  – Restriction P: created by defining neighborhood of previous solution.
  – relax(P): same as P.
  – Nogoods: tabu list.
  – Nogood processing: delete old nogoods.
  – Solution of relax(P) ∪ N: find the best solution in the neighborhood not on the tabu list.

Algorithm: Partial-order dynamic backtracking for TSPTW (exhaustive)
  – Restriction P: same as original problem.
  – relax(P): no constraints.
  – Nogoods: rule out last sequence tried.
  – Nogood processing: parallel resolution.
  – Solution of relax(P) ∪ N: greedy solution.

Algorithm: Partial-order dynamic backtracking for TSPTW (inexhaustive)
  – Restriction P: same as original problem.
  – relax(P): no constraints.
  – Nogoods: rule out last sequence tried and infeasible subsequences.
  – Nogood processing: full resolution; delete old nogoods.
  – Solution of relax(P) ∪ N: greedy solution.