Page 1:

Linear Programming

Jeff Edmonds, York University, COSC 3101, Lecture 5

• Def and Hot Dog Example
• Network Flow Defn
• Matrix View of Linear Programming
• Hill Climbing Simplex Method
• Dual Solution Witness to Optimality
• Define Dual Problem
• Buy Fruit or Sell Vitamins Duality
• Primal-Dual Hill Climbing

Page 2:

Linear Programming

• Linear Program: An optimization problem whose constraints and cost function are linear functions.
• Goal: Find a solution which optimizes the cost.

E.g. Maximize Cost Function: 21x1 - 6x2 - 100x3 - 100x4

Constraint Functions:
5x1 + 2x2 + 31x3 - 20x4 ≤ 21
1x1 - 4x2 + 3x3 + 10x4 ≥ 56
6x1 + 60x2 - 31x3 - 15x4 ≤ 200
…

Applied in various industrial fields: Manufacturing, Supply Chain, Logistics, Marketing…

To save money and increase profits!

Page 3:

3

A Hotdog

A combination of pork, grain, and sawdust, …

Page 4:

Constraints:
• Amount of moisture
• Amount of protein
• …

Page 5:

5

The Hotdog Problem

Given today's prices, what is a fast algorithm to find the cheapest hotdog?

Page 6:

Abstract Out Essential Details

Ingredients: pork, grain, water, sawdust
Amount to add: x1, x2, x3, x4
Cost per unit: 29, 8, 1, 2

Constraints (moisture, protein, …):
3x1 + 4x2 - 7x3 + 8x4 ≥ 12
2x1 - 8x2 + 4x3 - 3x4 ≥ 24
-8x1 + 2x2 - 3x3 - 9x4 ≥ 8
x1 + 2x2 + 9x3 - 3x4 ≥ 31

Cost of Hotdog: 29x1 + 8x2 + 1x3 + 2x4

Page 7:

Abstract Out Essential Details

Minimize: 29x1 + 8x2 + 1x3 + 2x4

Subject to:
3x1 + 4x2 - 7x3 + 8x4 ≥ 12
2x1 - 8x2 + 4x3 - 3x4 ≥ 24
-8x1 + 2x2 - 3x3 - 9x4 ≥ 8
x1 + 2x2 + 9x3 - 3x4 ≥ 31
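
As a concrete illustration (not part of the lecture), an LP in this form can be handed directly to an off-the-shelf solver. The sketch below assumes SciPy is available; since scipy.optimize.linprog minimizes subject to "≤" rows, each "≥" constraint is negated. The coefficients are the slide's made-up ones, so the solver may simply report that no feasible hotdog exists.

```python
# Sketch: the hotdog LP above, fed to SciPy's linprog.
from scipy.optimize import linprog

c = [29, 8, 1, 2]                      # cost of pork, grain, water, sawdust
A_ge = [[ 3,  4, -7,  8],              # moisture, protein, ... requirements (>= rows)
        [ 2, -8,  4, -3],
        [-8,  2, -3, -9],
        [ 1,  2,  9, -3]]
b_ge = [12, 24, 8, 31]

A_ub = [[-a for a in row] for row in A_ge]   # negate to turn ">=" into "<="
b_ub = [-b for b in b_ge]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print(res.status, res.message)         # report whatever the solver concludes
print(res.x, res.fun)                  # the cheapest hotdog recipe, if one exists
```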

Page 8:

Network Flow as a Linear Program

• Given an instance of Network Flow, <G, c<u,v>>, express it as a Linear Program:
• The variables: flows F<u,v> for each edge <u,v>.
• Maximize: rate(F) = Σu F<u,t> - Σv F<t,v>
• Subject to:
  ∀<u,v>: F<u,v> ≤ c<u,v>   (flow can't exceed capacity)
  ∀v: Σu F<u,v> = Σw F<v,w>   (flow in = flow out)
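
To make the translation concrete, here is a small sketch (the five-edge graph is my own toy example, not from the lecture) that builds exactly this LP and hands it to SciPy: one variable per edge, capacity bounds, conservation equalities at the internal nodes, and the negated rate as the objective.

```python
# Sketch: max flow written as a linear program and solved with linprog.
from scipy.optimize import linprog

edges = [('s', 'a'), ('s', 'b'), ('a', 'b'), ('a', 't'), ('b', 't')]
cap   = [3, 2, 1, 2, 3]

# Objective: maximize the flow into t (no edges leave t in this toy graph),
# i.e. minimize its negation.
c = [-1 if v == 't' else 0 for (u, v) in edges]

# Capacity and non-negativity: 0 <= F_e <= cap_e.
bounds = [(0, cap_e) for cap_e in cap]

# Conservation at every internal node (everything except s and t).
internal = ['a', 'b']
A_eq = [[(1 if v == w else 0) - (1 if u == w else 0) for (u, v) in edges]
        for w in internal]
b_eq = [0] * len(internal)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("max flow rate:", -res.fun)          # 5 for this toy graph
print(dict(zip(edges, res.x)))             # an optimal flow on each edge
```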

Page 9:

Linear Programming: Linear Program

Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

[Figure: matrix view of the program: minimize the row vector C dotted with the column vector X, subject to the matrix M times X being at least the column vector N.]

• n variables Xj that we are looking for values of.
• An optimization function:
  – Each variable Xj has a coefficient Cj.
  – The dot product CᵀX gives one value to minimize.
• m constraints:
  – Some linear combination of the variables must be at least some set value: ∀i, Σj Mi,j Xj ≥ Ni.
• Generally it is implied that the variables are non-negative: Xj ≥ 0.
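
In this matrix view, pricing a candidate solution and checking its validity is just matrix arithmetic. A minimal sketch (reusing the hotdog numbers; the candidate X is arbitrary and chosen only to illustrate the check):

```python
# Sketch: the matrix view in code -- cost is C^T X, validity is M X >= N and X >= 0.
import numpy as np

C = np.array([29, 8, 1, 2])
M = np.array([[ 3,  4, -7,  8],
              [ 2, -8,  4, -3],
              [-8,  2, -3, -9],
              [ 1,  2,  9, -3]])
N = np.array([12, 24, 8, 31])

X = np.array([1.0, 2.0, 3.0, 0.0])               # some candidate solution
cost  = C @ X                                    # the value to be minimized
valid = bool(np.all(M @ X >= N) and np.all(X >= 0))
print(cost, valid)
```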

Page 10:

Linear Programming: Linear Program

Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

• These are linear programs in "standard" form:
  – Minimize CᵀX (push the value down), subject to MX ≥ N, X ≥ 0.
  – Maximize CᵀX (push the value up), subject to MX ≤ N, X ≥ 0.
• But you could mix and match ≥, ≤, and = constraints.

[Figure: the matrix view shown twice, once for the minimization form and once for the maximization form.]

Page 11:

Simplex Algorithm
• Invented by George Dantzig in 1947
• A hill climbing algorithm

[Figure: a hilly curve with a local max and the global max marked.]

Page 12:

Simplex Algorithm
• Invented by George Dantzig in 1947
• A hill climbing algorithm
• Guaranteed to find a global optimal solution for Linear Programs

[Figure: the same curve, with the global max marked.]

Page 13:

Simplex Algorithm
• Computes a solution to a Linear Program by evaluating vertices where constraints intersect each other.
• Worst case exponential time.
• Practically very fast.
• Ellipsoid algorithm (1979): the first polynomial-time, O(n⁴L), algorithm.

Page 14:

Simplex Algorithm

[Figure: the x1-x2 plane, with constraint lines C1 and C2 and the origin (0,0) marked.]

Minimize: Cost = 5x1 + 7x2

Constraint Functions:
C1: 2x1 + 4x2 ≥ 100
C2: 3x1 + 3x2 ≥ 90
x1, x2 ≥ 0

• With n variables, x1, x2, …, xn, there are n dimensions. (Here n = 2.)
• Each constraint is an (n-1)-dimensional plane. (Here a 1-dim line.)
• Each point in this space is a solution.
• It is a valid solution if it is on the correct side of each constraint plane.

Page 15:

Simplex Algorithm

[Figure: the same plane, with level lines of the cost function drawn.]

Cost Function: Cost = 5x1 + 7x2

Constraint Functions:
C1: 2x1 + 4x2 ≥ 100
C2: 3x1 + 3x2 ≥ 90
x1, x2 ≥ 0

• Solutions on this line all have one value of the objective function.
• This line has another value.
• These points are not valid.
• This point is the optimal value.

Page 16:

Simplex Algorithm

[Figure: the same plane, with an arrow showing the direction in which the objective function increases.]

Cost Function: Cost = 5x1 + 7x2

Constraint Functions:
C1: 2x1 + 4x2 ≥ 100
C2: 3x1 + 3x2 ≥ 90
x1, x2 ≥ 0

• The arrow tells the direction in which the objective function increases.
• Note that the solution is a vertex (simplex).
• Each simplex is the intersection of n constraints. (Here n = # of variables = 2.)
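
A minimal sketch (mine, not the lecture's) of what these slides describe for the two-variable example: every candidate vertex is the intersection of two tight constraints, and the optimum is the best vertex that also satisfies all the remaining constraints.

```python
# Sketch: enumerate the vertices of the 2-variable example and evaluate the cost at each
# feasible one; the best feasible vertex is the LP optimum.
import itertools
import numpy as np

# All constraints in the form a_i . (x1, x2) >= b_i, including x1 >= 0 and x2 >= 0.
A = np.array([[2.0, 4.0],    # C1: 2x1 + 4x2 >= 100
              [3.0, 3.0],    # C2: 3x1 + 3x2 >= 90
              [1.0, 0.0],    # x1 >= 0
              [0.0, 1.0]])   # x2 >= 0
b = np.array([100.0, 90.0, 0.0, 0.0])
C = np.array([5.0, 7.0])     # cost = 5x1 + 7x2, to be minimized

best = None
for i, j in itertools.combinations(range(len(A)), 2):
    try:
        v = np.linalg.solve(A[[i, j]], b[[i, j]])   # make constraints i and j tight
    except np.linalg.LinAlgError:
        continue                                    # parallel constraints: no vertex
    if np.all(A @ v >= b - 1e-9):                   # keep only valid (feasible) vertices
        cost = C @ v
        if best is None or cost < best[0]:
            best = (cost, v)

print(best)    # best cost 190 at the vertex (10, 20), where C1 and C2 intersect
```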

Page 17:

Simplex Algorithm

• The arrow tells the direction in which the objective function increases.
• Note that the solution is a vertex (simplex).
• Each simplex is the intersection of n constraints. (Here n = # of variables = 2.)
• The simplex method takes hill-climbing steps, from one simplex (valid solution) to another.

Page 18:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• With n variables, x1, x2, x3, …, xn, there are n dimensions. (Here n = 3.)
• Each constraint is an (n-1)-dimensional plane. (Here a 2-dim triangle.)
• Each simplex (vertex) is the intersection of n such constraints. (Here it looks like 6, but generally only n = 3.)

A simplex is specified by a subset of n of the m tight (=) constraints.
All other constraints must be satisfied.

Page 19:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• If we slacken one of our n tight constraints, our solution slides along a 1-dim edge.
• Head in the direction that increases the potential function.

A simplex is specified by a subset of n of the m tight (=) constraints.

Page 20:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• If we slacken one of our n tight constraints, our solution slides along a 1-dim edge.
• Head in the direction that increases the potential function.
• Keep sliding until we tighten some constraint.
• This is one hill climbing step.

A simplex is specified by a subset of n of the m tight (=) constraints.

Page 21:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• If we slacken one of our n tight constraints, our solution slides along a 1-dim edge.
• Head in the direction that increases the potential function.
• Keep sliding until we tighten some constraint.
• This is one hill climbing step.

A simplex is specified by a subset of n of the m tight (=) constraints.

Page 22:

Simplex Algorithm

• Do all Linear Programs have optimal solutions? No!
• Three types of Linear Programs:
  1. Has an optimal solution with a finite cost value: e.g. the nutrition problem.
  2. Unbounded: e.g. maximize x, subject to x ≥ 5, x ≥ 0.
  3. Infeasible: e.g. maximize x, subject to x ≤ 3, x ≥ 5, x ≥ 0.
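
A short sketch (not from the lecture) showing the second and third outcomes with SciPy; linprog minimizes, so "maximize x" becomes "minimize -x", and "x ≥ 5" becomes "-x ≤ -5".

```python
# Sketch: an unbounded LP and an infeasible LP, as reported by the solver.
from scipy.optimize import linprog

# 2. Unbounded: maximize x subject to x >= 5, x >= 0.
res = linprog([-1], A_ub=[[-1]], b_ub=[-5], bounds=[(0, None)])
print(res.message)    # the solver reports that the problem is unbounded

# 3. Infeasible: maximize x subject to x <= 3, x >= 5, x >= 0.
res = linprog([-1], A_ub=[[1], [-1]], b_ub=[3, -5], bounds=[(0, None)])
print(res.message)    # the solver reports that the problem is infeasible
```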

Page 23:

23

Primal

Dual

Page 24:

Hill Climbing

We have a valid solution (not necessarily optimal).
Make small local changes to your solution to construct a slightly better solution.
Take a step that goes up.
Measure progress: the value of our solution.
Initially we have the "zero" solution.

Problems:
• Exit: can't take a step that goes up. Can our Network Flow Algorithm get stuck in a local maximum?
• Running time? If you take small steps, it could be exponential time.

[Figure: a hilly curve with a local max and the global max marked.]

Page 25:

Hill Climbing

Avoiding getting stuck in a local maximum:
• Make better choices of direction (hard).
• Back up and retry (exponential time).
• Define a bigger step.

[Figure: a good execution vs. a bad execution of hill climbing.]

Page 26:

Network Flow

Can our Simplex Algorithm get stuck in a local max? No! How?

Need to prove: for every linear program, for every choice of steps, an optimal solution is found!

Page 27:

Primal-Dual Hill Climbing

The Mars settlement has a hilly landscape and many layers of roofs.

Page 28:

Primal-Dual Hill Climbing

Primal problem:
• Exponential # of locations to stand.
• Find a highest one.

Dual problem:
• Exponential # of roofs.
• Find a lowest one.

Page 29:

Primal-Dual Hill Climbing

Prove: Every roof is above every location to stand.
∀R ∀L, height(R) ≥ height(L)
⇒ height(Rmin) ≥ height(Lmax)

Is there a gap?

Page 30:

Primal-Dual Hill Climbing

Prove: For every location to stand, either
• the alg takes a step up, or
• the alg gives a reason that explains why not, by giving a ceiling of equal height.
i.e. ∀L [ ∃L’ height(L’) > height(L) or ∃R height(R) = height(L) ]

But ∀R ∀L, height(R) ≥ height(L)
⇒ No Gap

Page 31:

Primal-Dual Hill Climbing

Prove: For every location to stand, either
• the alg takes a step up, or
• the alg gives a reason that explains why not, by giving a ceiling of equal height.
i.e. ∀L [ ∃L’ height(L’) > height(L) or ∃R height(R) = height(L) ]

Can't go up from this location and no matching ceiling? Can't happen!

Page 32:

Primal-Dual Hill Climbing

Prove: For every location to stand, either
• the alg takes a step up, or
• the alg gives a reason that explains why not, by giving a ceiling of equal height.
i.e. ∀L [ ∃L’ height(L’) > height(L) or ∃R height(R) = height(L) ]

No local maximum!

Page 33:

Primal-Dual Hill Climbing

Claim: Primal and dual have the same optimal value: height(Rmin) = height(Lmax).
Proved: ∀R ∀L, height(R) ≥ height(L).
Proved: The alg runs until it provides Lalg and Ralg with height(Ralg) = height(Lalg).

height(Rmin) ≤ height(Ralg) = height(Lalg) ≤ height(Lmax)
height(Rmin) ≥ height(Lmax)
⇒ No Gap

Lalg is a witness that height(Lmax) is no smaller. Ralg is a witness that height(Lmax) is no bigger.

Exit

Page 34:

Duality

Linear Program:
Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

• n variables Xj that we are looking for values of.
• An optimization function:
  – Each variable Xj has a coefficient Cj.
  – The dot product CᵀX gives one value to minimize.
• m constraints:
  – Some linear combination of the variables must be at least some set value: ∀i, Σj Mi,j Xj ≥ Ni.
• Generally it is implied that the variables are non-negative: Xj ≥ 0.

Page 35:

Duality

Linear Program:
Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

• These are linear programs in "standard" form:
  – Minimize CᵀX (push the value down), subject to MX ≥ N, X ≥ 0.
  – Maximize CᵀX (push the value up), subject to MX ≤ N, X ≥ 0.
• But you could mix and match ≥, ≤, and = constraints.

Page 36:

Duality

Primal Linear Program:
Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

Dual Linear Program:
For every primal linear program, we define its dual linear program. Everything is turned upside down.
• The matrix of coefficients is transposed: Mᵀj,i.
• For each constraint, a variable Yi.
  – Form the objective function vector from the constraint vector N.
  – Generally, the constraint is ‘≥’ and then the variable is Yi ≥ 0.

Page 37:

Duality

Primal Linear Program:
Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

Dual Linear Program:
For every primal linear program, we define its dual linear program. Everything is turned upside down.
• The matrix of coefficients is transposed: Mᵀj,i.
• For each constraint, a variable Yi.
  – Form the constraint vector from the objective function vector.
  – Generally, the constraint is ‘≥’ and then the variable is Yi ≥ 0.
  – But if it is ‘=’, then the variable Yi is unconstrained.

Page 38:

Duality

Primal Linear Program:
Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

Dual Linear Program:
Maximize: NᵀY
Subject to: MᵀY ≤ C, Y ≥ 0

For every primal linear program, we define its dual linear program. Everything is turned upside down.
• The matrix of coefficients is transposed: Mᵀj,i.
• For each constraint, a variable.
• For each variable, a constraint.
  – Form the constraint vector from the objective function vector.
• Max ↔ Min and ≥ ↔ ≤.

Page 39:

Duality

Primal Linear Program:
Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

Dual Linear Program:
Maximize: NᵀY
Subject to: MᵀY ≤ C, Y ≥ 0

For every primal linear program, we define its dual linear program. Everything is turned upside down.
• Max Flow ↔ Min Cut
• Buyer of nutrients in fruit ↔ Seller of nutrients in vitamins

Page 40:

Duality

Primal Linear Program:
Minimize: CᵀX
Subject to: MX ≥ N, X ≥ 0

Dual Linear Program:
Maximize: NᵀY
Subject to: MᵀY ≤ C, Y ≥ 0

For every primal linear program, we define its dual linear program.

Every solution X of the primal is above every solution Y of the dual:
∀X ∀Y, CᵀX ≥ NᵀY, hence CᵀXmin ≥ NᵀYmax. We will prove equality.

The dual of the dual is the primal itself!

Page 41:

The Nutrition Problem
• Each fruit contains different nutrients.
• Each fruit has a different cost.

An apple a day keeps the doctor away – but apples are costly!

A customer's goal is to fulfill daily nutrition requirements at lowest cost.

Page 42:

The Nutrition Problem (cont'd)

• Let's take a simpler case of just apples and bananas.
• Must take at least 100 units of Calories & 90 units of Vitamins for good nutrition.
• A customer's goal is to buy fruits in such a quantity that it minimizes cost but fulfills nutrition.

              Calories   Vitamins   Cost ($)
Fruit 1 (x1)      2          3          5
Fruit 2 (x2)      4          3          7

Cost Function: Cost = 5x1 + 7x2

Constraint Functions:
C1: 2x1 + 4x2 ≥ 100
C2: 3x1 + 3x2 ≥ 90
x1, x2 ≥ 0

Page 43:

The Nutrition Problem (cont'd)

• Matrix Representation

Constraints:
2x1 + 4x2 ≥ 100
3x1 + 3x2 ≥ 90
Non-negativity: x1, x2 ≥ 0
Cost function = 5x1 + 7x2

Real-life problems may have many variables and constraints!

[Figure: the same data in matrix form: minimize C·X subject to M·X ≥ N, X ≥ 0.]
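
For reference, a minimal sketch (assuming SciPy; not the lecture's code) that solves this nutrition LP; the "≥" rows are negated because linprog expects "≤" constraints.

```python
# Sketch: the nutrition LP, solved numerically.
from scipy.optimize import linprog

C = [5, 7]                               # cost of fruit 1 and fruit 2
A_ub = [[-2, -4],                        # -(2x1 + 4x2) <= -100   (Calories >= 100)
        [-3, -3]]                        # -(3x1 + 3x2) <= -90    (Vitamins >= 90)
b_ub = [-100, -90]

res = linprog(C, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)                    # x = (10, 20), cost = 190
```

This is the same vertex and value (x = 10, x2 = 20, cost 190) that the simplex walkthrough at the end of the deck reaches by pivoting.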

Page 44:

Semantics of Duality

A customer's goal is to buy fruits in such a quantity that it minimizes cost but fulfills nutrition.

Primal LP: minimize C·x subject to Q·x ≥ N, x ≥ 0
• x: the quantity of each fruit.
• C: the cost of each fruit.
• N: the daily nutrition requirements.
• The coefficients in each column of Q represent the amount of nutrients in a particular food.

Page 45:

Semantics of Duality

Dual LP: maximize Nᵀ·y subject to Qᵀ·y ≤ Cᵀ, y ≥ 0
• N: the daily nutrition requirements.
• C: the cost of each fruit.
• The coefficients in each row of Qᵀ represent the amount of nutrients in a particular fruit.

But what are the Yi's in the dual? The price of each nutrient!

Imagine a salesman trying to sell supplements for each fruit.

Page 46:

Semantics of Duality

• Primal Problem: A customer's goal is to buy fruits in such a quantity that it minimizes cost but fulfills nutrition.
• Dual Problem: A salesman's goal is to set a price on each nutrient, so that it maximizes profit but his supplements are cheaper than the fruits. (Otherwise who will buy them?!)

Primal (Customer)   Dual (Salesman)
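
A companion sketch to the nutrition solve above (again mine, assuming SciPy): build the salesman's dual from the same numbers and check that its best revenue equals the customer's best cost, i.e. that there is no gap.

```python
# Sketch: the dual of the nutrition LP, priced per nutrient.
# The dual maximizes N^T y subject to Q^T y <= C, y >= 0; linprog minimizes,
# so the objective is negated.
from scipy.optimize import linprog

N = [100, 90]                    # daily Calories and Vitamins required
C = [5, 7]                       # price of each fruit
Q_T = [[2, 3],                   # nutrient content of fruit 1 (its supplement must cost <= 5)
       [4, 3]]                   # nutrient content of fruit 2 (its supplement must cost <= 7)

res = linprog([-n for n in N], A_ub=Q_T, b_ub=C, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)           # y = (1, 1), revenue = 190,
                                 # matching the primal's optimal cost of 190: no gap
```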

Page 47:

Primal-Dual Hill Climbing

Prove: For every solution x of the primal, either
• the alg takes a step up to a better primal, or
• the alg gives a reason that explains why not, by giving a solution y of the dual of equal value.
i.e. ∀x [ ∃x’ Cᵀx’ > Cᵀx or ∃y Cᵀx = Nᵀy ]

But ∀x ∀y, Cᵀx ≥ Nᵀy
⇒ No Gap

Page 48:

Primal-Dual Hill Climbing

Prove: For every solution x of the primal, either
• the alg takes a step up to a better primal, or
• the alg gives a reason that explains why not, by giving a solution y of the dual of equal value.
i.e. ∀x [ ∃x’ Cᵀx’ > Cᵀx or ∃y Cᵀx = Nᵀy ]

Can't go up from this location and no matching ceiling? Can't happen!

Page 49:

Primal-Dual Hill Climbing

Prove: For every solution x of the primal, either
• the alg takes a step up to a better primal, or
• the alg gives a reason that explains why not, by giving a solution y of the dual of equal value.
i.e. ∀x [ ∃x’ Cᵀx’ > Cᵀx or ∃y Cᵀx = Nᵀy ]

No local maximum!

Page 50:

Primal-Dual Hill Climbing

Claim: Primal and dual have the same optimal value: Cᵀxmin = Nᵀymax.
⇒ No Gap

xalg witnesses that Cᵀxmin is no smaller. yalg witnesses that Cᵀxmin is no bigger.

Exit

Page 51:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• Given any solution x of the primal.

A simplex is specified by a subset of n of the m tight (=) constraints.
All other constraints must be satisfied.

Page 52:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• Given any solution x of the primal.
• If we slacken one of our n tight constraints, our solution slides along a 1-dim edge.
• Head in the direction that increases the potential function.

A simplex is specified by a subset of n of the m tight (=) constraints.

Page 53:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• Given any solution x of the primal.
• If we slacken one of our n tight constraints, our solution slides along a 1-dim edge.
• Head in the direction that increases the potential function.
• Keep sliding until we tighten some constraint, giving a new solution x’.

A simplex is specified by a subset of n of the m tight (=) constraints.

Page 54:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• Given any solution x of the primal.
• If we slacken one of our n tight constraints, our solution slides along a 1-dim edge.
• Head in the direction that increases the potential function.
• Keep sliding until we tighten some constraint, giving a new solution x’.

A simplex is specified by a subset of n of the m tight (=) constraints.

Page 55:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

• Given any solution x of the primal, either
• the alg steps to another solution x’, OR
• none of these “steps” increases the potential function, and the alg gives a reason that explains why not, by giving a solution y of the dual of equal value.

A simplex is specified by a subset of n of the m tight (=) constraints.

Page 56:

Simplex Algorithm

Maximize Cost: 21x1 - 6x2 - 100x3

Constraint Functions:
5x1 + 2x2 + 31x3 ≤ 21
1x1 - 4x2 + 3x3 ≤ 56
6x1 + 60x2 - 31x3 ≤ 200
⁞
-5x1 + 3x2 + 4x3 ≤ 8
⁞
x1, x2, x3 ≥ 0

A simplex is specified by a subset of n of the m tight (=) constraints.

But practically it tends to be fast.
The bread and butter of optimization in industry.

Page 57:

57

Thank You!

Questions?

End

Page 58:

Simplex Algorithm

• Do all Linear Programs have optimal solutions? No!
• Three types of Linear Programs:
  1. Has an optimal solution with a finite cost value: e.g. the nutrition problem.
  2. Unbounded: e.g. maximize x, subject to x ≥ 5, x ≥ 0.
  3. Infeasible: e.g. maximize x, subject to x ≤ 3, x ≥ 5, x ≥ 0.

Page 59:

Simplex

[Figure: the x-y plane, with constraint lines C1 and C2 and the origin (0,0) marked.]

Cost Function: P = 5x + 7y

Constraint Functions:
C1: 2x + 4y ≤ 100
C2: 3x + 3y ≤ 90
Non-Negativity: x, y ≥ 0

Recall we need to evaluate our cost function at the vertices where the constraint functions intersect each other.

Page 60:

Simplex (cont'd)

Our Equations:
P = 5x + 7y
C1: 2x + 4y ≤ 100
C2: 3x + 3y ≤ 90
x, y ≥ 0

We don't want to deal with complex inequalities, so we introduce 2 new variables called slack variables.

Slack Form (the same program re-written):
P = 5x + 7y
s1 = 100 - 2x - 4y
s2 = 90 - 3x - 3y
x, y ≥ 0
s1, s2 ≥ 0

Page 61:

Simplex (cont'd)

Cost Function:
P = 5x + 7y
s1 = 100 - 2x - 4y
s2 = 90 - 3x - 3y
s1, s2, x, y ≥ 0

STEP 1:
• We want an initial point.
• Let's put x = 0, y = 0.

Feasible solution: x = 0, y = 0
P = 0

Page 62:

Simplex (cont'd)

Cost Function:
P = 5x + 7y
s1 = 100 - 2x - 4y
s2 = 90 - 3x - 3y
s1, s2, x, y ≥ 0

STEP 2:
• We want the next point.
• Let's try to increase x.
• x can be increased to a maximum of 30 (s2 becomes zero).
• Rewrite the equations (pivoting):
  x = 30 - y - s2/3
  s1 = 40 + 2/3 s2 - 2y
  P = 150 - 5/3 s2 + 2y
• Now put y, s2 = 0.

Feasible solution: x = 30, y = 0
P = 150

Page 63:

Simplex (cont'd)

Cost Function:
P = 150 - 5/3 s2 + 2y
x = 30 - y - s2/3
s1 = 40 + 2/3 s2 - 2y
s1, s2, x, y ≥ 0

STEP 3:
• We want the next point.
• Let's try to increase y. (We don't increase s2 because that would decrease P.)
• y can be increased to a maximum of 20 (s1 becomes zero).
• Rewrite the equations (pivoting):
  y = 20 + 1/3 s2 - 1/2 s1
  x = 10 + 1/2 s1 - 2/3 s2
  P = 190 - s1 - s2
• Now put s1, s2 = 0.

Feasible solution: x = 10, y = 20
P = 190

Note that we cannot increase s1 or s2 without decreasing P. So we stop!

Is this solution optimal? Or have we run into a local minimum?
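
It is optimal: the final form P = 190 - s1 - s2 shows that no step can go up, and for a linear program such a vertex is a global optimum, not a local trap (the nonnegative coefficients left in that row are exactly a dual witness of equal value). As an end-to-end check, here is a minimal tableau version of the same two pivots, a sketch of mine rather than the lecture's code; it enters x first and then y, exactly as in STEPs 2 and 3, and stops at P = 190 with x = 10, y = 20.

```python
# Sketch: a tiny tableau simplex for the 2-variable example above (columns: x, y, s1, s2, RHS).
import numpy as np

T = np.array([
    [ 2.0,  4.0, 1.0, 0.0, 100.0],   # s1 = 100 - 2x - 4y
    [ 3.0,  3.0, 0.0, 1.0,  90.0],   # s2 =  90 - 3x - 3y
    [-5.0, -7.0, 0.0, 0.0,   0.0],   # objective row:  P - 5x - 7y = 0
])
basis = [2, 3]                        # start at the all-zero vertex: s1, s2 are basic

while True:
    # Entering variable: the first column with a negative objective coefficient
    # (this matches the slides, which increase x first and then y).
    col = next((j for j in range(4) if T[-1, j] < 0), None)
    if col is None:
        break                         # no step increases P: this vertex is optimal
    # Leaving variable: the tightest constraint (minimum ratio test).
    ratios = [T[i, -1] / T[i, col] if T[i, col] > 0 else np.inf for i in range(2)]
    row = int(np.argmin(ratios))
    # Pivot: rewrite the equations so the entering variable becomes basic.
    T[row] /= T[row, col]
    for i in range(3):
        if i != row:
            T[i] -= T[i, col] * T[row]
    basis[row] = col

sol = {basis[i]: T[i, -1] for i in range(2)}
print("P =", T[-1, -1], " x =", sol.get(0, 0.0), " y =", sol.get(1, 0.0))   # P = 190, x = 10, y = 20
```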