Lesson 28 (Sections 18.2–5): Lagrange Multipliers II
Math 20
November 28, 2007

Announcements

- Problem Set 11 assigned today. Due December 5.
- Next office hours: today 1–3 (SC 323)
- Midterm II review: Tuesday 12/4, 7:30–9:00pm in Hall E
- Midterm II: Thursday 12/6, 7–8:30pm in Hall A
Outline

- A homework problem
- Restating the Method of Lagrange Multipliers
  - Statement
  - Justifications
- Second order conditions
  - Compact feasibility sets
  - Ad hoc arguments
  - Analytic conditions
- Example: More than two variables
- More than one constraint
Problem 17.1.10

Problem. Maximize the quantity $f(x, y, z) = Ax^a y^b z^c$ subject to the constraint that $px + qy + rz = m$. (Here $A, a, b, c, p, q, r, m$ are positive constants.)

Solution (by elimination). Solving the constraint for $z$ in terms of $x$ and $y$, we get
$$z = \frac{m - px - qy}{r},$$
so we optimize the unconstrained function
$$f(x, y) = \frac{A}{r^c}\, x^a y^b (m - px - qy)^c.$$
Setting the partial derivatives of this function to zero, and throwing out the critical points where $x = 0$, $y = 0$, or $z = 0$ (these give minimal values of $f$, not maximal), we get
$$(a+c)px + aqy = am$$
$$bpx + (b+c)qy = bm$$
This is a fun exercise in Cramer's Rule:
$$x = \frac{\begin{vmatrix} am & aq \\ bm & (b+c)q \end{vmatrix}}{\begin{vmatrix} (a+c)p & aq \\ bp & (b+c)q \end{vmatrix}} = \frac{amq \begin{vmatrix} 1 & 1 \\ b & b+c \end{vmatrix}}{pq \begin{vmatrix} a+c & a \\ b & b+c \end{vmatrix}} = \frac{amqc}{pq(ac + bc + c^2)} = \frac{m}{p} \cdot \frac{a}{a+b+c}$$
It follows that
$$y = \frac{m}{q} \cdot \frac{b}{a+b+c}, \qquad z = \frac{m}{r} \cdot \frac{c}{a+b+c}.$$
If this is a utility-maximization problem subject to a fixed budget, the portion of the budget spent on each good ($\frac{px}{m}$, for instance) is the relative degree to which that good multiplies utility ($\frac{a}{a+b+c}$).
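To sanity-check this result numerically, here is a short Python sketch. The constants $A = 1$, $a = 1$, $b = 2$, $c = 3$, $p = q = r = 1$, $m = 6$ are made up for illustration; the claimed optimum is then $x = 1$, $y = 2$. The code evaluates $f$ with $z$ eliminated and grid-checks that no feasible point beats it.

```python
# Hypothetical constants (chosen for illustration, not from the problem).
A, a, b, c = 1.0, 1.0, 2.0, 3.0
p, q, r, m = 1.0, 1.0, 1.0, 6.0

def f(x, y):
    """f with z eliminated via the constraint: z = (m - p*x - q*y)/r."""
    z = (m - p * x - q * y) / r
    return A * x**a * y**b * z**c

# Claimed optimum from the Cramer's-rule computation.
s = a + b + c
x_star = (m / p) * (a / s)   # should be 1
y_star = (m / q) * (b / s)   # should be 2
best = f(x_star, y_star)

# A coarse grid search over the feasible region should not beat the optimum.
n = 200
for i in range(1, n):
    for j in range(1, n):
        x, y = 6.0 * i / n, 6.0 * j / n
        if p * x + q * y < m:  # keep z > 0
            assert f(x, y) <= best + 1e-9

print(x_star, y_star, best)
```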
Theorem (The Method of Lagrange Multipliers)

Let $f(x_1, x_2, \dots, x_n)$ and $g(x_1, x_2, \dots, x_n)$ be functions of several variables. The critical points of the function $f$ restricted to the set $g = 0$ are solutions to the equations:
$$\frac{\partial f}{\partial x_i}(x_1, \dots, x_n) = \lambda \frac{\partial g}{\partial x_i}(x_1, \dots, x_n) \quad \text{for each } i = 1, \dots, n,$$
$$g(x_1, \dots, x_n) = 0.$$
Note that this is $n + 1$ equations in the $n + 1$ variables $x_1, \dots, x_n, \lambda$.
Graphical Justification

In two variables, the critical points of $f$ restricted to the level curve $g = 0$ are found where the tangent to the level curve of $f$ is parallel to the tangent to the level curve $g = 0$. These tangents have slopes
$$\left(\frac{dy}{dx}\right)_f = -\frac{f'_x}{f'_y} \quad\text{and}\quad \left(\frac{dy}{dx}\right)_g = -\frac{g'_x}{g'_y}.$$
So they are equal when
$$\frac{f'_x}{f'_y} = \frac{g'_x}{g'_y} \implies \frac{f'_x}{g'_x} = \frac{f'_y}{g'_y},$$
or
$$f'_x = \lambda g'_x, \qquad f'_y = \lambda g'_y.$$
Symbolic Justification

Suppose that we can use the relation $g(x_1, \dots, x_n) = 0$ to solve for $x_n$ in terms of the other variables $x_1, \dots, x_{n-1}$, after making some choices. Then the critical points of $f(x_1, \dots, x_n)$ are unconstrained critical points of $f(x_1, \dots, x_n(x_1, \dots, x_{n-1}))$.

(Dependency diagram: $f$ depends on $x_1, x_2, \dots, x_n$, and $x_n$ in turn depends on $x_1, x_2, \dots, x_{n-1}$.)
Now for any $i = 1, \dots, n-1$,
$$\left(\frac{\partial f}{\partial x_i}\right)_g = \frac{\partial f}{\partial x_i} + \frac{\partial f}{\partial x_n}\left(\frac{\partial x_n}{\partial x_i}\right)_g = \frac{\partial f}{\partial x_i} - \frac{\partial f}{\partial x_n}\,\frac{\partial g/\partial x_i}{\partial g/\partial x_n}.$$
If $\left(\frac{\partial f}{\partial x_i}\right)_g = 0$, then
$$\frac{\partial f/\partial x_i}{\partial f/\partial x_n} = \frac{\partial g/\partial x_i}{\partial g/\partial x_n} \iff \frac{\partial f/\partial x_i}{\partial g/\partial x_i} = \frac{\partial f/\partial x_n}{\partial g/\partial x_n}.$$
So as before,
$$\frac{\partial f}{\partial x_i} = \lambda \frac{\partial g}{\partial x_i} \quad \text{for all } i.$$
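The chain-rule identity above can be checked numerically. The sketch below uses a hypothetical two-variable example, $f(x, y) = x^2 + y^2$ with constraint $g(x, y) = x + y - 2 = 0$ (so $y = 2 - x$ along the constraint), and compares both sides by finite differences.

```python
# Hypothetical example for checking (d f/dx)_g = f_x - f_y * (g_x / g_y).
h = 1e-6

def f(x, y):
    return x**2 + y**2

def g(x, y):
    return x + y - 2.0

x0 = 0.7
y0 = 2.0 - x0  # point on the constraint

# Left side: derivative of f along the constraint, d/dx f(x, 2 - x).
lhs = (f(x0 + h, 2.0 - (x0 + h)) - f(x0 - h, 2.0 - (x0 - h))) / (2 * h)

# Right side: f_x - f_y * (g_x / g_y), with partials by central differences.
f_x = (f(x0 + h, y0) - f(x0 - h, y0)) / (2 * h)
f_y = (f(x0, y0 + h) - f(x0, y0 - h)) / (2 * h)
g_x = (g(x0 + h, y0) - g(x0 - h, y0)) / (2 * h)
g_y = (g(x0, y0 + h) - g(x0, y0 - h)) / (2 * h)
rhs = f_x - f_y * (g_x / g_y)

assert abs(lhs - rhs) < 1e-4
print(lhs, rhs)  # both approximately 4*x0 - 4 = -1.2
```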
Another perspective

To find the critical points of $f$ subject to the constraint that $g = 0$, create the Lagrangian function
$$\mathcal{L} = f(x_1, x_2, \dots, x_n) - \lambda g(x_1, x_2, \dots, x_n).$$
If $\mathcal{L}$ is restricted to the set $g = 0$, then $\mathcal{L} = f$, and so the constrained critical points are unconstrained critical points of $\mathcal{L}$. So for each $i$,
$$\frac{\partial \mathcal{L}}{\partial x_i} = 0 \implies \frac{\partial f}{\partial x_i} = \lambda \frac{\partial g}{\partial x_i}.$$
But also,
$$\frac{\partial \mathcal{L}}{\partial \lambda} = 0 \implies g(x_1, x_2, \dots, x_n) = 0.$$
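As a concrete check of this perspective, the sketch below uses a hypothetical example: maximize $xy$ subject to $x + y = 2$, whose maximum is at $x = y = 1$ with $\lambda = 1$. It verifies by finite differences that all partial derivatives of the Lagrangian vanish there.

```python
# Hypothetical example: f(x, y) = x*y, g(x, y) = x + y - 2.
h = 1e-6

def lagrangian(x, y, lam):
    return x * y - lam * (x + y - 2.0)

x0, y0, lam0 = 1.0, 1.0, 1.0  # the constrained critical point

# All three partial derivatives of the Lagrangian should vanish here.
dLdx = (lagrangian(x0 + h, y0, lam0) - lagrangian(x0 - h, y0, lam0)) / (2 * h)
dLdy = (lagrangian(x0, y0 + h, lam0) - lagrangian(x0, y0 - h, lam0)) / (2 * h)
dLdlam = (lagrangian(x0, y0, lam0 + h) - lagrangian(x0, y0, lam0 - h)) / (2 * h)

assert abs(dLdx) < 1e-6 and abs(dLdy) < 1e-6 and abs(dLdlam) < 1e-6
print(dLdx, dLdy, dLdlam)
```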
Second order conditions

The Method of Lagrange Multipliers finds the constrained critical points, but doesn't determine their "type" (max, min, or neither). So what then?
A dash of topology (cf. Sections 17.2–3)

Definition. A subset of $\mathbf{R}^n$ is called closed if it includes its boundary.

- $x^2 + y^2 \le 1$: closed
- $x^2 + y^2 < 1$: not closed
- $y \ge 0$: closed

Basically, if a subset is described by $\le$ or $\ge$ inequalities, it is closed.
Definition. A subset of $\mathbf{R}^n$ is called bounded if it is contained within some ball centered at the origin.

- $x^2 + y^2 \le 1$: bounded
- $x^2 + y^2 < 1$: bounded
- $y \ge 0$: not bounded
Definition. A subset of $\mathbf{R}^n$ is called compact if it is closed and bounded.

- $x^2 + y^2 \le 1$: compact
- $x^2 + y^2 < 1$: not compact
- $y \ge 0$: not compact
Optimizing over compact sets

Theorem (Compact Set Method). To find the extreme values of a function $f$ on a compact set $D$ of $\mathbf{R}^n$, it suffices to find

- the (unconstrained) critical points of $f$ "inside" $D$, and
- the (constrained) critical points of $f$ on the "boundary" of $D$,

then compare the values of $f$ at all of these points.
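The method can be illustrated numerically. In this hypothetical sketch, $f(x, y) = x + y$ on the unit disk: the gradient $(1, 1)$ never vanishes inside, so the extrema lie on the boundary circle, where Lagrange on $g = x^2 + y^2 - 1$ gives $(x, y) = \pm(1/\sqrt{2}, 1/\sqrt{2})$, hence a maximum of $\sqrt{2}$.

```python
import math

# Hypothetical example: maximize f(x, y) = x + y on D = {x^2 + y^2 <= 1}.
def f(x, y):
    return x + y

# No interior critical points (gradient is (1, 1) everywhere), so scan the
# boundary circle; the max should match the Lagrange answer sqrt(2).
best = max(f(math.cos(t), math.sin(t))
           for t in (2 * math.pi * k / 100000 for k in range(100000)))

assert abs(best - math.sqrt(2)) < 1e-6
print(best)
```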
Ad hoc arguments

If $D$ is not compact, sometimes it's still easy to argue that as $x$ gets farther away, $f$ becomes larger (or smaller), so the critical points are "obviously" maxes (or mins). (Example later.)
Analytic conditions

- For the two-variable constrained optimization problem, we have (look in the book if you want the gory details):
$$\left(\frac{d^2 f}{dx^2}\right)_g = -\frac{1}{(g'_y)^2} \begin{vmatrix} 0 & g'_x & g'_y \\ g'_x & f''_{xx} - \lambda g''_{xx} & f''_{xy} - \lambda g''_{xy} \\ g'_y & f''_{yx} - \lambda g''_{yx} & f''_{yy} - \lambda g''_{yy} \end{vmatrix} = -\frac{1}{(g'_y)^2} \begin{vmatrix} \mathcal{L}''_{\lambda\lambda} & \mathcal{L}''_{\lambda x} & \mathcal{L}''_{\lambda y} \\ \mathcal{L}''_{x\lambda} & \mathcal{L}''_{xx} & \mathcal{L}''_{xy} \\ \mathcal{L}''_{y\lambda} & \mathcal{L}''_{yx} & \mathcal{L}''_{yy} \end{vmatrix}$$
The critical point is a local max if this determinant is positive (so that the constrained second derivative is negative), and a local min if the determinant is negative.
- The matrix on the right is the Hessian of the Lagrangian. But there is still a distinction between this and the unconstrained case: the constrained extrema are critical points of the Lagrangian, not extrema of it.
- Don't worry too much about this!
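For the curious, here is a quick numeric check of the bordered determinant on a made-up example: maximize $xy$ subject to $x + y = 2$, which has a maximum at $x = y = 1$ with $\lambda = 1$. The code computes the determinant and, separately, the second derivative of $f$ along the constraint, so the sign relationship can be seen directly.

```python
# Hypothetical example: f(x, y) = x*y, g(x, y) = x + y - 2, max at (1, 1).
# Since g is linear, all g'' terms vanish, and the bordered matrix is
# [[0, g_x, g_y], [g_x, f_xx, f_xy], [g_y, f_yx, f_yy]].
g_x, g_y = 1.0, 1.0
f_xx, f_xy, f_yx, f_yy = 0.0, 1.0, 1.0, 0.0

M = [[0.0, g_x, g_y],
     [g_x, f_xx, f_xy],
     [g_y, f_yx, f_yy]]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

D = det3(M)

# Along the constraint, f(x, 2 - x) = 2x - x^2, whose second derivative is -2,
# confirming a max; the determinant enters with a factor of -1/(g_y)^2.
second_deriv = -D / g_y**2

assert D == 2.0
assert second_deriv == -2.0
print(D, second_deriv)  # 2.0 -2.0
```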
Problem 17.1.10

Problem. Maximize the quantity $f(x, y, z) = Ax^a y^b z^c$ subject to the constraint that $px + qy + rz = m$. (Here $A, a, b, c, p, q, r, m$ are positive constants.)

Solution. The Lagrange equations are
$$Aax^{a-1}y^b z^c = \lambda p$$
$$Abx^a y^{b-1} z^c = \lambda q$$
$$Acx^a y^b z^{c-1} = \lambda r$$
We rule out any solution with $x$, $y$, $z$, or $\lambda$ equal to 0 (they would minimize $f$, not maximize it).
Dividing the first two equations gives
$$\frac{ay}{bx} = \frac{p}{q} \implies y = \frac{bp}{aq}x.$$
Dividing the first and last equations gives
$$\frac{az}{cx} = \frac{p}{r} \implies z = \frac{cp}{ar}x.$$
Plugging these into the equation of constraint gives
$$px + \frac{bp}{a}x + \frac{cp}{a}x = m \implies x = \frac{m}{p} \cdot \frac{a}{a+b+c}.$$
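To verify this solution numerically, here is a sketch with made-up constants ($A = 1$, $a = 1$, $b = 2$, $c = 3$, $p = q = r = 1$, $m = 6$); it checks that the resulting $x$, $y$, $z$ satisfy all three Lagrange equations and the constraint.

```python
# Hypothetical constants (chosen for illustration, not from the problem).
A, a, b, c = 1.0, 1.0, 2.0, 3.0
p, q, r, m = 1.0, 1.0, 1.0, 6.0

s = a + b + c
x = (m / p) * (a / s)
y = (m / q) * (b / s)
z = (m / r) * (c / s)

# lambda from the first Lagrange equation: A a x^(a-1) y^b z^c = lambda p.
lam = A * a * x**(a - 1) * y**b * z**c / p

# The other two Lagrange equations and the constraint should then hold.
assert abs(A * b * x**a * y**(b - 1) * z**c - lam * q) < 1e-9
assert abs(A * c * x**a * y**b * z**(c - 1) - lam * r) < 1e-9
assert abs(p * x + q * y + r * z - m) < 1e-9
print(x, y, z, lam)
```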
General method for more than one constraint

If we are optimizing $f(x_1, \dots, x_n)$ subject to the constraints $g_j(x_1, \dots, x_n) = 0$, $j = 1, \dots, m$, we need one multiplier for each constraint. The new Lagrangian is
$$\mathcal{L}(x_1, \dots, x_n) = f(x_1, \dots, x_n) - \sum_{j=1}^{m} \lambda_j g_j(x_1, \dots, x_n).$$
The conditions are that $\frac{\partial \mathcal{L}}{\partial x_i} = 0$ and $\frac{\partial \mathcal{L}}{\partial \lambda_j} = 0$ for all $i$ and $j$. In other words,
$$\frac{\partial f}{\partial x_i} = \lambda_1 \frac{\partial g_1}{\partial x_i} + \cdots + \lambda_m \frac{\partial g_m}{\partial x_i} \quad \text{(all } i\text{)}$$
$$g_j(x_1, \dots, x_n) = 0 \quad \text{(all } j\text{)}$$
Example

Find the minimum distance between the curves $xy = 1$ and $x + 2y = 1$.

Reframing this, we can minimize
$$f(x, y, u, v) = (x - u)^2 + (y - v)^2$$
subject to the constraints
$$xy - 1 = 0, \qquad u + 2v = 1.$$
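Before (or instead of) the four-variable Lagrange computation, an ad hoc numerical check is possible: parametrize the hyperbola as $(t, 1/t)$ and minimize the point-to-line distance to $x + 2y = 1$, which is $|t + 2/t - 1|/\sqrt{5}$. This sketch is an independent check, not the Lagrange method itself.

```python
import math

# Distance from the hyperbola point (t, 1/t) to the line x + 2y = 1.
def dist(t):
    return abs(t + 2.0 / t - 1.0) / math.sqrt(5.0)

# Grid-search both branches of the hyperbola.
ts = [k / 1000.0 for k in range(-5000, 5001) if k != 0]
best = min(dist(t) for t in ts)

# For t > 0, t + 2/t is minimized at t = sqrt(2) (by AM-GM), giving the
# candidate distance (2*sqrt(2) - 1)/sqrt(5); the t < 0 branch is farther away.
expected = (2.0 * math.sqrt(2.0) - 1.0) / math.sqrt(5.0)
assert abs(best - expected) < 1e-5
print(best)  # approximately 0.8177
```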