
Introduction to introduction to introduction to Optimization

Feb 20, 2016


"Since the fabric of the universe is most perfect and the work of a most wise Creator, nothing at all takes place in the universe in which some rule of maximum or minimum does not appear."

— Leonhard Euler

Introduction to introduction to introduction to Optimization

[Figure: boredom and understanding plotted against lecture time (1 min, 10 mins, 30 mins, 1 hour). Optimal listening time for a talk: 8 minutes 25 seconds.]

[Figure: height of a thrown object plotted against time.]

Action at a point := Kinetic Energy - Potential Energy.
Action for a path := Integrate the action at points over the path.

Nature chooses the path of least action!

Pierre Louis Moreau de Maupertuis

A + BC → AB + C

Reference: Holy grails of Chemistry, Pushpendu Kumar Das. Acknowledgement: Deepika Viswanathan, PhD student, IPC, IISc.

"The path taken between two points by a ray of light is the path that can be traversed in the least time."

— Fermat

"For all thermodynamic processes between the same initial and final state, the delivery of work is a maximum for a reversible process."

— Gibbs

"Among competing hypotheses, the hypothesis with the fewest assumptions should be selected."

— William of Ockham

Travelling Salesman Problem (TSP). Courtesy: xkcd.

A hungry cow is at position (2,2) in an open field. It is tied to a rope that is 1 unit long. Grass is at position (4,3). A perpendicular electric fence passes through the point (2.5,2). How close can the cow get to the fodder?

[Figure: the field on a grid with x and y running from 0 to 5, showing the cow, the grass, and the fence.]

What do we want to find? The position of the cow: let (x,y) be the solution.

What do we want the solution to satisfy? Grass: minimize (x-4)^2 + (y-3)^2.

What restrictions does the cow have? Rope: (x-2)^2 + (y-2)^2 ≤ 1. Fence: x ≤ 2.5.

Before tackling the constrained problem, consider a simpler unconstrained one: minimize Y = X^2 + 2. Start at some Cur_X and take a step d, so that New_X = Cur_X + d. If Cur_X > 0, we want d < 0; if Cur_X < 0, we want d > 0. Derivative at Cur_X: 2(Cur_X).

The negative of the derivative does the trick!

Y = X^2 + 2

[Figure: plot of Y = X^2 + 2 against X.]

Example: Cur_X = 0.5. d = negative derivative at Cur_X = -2(0.5) = -1.

New_X = Cur_X + d = 0.5 - 1 = -0.5. Update: Cur_X = New_X = -0.5.

d = negative derivative at Cur_X = -2(-0.5) = 1. New_X = Cur_X + d = -0.5 + 1 = 0.5. We are back where we started! What was the problem? Step size.

Think: How should you modify the step size at every step to avoid this problem? (See the sketch below.)
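As a step-size experiment, here is a minimal Python sketch of this one-dimensional descent using the scaled update Cur_X = Cur_X - stepSize * derivative(Cur_X); the function names are my own, not from the slides. With stepSize = 1.0 the iterate bounces between 0.5 and -0.5 exactly as above, while a smaller value such as 0.1 settles towards the minimum at X = 0.

```python
def derivative(x):
    # Derivative of Y = X^2 + 2.
    return 2 * x

def gradient_descent_1d(cur_x, step_size, num_steps=20):
    # Repeatedly move in the direction of the negative derivative.
    for _ in range(num_steps):
        d = -step_size * derivative(cur_x)
        cur_x = cur_x + d
    return cur_x

print(gradient_descent_1d(0.5, step_size=1.0))  # bounces between 0.5 and -0.5, ends at 0.5
print(gradient_descent_1d(0.5, step_size=0.1))  # about 0.006, heading towards the minimum at 0
```

A common answer to the question above is to shrink the step size as the iterations progress, or to choose it by a line search at every step.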

[Figure: the field grid again, x and y from 0 to 5.]

Objective: (x-4)^2 + (y-3)^2

Algorithm: Gradient descent
- Start at any position Cur.
- Find the gradient at Cur.
- Cur = Cur - (stepSize) * gradient.
- Repeat.

The gradient is the generalization of the derivative to higher dimensions. Negative gradient at (2,2) = (4,2). Points towards the grass!

Negative gradient at (1,5) = (6,-4). Points towards the grass! Gradient descent is the simplest unconstrained optimization procedure. Easy to implement; a minimal sketch follows below.
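Here is a minimal sketch of the gradient descent algorithm above applied to the grass objective (x-4)^2 + (y-3)^2, ignoring the rope and the fence for now (the unconstrained version); the function and variable names are my own.

```python
def gradient(x, y):
    # Gradient of f(x, y) = (x - 4)^2 + (y - 3)^2.
    return (2 * (x - 4), 2 * (y - 3))

def gradient_descent(start, step_size=0.1, num_steps=100):
    cur_x, cur_y = start
    for _ in range(num_steps):
        gx, gy = gradient(cur_x, cur_y)
        # Step in the direction of the negative gradient.
        cur_x -= step_size * gx
        cur_y -= step_size * gy
    return cur_x, cur_y

# Starting from the cow's position (2, 2); the unconstrained minimum is the grass at (4, 3).
print(gradient_descent((2, 2)))

# The negative gradients quoted on the slides: (4, 2) at (2, 2) and (6, -4) at (1, 5).
print([-g for g in gradient(2, 2)], [-g for g in gradient(1, 5)])
```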

If stepSize is chosen properly, it will provably converge to a local minimum

Think: Why doesn't the gradient descent algorithm always converge to a global minimum?

Think: How to modify the algorithm to find a local maximum?

There is a host of other methods which pick the direction differently. Think: Can you come up with a method that picks a different direction than just the negative gradient?

Gradient descent - summary

Constrained Optimization

[Figure: the cow, grass, rope, and fence on the field grid, x and y from 0 to 5.]

Real world problems are rarely unconstrained!

We need to understand gradients better to understand how to solve them. Let us begin with the Taylor series expansion of a function.

Functions of one variable: for small enough d, we have f(x + d) ≈ f(x) + d f'(x).

What should the value of d be such that f(x + d) is as small as possible?

The negative derivative is the direction of maximum descent.

Important: Any direction d such that d f'(x) < 0 is a descent direction!

Functions of many variables: for small enough d, we have f(x + d) ≈ f(x) + d · ∇f(x).

Any direction d such that d · ∇f(x) < 0 is a descent direction.
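A quick numerical check of this claim on the grass objective; the point, the directions, and the step size below are my own choices. Directions whose dot product with the gradient is negative should decrease f for a small enough step, while a direction with a positive dot product should not.

```python
def f(x, y):
    # The grass objective from the slides.
    return (x - 4) ** 2 + (y - 3) ** 2

def grad(x, y):
    return (2 * (x - 4), 2 * (y - 3))

x, y = 1.0, 5.0        # a point in the field
gx, gy = grad(x, y)    # the gradient here is (-6, 4)
step = 1e-3            # small enough for the linear approximation to hold

for dx, dy in [(1, 0), (0, -1), (1, -1), (0, 1)]:
    dot = dx * gx + dy * gy
    decreased = f(x + step * dx, y + step * dy) < f(x, y)
    # The first three directions have dot < 0 and decrease f; (0, 1) has dot > 0 and does not.
    print((dx, dy), "d . grad f =", dot, "f decreased:", decreased)
```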

Constrained Optimization

Minimize f(x)
Such that g(x) = 0

Say somebody gives you a point x* and claims it is the solution to this problem.

What properties should this point satisfy?

- Must be feasible: g(x*) = 0.
- There must NOT be a descent direction that is also feasible!

Given a point x,

Descent direction: any direction which will lead to a point x' such that f(x') < f(x).

Feasible direction: any direction which will lead to a point x' such that g(x') = 0.

There must NOT be a descent direction that is also feasible!

Minimize f(x)

Such that g(x) = 0

Minimize f(x)

Such that g(x) = 0

Constrained Optimization Problem → Unconstrained Optimization Problem
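One standard way to make this conversion, consistent with the "no feasible descent direction" condition above, is the Lagrangian; the sketch below is a textbook version, with the multiplier λ being notation introduced here rather than taken from the slides.

```latex
% Lagrangian for:  minimize f(x)  such that  g(x) = 0
L(x, \lambda) = f(x) + \lambda\, g(x)

% Setting the derivatives of L to zero recovers exactly the two conditions above:
%   \nabla_x L = 0                        =>  \nabla f(x^*) = -\lambda \nabla g(x^*)
%                                             (gradients parallel: no feasible descent direction)
%   \partial L / \partial \lambda = 0     =>  g(x^*) = 0   (feasibility)
```

Inequality constraints such as the rope and the fence need the more general Karush-Kuhn-Tucker conditions listed below.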

What we did not cover:
- Constrained optimization with multiple constraints
- Constrained optimization with inequality constraints
- Karush-Kuhn-Tucker (KKT) conditions
- Linear Programs
- Convex optimization
- Duality theory
- etc.

Summary

Optimization is a very useful branch of applied mathematics. Very well studied, yet there are numerous problems to work on.

If interested, we can talk more. Thank you!
