06logisticregression 150930040919 Lva1 App6891

Transcript
Page 1: 06logisticregression 150930040919 Lva1 App6891


Logistic Regression: A Classification Algorithm

One of the most popular and most widely used learning algorithms today.

Page 2: 06logisticregression 150930040919 Lva1 App6891


Classification

Email: Spam / Not Spam?
Online Transactions: Fraudulent (Yes / No)?
Tumor: Malignant / Benign?

y ∈ {0, 1}
0: "Negative Class" (e.g., benign tumor)
1: "Positive Class" (e.g., malignant tumor)

(More generally, y ∈ {0, 1, 2, …}: Positive Class 1, Positive Class 2, and so on.)

Page 3: 06logisticregression 150930040919 Lva1 App6891


Threshold classifier output h_θ(x) at 0.5:

If h_θ(x) ≥ 0.5, predict "y = 1"
If h_θ(x) < 0.5, predict "y = 0"

Page 4: 06logisticregression 150930040919 Lva1 App6891


Threshold classifier output h_θ(x) at 0.5:

If h_θ(x) ≥ 0.5, predict "y = 1"
If h_θ(x) < 0.5, predict "y = 0"

Page 5: 06logisticregression 150930040919 Lva1 App6891


Applying linear regression to a classification problem like this is a bad thing to do.

(Before, we just got lucky!)

Page 6: 06logisticregression 150930040919 Lva1 App6891


Classification: y = 0 or 1

With linear regression, h_θ(x) can be > 1 or < 0.

Logistic Regression: 0 ≤ h_θ(x) ≤ 1

(Despite the name, logistic regression is used for the classification task.)

Page 7: 06logisticregression 150930040919 Lva1 App6891


Page 8: 06logisticregression 150930040919 Lva1 App6891


Logistic Regression Model

Want 0 ≤ h_θ(x) ≤ 1.

h_θ(x) = g(θᵀx), where g(z) = 1 / (1 + e^(−z))

g is called the sigmoid function (or logistic function).

[Plot: g(z) against z; the curve rises from 0 to 1 and crosses 0.5 at z = 0.]

Need to select parameters θ so that the model fits the data; we will do it with an algorithm later.
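A minimal Octave sketch of this model (the variable names and the example values of theta and x are illustrative, not taken from the slides):

  % Sigmoid / logistic function: maps any real z into (0, 1)
  sigmoid = @(z) 1 ./ (1 + exp(-z));

  % Hypothesis h_theta(x) = g(theta' * x); theta and x are column vectors
  hypothesis = @(theta, x) sigmoid(theta' * x);

  % Illustrative values
  theta = [-1; 2];
  x = [1; 0.8];
  hypothesis(theta, x)   % ≈ 0.65, read as P(y = 1 | x; theta)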

Page 9: 06logisticregression 150930040919 Lva1 App6891


Interpretation of Hypothesis Output

h_θ(x) = estimated probability that y = 1 on a new input x.

Example: if h_θ(x) = 0.7, tell the patient that there is a 70% chance of the tumor being malignant.

h_θ(x) = P(y = 1 | x; θ), "the probability that y = 1, given x, parameterized by θ".
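Since y must be either 0 or 1, the two probabilities sum to one, so P(y = 0 | x; θ) = 1 − P(y = 1 | x; θ) = 1 − h_θ(x).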

 

Page 10: 06logisticregression 150930040919 Lva1 App6891


Decision boundary

Page 11: 06logisticregression 150930040919 Lva1 App6891


Logistic regression

h_θ(x) = g(θᵀx)

Predict "y = 1" if h_θ(x) ≥ 0.5, which happens whenever θᵀx ≥ 0.
Predict "y = 0" if h_θ(x) < 0.5, which happens whenever θᵀx < 0.

Page 12: 06logisticregression 150930040919 Lva1 App6891


Page 13: 06logisticregression 150930040919 Lva1 App6891


Any example with features x1, x2 that satisfies this equation (θᵀx = 0) lies exactly on the decision boundary.

 

Page 14: 06logisticregression 150930040919 Lva1 App6891


Decision Boundary

[Plot: training examples in the (x1, x2) plane, both axes running over 1, 2, 3; a straight line separates the two classes.]

h_θ(x) = g(θ₀ + θ₁x₁ + θ₂x₂)

Predict "y = 1" if θ₀ + θ₁x₁ + θ₂x₂ ≥ 0; the line θ₀ + θ₁x₁ + θ₂x₂ = 0 is the decision boundary.
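As an Octave sketch (these particular θ values are illustrative, not read off the slide's plot):

  sigmoid = @(z) 1 ./ (1 + exp(-z));

  % Illustrative parameters: theta0 = -3, theta1 = 1, theta2 = 1,
  % giving the decision boundary x1 + x2 = 3
  theta = [-3; 1; 1];

  x = [1; 2; 2];              % [x0; x1; x2] with bias term x0 = 1
  p = sigmoid(theta' * x);    % ≈ 0.73
  prediction = (p >= 0.5)     % 1, i.e. predict y = 1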

 

Page 15: 06logisticregression 150930040919 Lva1 App6891


Non-linear decision boundaries

[Plot: examples in the (x1, x2) plane, axes running from -1 to 1, where the positive and negative classes are separated by a circle rather than a line.]

With higher-order polynomial features, e.g. h_θ(x) = g(θ₀ + θ₁x₁ + θ₂x₂ + θ₃x₁² + θ₄x₂²):

Predict "y = 1" if θ₀ + θ₁x₁ + θ₂x₂ + θ₃x₁² + θ₄x₂² ≥ 0, which can give, for example, the circular decision boundary x₁² + x₂² = 1.

Page 16: 06logisticregression 150930040919 Lva1 App6891


Cost function

To fit the parameters θ.

Page 17: 06logisticregression 150930040919 Lva1 App6891


Training set: {(x⁽¹⁾, y⁽¹⁾), (x⁽²⁾, y⁽²⁾), …, (x⁽ᵐ⁾, y⁽ᵐ⁾)} (m examples)

How to choose the parameters θ?

Page 18: 06logisticregression 150930040919 Lva1 App6891


Cost function

Linear regression: J(θ) = (1/m) Σᵢ ½ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)²

With the sigmoid hypothesis plugged in, this squared-error cost is a "non-convex function" of θ, with many local optima; for logistic regression we want a "convex function" instead.

Page 19: 06logisticregression 150930040919 Lva1 App6891


Logistic regression cost function

If y = 1:  Cost(h_θ(x), y) = −log(h_θ(x))

[Plot: Cost against h_θ(x) over (0, 1]; the cost is 0 at h_θ(x) = 1 and grows without bound as h_θ(x) → 0.]

Page 20: 06logisticregression 150930040919 Lva1 App6891


Logistic regression cost function

If y = 1:  Cost(h_θ(x), y) = −log(h_θ(x))

Intuition: if y = 1 but the hypothesis outputs h_θ(x) ≈ 0, the learner pays a very large cost for a confident wrong prediction; if h_θ(x) = 1 the cost is 0.

Page 21: 06logisticregression 150930040919 Lva1 App6891


Logistic regression cost function

If y = 0:  Cost(h_θ(x), y) = −log(1 − h_θ(x))

[Plot: Cost against h_θ(x) over [0, 1); the cost is 0 at h_θ(x) = 0 and grows without bound as h_θ(x) → 1.]
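Because y is always either 0 or 1, the two branches can be folded into a single expression, Cost(h_θ(x), y) = −y log(h_θ(x)) − (1 − y) log(1 − h_θ(x)), which is the form used in the simplified cost function below.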

Page 22: 06logisticregression 150930040919 Lva1 App6891


Page 23: 06logisticregression 150930040919 Lva1 App6891


Simplified cost function and gradient descent

Page 24: 06logisticregression 150930040919 Lva1 App6891


Page 25: 06logisticregression 150930040919 Lva1 App6891


Logistic regression cost function

Why do we choose this function when other cost functions exist?
• This cost function can be derived from statistics using the principle of maximum likelihood estimation, an efficient method for finding parameters from data for different models.
• It is a convex function.

Page 26: 06logisticregression 150930040919 Lva1 App6891


Logistic regression cost function

J(θ) = −(1/m) Σᵢ [ y⁽ⁱ⁾ log h_θ(x⁽ⁱ⁾) + (1 − y⁽ⁱ⁾) log(1 − h_θ(x⁽ⁱ⁾)) ]

To fit parameters θ: min_θ J(θ)

To make a prediction given a new x: output h_θ(x); the hypothesis estimates the probability that y = 1.
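A minimal, vectorized Octave sketch of this cost (the function name and the example data are illustrative):

  % X is m x (n+1) with a leading column of ones, y is m x 1 with 0/1 labels
  sigmoid = @(z) 1 ./ (1 + exp(-z));
  logisticCost = @(theta, X, y) ...
      -(1 / length(y)) * (y' * log(sigmoid(X * theta)) + ...
                          (1 - y)' * log(1 - sigmoid(X * theta)));

  % Tiny illustrative check: two examples, one feature, theta = 0
  X = [1 0; 1 2];
  y = [0; 1];
  theta = [0; 0];
  logisticCost(theta, X, y)   % = log(2) ≈ 0.693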

Page 27: 06logisticregression 150930040919 Lva1 App6891


Gradient Descent

Want min_θ J(θ):

Repeat {
  θ_j := θ_j − α · ∂/∂θ_j J(θ)
}
(simultaneously update all θ_j)

 

Page 28: 06logisticregression 150930040919 Lva1 App6891


Gradient Descent

Want min_θ J(θ):

Repeat {
  θ_j := θ_j − α · (1/m) Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) x_j⁽ⁱ⁾
}
(simultaneously update all θ_j)

The algorithm looks identical to linear regression, but the two are actually very different from each other, because the hypothesis h_θ(x) is now the sigmoid of θᵀx rather than a linear function.
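A short Octave sketch of this update loop (the learning rate, iteration count, and data are placeholders, not values from the slides):

  sigmoid = @(z) 1 ./ (1 + exp(-z));

  % Illustrative data: m = 4 examples, one feature plus the bias column
  X = [1 0; 1 1; 1 2; 1 3];
  y = [0; 0; 1; 1];

  theta = zeros(2, 1);
  alpha = 0.1;                       % learning rate (placeholder)
  m = length(y);

  for iter = 1:1000
    h = sigmoid(X * theta);          % predictions for all m examples
    grad = (1 / m) * X' * (h - y);   % vector of partial derivatives of J
    theta = theta - alpha * grad;    % simultaneous update of all theta_j
  end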

Page 29: 06logisticregression 150930040919 Lva1 App6891


Hypothesis

Cost function

Gradient Descent

Page 30: 06logisticregression 150930040919 Lva1 App6891


Advanced optimization

Page 31: 06logisticregression 150930040919 Lva1 App6891


Optimization algorithm

Cost function J(θ). Want min_θ J(θ).

Given θ, we have code that can compute:
• J(θ)
• ∂/∂θ_j J(θ)   (for j = 0, 1, …, n)

Gradient descent:
Repeat {
  θ_j := θ_j − α · ∂/∂θ_j J(θ)
}

Page 32: 06logisticregression 150930040919 Lva1 App6891


Optimization algorithm

Given θ, we have code that can compute J(θ) and ∂/∂θ_j J(θ) (for j = 0, 1, …, n).

Optimization algorithms:
• Gradient descent
• Newton-Raphson method
• Conjugate gradient
• BFGS (Broyden-Fletcher-Goldfarb-Shanno)
• L-BFGS (Limited-memory BFGS)

Page 33: 06logisticregression 150930040919 Lva1 App6891


Advanced optimization algorithms (Conjugate gradient, BFGS, L-BFGS)

Advantages:
• No need to manually pick α (the learning rate): they have a clever inner loop (a line search algorithm) which tries a bunch of α values and picks a good one.
• Often faster than gradient descent.
• Can be used successfully without understanding their complexity.

Disadvantages:
• More complicated.
• Could make debugging more difficult.
• Should not be implemented yourself (implement them only if you are an expert in numerical computing).
• Different libraries may use different implementations.

Page 34: 06logisticregression 150930040919 Lva1 App6891


Example:

function [jVal, gradient] = costFunction(theta)
  jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;
  gradient = zeros(2, 1);
  gradient(1) = 2 * (theta(1) - 5);
  gradient(2) = 2 * (theta(2) - 5);
end

options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2, 1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);

(fminunc: function minimization, unconstrained)
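In this example the cost is minimized at theta = [5; 5], so optTheta comes back as approximately [5; 5], functionVal is essentially 0, and an exitFlag of 1 indicates that fminunc converged.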

Page 35: 06logisticregression 150930040919 Lva1 App6891


Page 36: 06logisticregression 150930040919 Lva1 App6891


Page 37: 06logisticregression 150930040919 Lva1 App6891


Optimization algorithm: fminunc

function [jVal, gradient] = costFunction(theta)
  jVal = [code to compute J(theta)];
  gradient(1) = [code to compute ∂J(θ)/∂θ₀];
  gradient(2) = [code to compute ∂J(θ)/∂θ₁];
  ...
  gradient(n+1) = [code to compute ∂J(θ)/∂θₙ];
end

theta = [θ₀; θ₁; …; θₙ]  (an (n+1)-dimensional vector; Octave indexing starts at 1)
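For logistic regression, the template above might be filled in roughly as follows (a sketch: the function name is illustrative, and X, an m x (n+1) design matrix with a leading column of ones, and y, an m x 1 vector of 0/1 labels, are assumed to already exist in the workspace):

  function [jVal, gradient] = logisticCostFunction(theta, X, y)
    m = length(y);
    h = 1 ./ (1 + exp(-X * theta));                           % sigmoid(X * theta)
    jVal = -(1 / m) * (y' * log(h) + (1 - y)' * log(1 - h));  % J(theta)
    gradient = (1 / m) * X' * (h - y);                        % all n+1 partial derivatives
  end

  options = optimset('GradObj', 'on', 'MaxIter', 100);
  initialTheta = zeros(size(X, 2), 1);
  [optTheta, functionVal] = fminunc(@(t) logisticCostFunction(t, X, y), initialTheta, options);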

Page 38: 06logisticregression 150930040919 Lva1 App6891


Optimization algorithm: fminunc

• Notice that by using fminunc, you did not have to write any loops yourself, or set a learning rate like you did for gradient descent.
• This is all done by fminunc; you only needed to provide a function calculating the cost and the gradient.

Prediction

Page 39: 06logisticregression 150930040919 Lva1 App6891


Prediction

Once you have optimized θ, compute h_θ(x) for the new input x:

If h_θ(x) ≥ 0.5, then predict y = 1;
else predict y = 0.
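As Octave code, a sketch reusing optTheta from the fminunc call above (Xnew, a matrix of new examples with a leading column of ones, is a placeholder):

  p = 1 ./ (1 + exp(-Xnew * optTheta));   % estimated P(y = 1) for each new example
  predictions = (p >= 0.5);               % 1 where y = 1 is predicted, 0 otherwise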

Page 40: 06logisticregression 150930040919 Lva1 App6891


Multi-class classification

One-vs-all algorithm

Page 41: 06logisticregression 150930040919 Lva1 App6891


Multiclass classification

Email foldering/tagging: Work, Friends, Family, Hobby
Medical diagrams: Not ill, Cold, Flu
Weather: Sunny, Cloudy, Rain, Snow

Page 42: 06logisticregression 150930040919 Lva1 App6891


Binary classification vs. multi-class classification

[Two plots of example data in the (x1, x2) plane: one with two classes, one with three or more.]

One-vs-all (one-vs-rest)

Page 43: 06logisticregression 150930040919 Lva1 App6891


One-vs-all (one-vs-rest):

[Three plots in the (x1, x2) plane, one per class: in each, that class is treated as the positive class and the remaining classes as the negative class.]

Class 1:  h_θ⁽¹⁾(x)
Class 2:  h_θ⁽²⁾(x)
Class 3:  h_θ⁽³⁾(x)

One-vs-all

Page 44: 06logisticregression 150930040919 Lva1 App6891


One-vs-all

Train a logistic regression classifier h_θ⁽ⁱ⁾(x) for each class i to predict the probability that y = i.

On a new input x, to make a prediction, pick the class i that maximizes h_θ⁽ⁱ⁾(x).
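A compact Octave sketch of one-vs-all (the data, learning rate, and iteration count are illustrative; plain gradient descent stands in here for fminunc or any of the other optimizers discussed above):

  sigmoid = @(z) 1 ./ (1 + exp(-z));

  % Toy data: m = 6 examples, one feature plus bias; labels y in {1, 2, 3}
  X = [1 0; 1 1; 1 2; 1 4; 1 5; 1 6];
  y = [1; 1; 2; 2; 3; 3];
  K = 3;

  allTheta = zeros(K, size(X, 2));    % row i holds theta for classifier i
  alpha = 0.1;
  m = length(y);

  for i = 1:K
    yi = (y == i);                    % class i is positive, all others negative
    theta = zeros(size(X, 2), 1);
    for iter = 1:2000
      h = sigmoid(X * theta);
      theta = theta - alpha * (1 / m) * X' * (h - yi);
    end
    allTheta(i, :) = theta';
  end

  % Prediction: pick the class whose classifier reports the highest probability
  xNew = [1; 4.5];
  [~, predictedClass] = max(sigmoid(allTheta * xNew))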