Transcript
Page 1

A simple classifier

Ridge regression: a variation on standard linear regression.

Adds a “ridge” term that has the effect of “smoothing” the weights.

Equivalent to training a linear network with weight decay.
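The slides give no code, so here is a minimal NumPy sketch of the idea: the lam * I term below is the “ridge”, and penalizing the squared weights this way has the same smoothing effect as weight decay on a linear unit trained by gradient descent. The function names and toy data are illustrative assumptions, not from the lecture.

```python
import numpy as np

# Minimal ridge-regression sketch: the lam * I "ridge" term penalizes large
# weights (an L2 penalty), matching the effect of weight decay when a linear
# unit is trained by gradient descent.
def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge solution: w = (X^T X + lam * I)^(-1) X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def classify(X, w):
    # Used as a simple classifier by thresholding the linear output at zero.
    return np.sign(X @ w)

# Toy usage with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0]) + 0.1 * rng.normal(size=200))
w = ridge_fit(X, y, lam=1.0)
print("training accuracy:", (classify(X, w) == y).mean())
```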

Page 2

A “Strong” Classifier: SNoW – Sparse Network of Winnows

• Roth et al. 2000 – currently the best reported face detector
• 1. Turn each pixel into a sparse, binary vector
• 2. Activation = sign(Σᵢ wᵢxᵢ)
• 3. Train with the Winnow update rule (sketched below)
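A minimal sketch of this pipeline, assuming a simple intensity-binning encoding for the sparse binary vectors and the classic fixed-threshold form of the Winnow rule; it is not the exact SNoW architecture of Roth et al., and all names below are illustrative.

```python
import numpy as np

# Hedged sketch, not the exact SNoW architecture: each pixel becomes a
# one-hot ("sparse, binary") vector over quantized intensities, and a linear
# unit over those features is trained with multiplicative Winnow updates.
def sparse_encode(image, levels=4):
    """Map each pixel (values in [0, 1)) to a one-hot vector over `levels` bins."""
    bins = np.minimum((image.ravel() * levels).astype(int), levels - 1)
    x = np.zeros(image.size * levels)
    x[np.arange(image.size) * levels + bins] = 1.0
    return x

def winnow_train(X, y, alpha=1.5, epochs=10):
    """Winnow update rule: promote/demote weights of active features on mistakes."""
    w = np.ones(X.shape[1])
    theta = X.shape[1] / 2.0               # fixed threshold, a common choice
    for _ in range(epochs):
        for x, label in zip(X, y):         # labels in {-1, +1}
            pred = 1 if w @ x >= theta else -1
            if pred != label:
                # Multiplicative update on the active (x_i = 1) features only.
                w[x > 0] *= alpha if label == 1 else 1.0 / alpha
    return w, theta

def winnow_predict(X, w, theta):
    # Activation: sign of the weighted sum relative to the threshold.
    return np.where(X @ w >= theta, 1, -1)
```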

Page 3

AdaBoost for Feature Selection

Viola and Jones (2001) used AdaBoost as a feature selection method

For each round of AdaBoost:

For each patch, train a classifier using only that one patch.

Select the best one as the classifier for this round.

Reweight the distribution based on that classifier (see the sketch below).
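The loop above can be sketched directly. This is a hedged illustration of the selection-and-reweighting structure, not Viola and Jones' actual detector: they boosted thresholded Haar-like features, whereas here each “patch” is simply one column of X and the per-patch weak learner is a decision stump; all names are illustrative.

```python
import numpy as np

def train_stump(feature, y, dist):
    """Best threshold/polarity stump on a single feature under weights `dist`."""
    best_err, best_thresh, best_pol = np.inf, 0.0, 1
    for thresh in np.unique(feature):
        for pol in (1, -1):
            pred = np.where(pol * (feature - thresh) >= 0, 1, -1)
            err = dist[pred != y].sum()
            if err < best_err:
                best_err, best_thresh, best_pol = err, thresh, pol
    return best_err, best_thresh, best_pol

def adaboost_select(X, y, n_rounds=10):
    """Each round: train one weak classifier per patch, keep the best, reweight."""
    n, d = X.shape
    dist = np.full(n, 1.0 / n)                   # weight distribution over examples
    selected = []
    for _ in range(n_rounds):
        # Train a classifier using only one patch (feature) at a time.
        stumps = [train_stump(X[:, j], y, dist) for j in range(d)]
        j = int(np.argmin([s[0] for s in stumps]))
        err, thresh, pol = stumps[j]
        err = np.clip(err, 1e-10, 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)  # vote of this round's classifier
        pred = np.where(pol * (X[:, j] - thresh) >= 0, 1, -1)
        # Reweight the distribution: misclassified examples gain weight.
        dist = dist * np.exp(-alpha * y * pred)
        dist /= dist.sum()
        selected.append((j, thresh, pol, alpha))
    return selected
```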

Page 4

Results

[Chart: results for Single SNoW, SNoW + Bagging, Ridge + AdaBoost, SNoW + AdaBoost, SNoW + Bagging + patches, SNoW + AdaBoost + patches, and Ridge + AdaBoost + patches; vertical axis from .00% to .80%.]

Page 5

AdaBoost consistently improves performance

[Chart: Single System vs. Bagging vs. AdaBoost for Global + Ridge, Global + SNoW, Patches + Ridge, and Patches + SNoW; vertical axis from 0% to 25%.]

Page 6

AdaBoost consistently improves performance

[Chart: Single System vs. Bagging vs. AdaBoost for Global + SNoW, Patches + Ridge, and Patches + SNoW; vertical axis from 0% to 7%.]