
Decision Trees and Forests - ml-tau-2013.wdfiles.com/.../week11-decision_trees_part1.pdf

Transcript
Page 1: Decision Trees and Forests

Page 2

Decision Trees

• This week:

– Algorithms for constructing DT

• Next week:

– Pruning DT

– Ensemble methods

• Random Forest


Page 3

Decision Trees - Boolean

[Figure: a Boolean decision tree over attributes x1 and x6. Each internal node tests one Boolean attribute, edges are labeled 1 and 0, and leaves carry the labels +1 and -1.]

Page 4

Decision Trees - Continuous

[Figure: a decision tree with threshold tests. The root tests x1 > 5 with branches T/F; the F branch tests x2 > 2. Each single-threshold test is a decision stump.]

The tree computes the rule:

– x1 > 5: h = +1
– x1 ≤ 5, x2 > 2: h = -1
– x1 ≤ 5, x2 ≤ 2: h = +1
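To make the rule concrete, here is a minimal sketch of this two-level tree as Python code (the function name is mine; Python is used for all sketches in this transcript):

def h(x1, x2):
    # The two-level tree from this slide: stumps x1 > 5 and x2 > 2.
    if x1 > 5:       # root stump, T branch
        return +1
    if x2 > 2:       # second stump, on the x1 <= 5 branch
        return -1
    return +1        # region x1 <= 5 and x2 <= 2

# Example: the point (3, 1) falls in the region x1 <= 5, x2 <= 2, so h = +1.
assert h(3, 1) == +1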

Page 5

Decision Trees: Basic Setup

• Basic class of hypotheses H.

– For example H = {x_i} or H = {x_i > a} (see the sketch after this list)

• Input: a sample of examples

– S = {(x, b)}

• Output: a decision tree

– Each internal node tests a predicate from H
– Each leaf holds a classification value

• Goal (Occam's razor):

– A small decision tree
– that classifies all (or most) examples correctly
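As a concrete reading of H, a minimal sketch of the two basic predicate classes, Boolean attributes and threshold tests (the closure representation and 0-based indexing are my choices, not from the slides):

def boolean_attribute(i):
    # H = {x_i}: the predicate returning the i-th Boolean attribute (i is 0-based).
    return lambda x: x[i]

def threshold(i, a):
    # H = {x_i > a}: the predicate thresholding the i-th attribute at a.
    return lambda x: 1 if x[i] > a else 0

h1 = boolean_attribute(0)   # tests x1
h2 = threshold(0, 5)        # tests x1 > 5
assert h2((7, 1)) == 1 and h2((3, 1)) == 0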


Page 6

Decision Tree: Why?

• Human interpretability

• Efficient algorithms:

– Construction

– Classification

• Performance: reasonable

• Software packages:

– CART

– C4.5 and C5.0


Page 7

Decision Trees Algorithm: Outline

• A natural recursive procedure:

• Decide on a predicate h at the root

• Split the data using h

• Build the right subtree (for h(x) = 1)

• Build the left subtree (for h(x) = 0)

• Running time:

– Time(m) = O(m) + Time(m+) + Time(m-), which is O(m log m) when the splits are roughly balanced
– Tree size < m = sample size


Page 8

DT: Selecting a Predicate

• Basic setting:

[Figure: a node v splits on predicate h into v1 (the h = 0 branch, reached with probability u = Pr[h=0], where p = Pr[f=1 | h=0]) and v2 (the h = 1 branch, probability 1-u = Pr[h=1], where r = Pr[f=1 | h=1]); at v itself, Pr[f=1] = q.]

• Clearly (by total probability): q = u·p + (1-u)·r

Page 9

Potential function: setting

• Compare predicates using a potential function.

– Inputs: q, u, p, r
– Output: a value

• Node dependent:

– For each node v and candidate predicate h, assign a value val(v) = val(u, p, r).
– Q: What about q?! What about the probability of reaching v?!

• Given a split: val(v) = u·val(v1) + (1-u)·val(v2)

• For a tree: a weighted sum over the leaves,

– Val(T) = ∑_{v leaf} q_v · val(v), where q_v is the probability of reaching leaf v
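To fix ideas, a minimal sketch of the tree value as a weighted sum over leaves (the helper name and the example numbers are illustrative):

def tree_value(leaves):
    # Val(T) = sum over leaves v of q_v * val(v), where q_v is the
    # probability of reaching v and val(v) its potential value.
    return sum(q_v * val_v for q_v, val_v in leaves)

# Two leaves reached with probabilities 0.3 and 0.7:
assert abs(tree_value([(0.3, 0.5), (0.7, 0.0)]) - 0.15) < 1e-12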


Page 10

PF: classification error

• Misclassification potential:

– val(v) = min{q, 1-q}

• This potential is the classification error:

– val(T) = fraction of errors T makes on the sample S
– In each leaf, select the error-minimizing label

• Termination:

– Perfect classification

• Val(T) = 0

• Dynamics: the potential never increases as we split


Page 11

PF: classification error

• Initial error: min{0.8, 0.2} = 0.2

• After the split: u·min{p, 1-p} + (1-u)·min{r, 1-r} = 0.5·(0.4) + 0.5·(0) = 0.2

• Is this a good split?

[Figure: node v with q = Pr[f=1] = 0.8 splits on h into v1 (h = 0, u = Pr[h=0] = 0.5, p = Pr[f=1 | h=0] = 0.6) and v2 (h = 1, 1-u = Pr[h=1] = 0.5, r = Pr[f=1 | h=1] = 1).]
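A quick check of the arithmetic, assuming the misclassification potential from the previous slide (a minimal sketch):

def val_err(q):
    # Misclassification potential: val(q) = min{q, 1-q}.
    return min(q, 1 - q)

q, u, p, r = 0.8, 0.5, 0.6, 1.0
before = val_err(q)                            # min{0.8, 0.2} = 0.2
after = u * val_err(p) + (1 - u) * val_err(r)  # 0.5*0.4 + 0.5*0.0 = 0.2
assert abs(before - 0.2) < 1e-9 and abs(after - 0.2) < 1e-9

The split is clearly informative (v2 becomes pure), yet the error potential does not move; this is what motivates the strictly concave potentials on the next slides.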


Page 12

Potential Function: requirements

• Every change is an improvement.

• When it reaches zero, the classification is perfect.

• Strictly concave.

[Figure: val is strictly concave, so for q = u·p + (1-u)·r the chord value u·val(p) + (1-u)·val(r) lies below val(q); a split with p ≠ r therefore strictly decreases the potential.]


Page 13

Potential Functions: Candidates

• Assumptions on val:

– Symmetric: val(q) = val(1-q)

– Concave

– val(0) = val(1) = 0 and val(1/2) = 1/2

• Outcome:

– Error(T) ≤ Val(T)

– so minimizing Val(T) also drives down the error!


Page 14

Potential Functions: Candidates

• Potential functions (all three are sketched in code below):

– val(q) = Gini(q) = 2q(1-q) (CART)

– val(q) = Entropy(q) = -q log q - (1-q) log (1-q) (C4.5)

– val(q) = sqrt{2q(1-q)} (Variance)

• Differences:

– Slightly different behavior

– Same high-level intuition
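For reference, a minimal sketch of the three candidate potentials (base-2 logarithm is an assumption; the slide does not fix the base):

import math

def gini(q):
    # CART's Gini potential: 2q(1-q).
    return 2 * q * (1 - q)

def entropy(q):
    # C4.5's entropy potential: -q log q - (1-q) log(1-q).
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def variance(q):
    # The "variance" potential from the slide: sqrt(2q(1-q)).
    return math.sqrt(2 * q * (1 - q))

# All three are symmetric, concave, zero at the endpoints,
# and they upper bound the misclassification error min{q, 1-q}.
for q in (0.1, 0.3, 0.5, 0.8):
    assert min(gini(q), entropy(q), variance(q)) >= min(q, 1 - q) - 1e-12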


Page 15

DT: Construction Algorithm

Procedure DT(S): S - a sample

• If all the examples in S have the same classification b:

– Create a leaf with label b and return

• For each h, compute val(h, S)

– val(h, S) = u_h·val(p_h) + (1 - u_h)·val(r_h)

• Let h' = argmin_h val(h, S)

• Split S using h' into S0 and S1

• Recursively invoke DT(S0) and DT(S1)

• Q: What about termination?! What is the running time?!
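A runnable sketch of this greedy procedure, assuming Boolean attribute predicates H = {x_i} and the Gini potential (both match the run on the next slides); the tuple encoding of trees is my choice, not from the slides:

def gini(q):
    return 2 * q * (1 - q)

def dt(S, val=gini):
    # Greedy top-down construction (procedure DT above).
    # S is a list of (x, b) pairs, x a tuple of 0/1 attributes, b in {0, 1}.
    labels = {b for _, b in S}
    if len(labels) == 1:                        # all examples agree -> leaf
        return labels.pop()
    best = None
    for i in range(len(S[0][0])):               # each h in H tests one attribute
        S0 = [(x, b) for x, b in S if x[i] == 0]
        S1 = [(x, b) for x, b in S if x[i] == 1]
        if not S0 or not S1:                    # trivial split: no progress
            continue
        u = len(S0) / len(S)                    # u_h = Pr[h = 0]
        p = sum(b for _, b in S0) / len(S0)     # p_h = Pr[f = 1 | h = 0]
        r = sum(b for _, b in S1) / len(S1)     # r_h = Pr[f = 1 | h = 1]
        v = u * val(p) + (1 - u) * val(r)       # val(h, S)
        if best is None or v < best[0]:
            best = (v, i, S0, S1)
    if best is None:                            # no non-trivial split left: majority leaf
        return int(2 * sum(b for _, b in S) >= len(S))
    _, i, S0, S1 = best
    return (i, dt(S0, val), dt(S1, val))        # node: (attribute index, 0-subtree, 1-subtree)

Returning a majority leaf when no predicate splits the data answers the termination question: every recursive call strictly shrinks the sample.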


Page 16

Run of the algorithm

• Potential function: val(q) = 2q(1-q) (Gini)

• Basic hypotheses: the single attributes x_i

• Initially: Val = 0.5

• At the root (a pair (n, k) below means: n examples reach that branch, k of them positive; the first pair is the x_i = 1 branch):

– x1: (8,5) & (2,0)
• Val = 0.8·2·(5/8)·(3/8) + 0.2·0 = 0.375

– x2: (2,2) & (8,3)
• Val = 0.2·0 + 0.8·2·(3/8)·(5/8) = 0.375

• Example:

x       y        x       y
11110   1        10011   0
10010   1        10111   0
11111   1        10011   0
10001   1        00100   0
10101   1        00000   0


Page 17

Run of the algorithm

• At the root (continued):

– x3: (5,3) & (5,2)
• Val = 0.5·2·(3/5)·(2/5) + 0.5·2·(2/5)·(3/5) = 0.48

– x4: (6,3) & (4,2)
• Val = 0.6·2·0.5·0.5 + 0.4·2·0.5·0.5 = 0.5

– x5: (6,3) & (4,2)
• Val = 0.5

• Select x1

– Reduction: 0.5 - 0.375 = 0.125

• Example (same sample):

x       y        x       y
11110   1        10011   0
10010   1        10111   0
11111   1        10011   0
10001   1        00100   0
10101   1        00000   0
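Under the same assumptions as the sketch on page 15, this root computation can be reproduced exactly (the tuple transcription of the sample is mine):

S = [((1,1,1,1,0), 1), ((1,0,0,1,0), 1), ((1,1,1,1,1), 1), ((1,0,0,0,1), 1), ((1,0,1,0,1), 1),
     ((1,0,0,1,1), 0), ((1,0,1,1,1), 0), ((1,0,0,1,1), 0), ((0,0,1,0,0), 0), ((0,0,0,0,0), 0)]

def gini(q):
    return 2 * q * (1 - q)

def split_value(S, i):
    # u*val(p) + (1-u)*val(r) for the split on attribute x_{i+1}.
    S0 = [b for x, b in S if x[i] == 0]
    S1 = [b for x, b in S if x[i] == 1]
    u = len(S0) / len(S)
    return u * gini(sum(S0) / len(S0)) + (1 - u) * gini(sum(S1) / len(S1))

print([round(split_value(S, i), 3) for i in range(5)])
# -> [0.375, 0.375, 0.48, 0.5, 0.5], matching the slides; x1 attains the minimum.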


Page 18

Run of the algorithm

• Root: x1

• Split the sample.

• For x1 = 0: DONE! (why? all its examples are negative)

• For x1 = 1: continue.

– What about val(x1)?!
– For x2: (2,2) & (6,3), val 0.375
– For x3: (4,3) & (4,2), val 0.4375
– For x4: (6,3) & (2,2), val 0.375
– For x5: (6,3) & (2,2), val 0.375

• Select x2

– Reduction: 0.375 - 0.8·0.375 = 0.075

• Example:

x1 = 1           x1 = 0
x       y        x       y
11110   1        00100   0
10010   1        00000   0
11111   1
10001   1
10101   1
10011   0
10111   0
10011   0


Page 19

Run of the algorithm

• Node x2 (inside the x1 = 1 branch)

• Split the sample.

• For x2 = 1: DONE! (both examples are positive)

• For x2 = 0: continue.

– For x3: (2,1) & (4,2), val 0.5
– For x4: (3,1) & (3,2), val 0.444
– For x5: (5,2) & (1,1), val 0.4

• Select x5

• Example:

x2 = 1           x2 = 0
x       y        x       y
11110   1        10010   1
11111   1        10001   1
                 10101   1
                 10011   0
                 10111   0
                 10011   0


Page 20

Run of the algorithm

• Node x5 (inside x1 = 1, x2 = 0)

• Split the sample.

• For x5 = 0: DONE! (its single example is positive)

• For x5 = 1: continue.

– For x3: (2,1) & (3,1), val 0.467
– For x4: (3,0) & (2,2), val 0

• Select x4. DONE!!

• Example:

x5 = 1           x5 = 0
x       y        x       y
10001   1        10010   1
10101   1
10011   0
10111   0
10011   0


Page 21

Resulting tree

[Figure: the resulting decision tree]

x1
├─ 0 → leaf: 0
└─ 1 → x2
        ├─ 1 → leaf: 1
        └─ 0 → x5
                ├─ 0 → leaf: 1
                └─ 1 → x4
                        ├─ 0 → leaf: 1
                        └─ 1 → leaf: 0

Page 22

DT: Performance

• DT size guarantee:

– Greedy does not have a DT size guarantee.

– Consider f(x) = x1 + x2 mod 2 with d attributes: every single-attribute split looks like random guessing, so greedy gets no guidance (see the sketch below).

– Computing the smallest DT is NP-hard.
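A minimal sketch of why parity stalls the greedy rule, assuming uniform examples over d = 4 Boolean attributes (the enumeration is mine):

from itertools import product

def gini(q):
    return 2 * q * (1 - q)

d = 4
S = [(x, (x[0] + x[1]) % 2) for x in product((0, 1), repeat=d)]  # f = x1 + x2 mod 2

for i in range(d):
    S0 = [b for x, b in S if x[i] == 0]
    S1 = [b for x, b in S if x[i] == 1]
    val = 0.5 * gini(sum(S0) / len(S0)) + 0.5 * gini(sum(S1) / len(S1))
    # Every single-attribute split leaves the potential at its initial value 0.5,
    # even though a 4-leaf tree over x1 and x2 computes f exactly.
    assert val == 0.5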

• Boosting analysis:

– If we assume a weak learner (accuracy 1/2 + γ)

– Bounds on the DT size needed to reach error ε:

• exp{O((1/γ²)·(1/ε²)·log²(1/ε))} (Gini / CART)

• exp{O((1/γ²)·log²(1/ε))} (Entropy / C4.5)

• exp{O((1/γ²)·log(1/ε))} (Variance)


Page 23

Something to think about

• AdaBoost: very good bounds

– The number of rounds grows like 1/γ²

• DT: bounds exponential in 1/γ²

• Yet the two give comparable results in practice

• How can that be?


Page 24

Decision Trees and Forests

• This week:

– Algorithms for constructing DT

• Greedy Algorithm

• Potential Function – upper bounds the error

• Next week:

– Pruning DT

– Ensemble Methods

• Random Forest
