Homework 3 solution


This homework contains 4 questions. The last two questions require programming. The maximum
number of points is 100 plus 10 bonus points.
1 Question 1 – Boosting (20 points)
We learned about boosting in lecture; the topic is covered in Murphy 16.4. On page 555 Murphy claims
that “it was proved that one could boost the performance (on the training set) of any weak learner arbitrarily
high, provided the weak learner could always perform slightly better than chance.” We will now verify this claim.
1. (5 points) Given a set of $N$ observations $(x_j, y_j)$ where $y_j \in \{-1, 1\}$ is the label, let $h_t(x)$ be the
weak classifier at step $t$ and let $\alpha_t$ be its weight. First we note that the final classifier is
$$H(x) = \mathrm{sign}\left(\sum_{t=1}^{T} \alpha_t h_t(x)\right).$$
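This weighted-vote rule can be sketched in a few lines of Python. This is only an illustration, not part of the assignment; the two stumps `h1` and `h2` and their weights are made-up examples.

```python
def final_classifier(alphas, weak_learners, x):
    """AdaBoost's final prediction: the sign of the alpha-weighted vote.

    alphas        -- list of weights alpha_t, one per boosting round
    weak_learners -- list of functions h_t mapping x to -1 or +1
    """
    score = sum(a * h(x) for a, h in zip(alphas, weak_learners))
    return 1 if score >= 0 else -1

# Hypothetical example: two decision stumps on a scalar input.
h1 = lambda x: 1 if x > 0 else -1
h2 = lambda x: 1 if x > 2 else -1

# At x = 1.0, h1 votes +1 (weight 0.8) and h2 votes -1 (weight 0.3),
# so the weighted score is 0.5 and the final prediction is +1.
print(final_classifier([0.8, 0.3], [h1, h2], 1.0))
```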
3. (10 points) We showed above that the training error is bounded above by $\prod_{t=1}^{T} Z_t$. At step $t$ the values
$Z_1, Z_2, \ldots, Z_{t-1}$ are already fixed, therefore at step $t$ we can choose $\alpha_t$ to minimize $Z_t$. Let
$$\epsilon_t = \sum_{j=1}^{N} w_j^t \, \delta\big(h_t(x_j) \neq y_j\big)$$
be the weighted training error for weak classifier $h_t(x)$; then we can rewrite the formula for $Z_t$ as:
$$Z_t = (1 - \epsilon_t)\exp(-\alpha_t) + \epsilon_t \exp(\alpha_t)$$
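The grouped form above follows because each correctly classified point contributes $w_j^t e^{-\alpha_t}$ and each misclassified point contributes $w_j^t e^{\alpha_t}$ to $Z_t = \sum_j w_j^t \exp(-\alpha_t y_j h_t(x_j))$. A quick numeric sanity check, with made-up weights, labels, and predictions (not data from the homework):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
w = rng.random(N)
w /= w.sum()                       # normalized weights w_j^t
y = rng.choice([-1, 1], size=N)    # true labels y_j
h = rng.choice([-1, 1], size=N)    # weak-learner predictions h_t(x_j)
alpha = 0.7

# Direct definition: Z_t = sum_j w_j exp(-alpha * y_j * h_t(x_j))
Z_direct = np.sum(w * np.exp(-alpha * y * h))

# Grouped form: correct points (y_j == h_t(x_j)) contribute e^{-alpha},
# misclassified points contribute e^{+alpha}
eps = np.sum(w * (h != y))
Z_grouped = (1 - eps) * np.exp(-alpha) + eps * np.exp(alpha)

print(Z_direct, Z_grouped, np.isclose(Z_direct, Z_grouped))
```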
(a) First find the value of $\alpha_t$ that minimizes $Z_t$, then show that
$$Z_t^{\mathrm{opt}} = 2\sqrt{\epsilon_t(1 - \epsilon_t)}$$
(b) Assume we choose $\alpha_t$ this way. Then rewrite $\epsilon_t = \frac{1}{2} - \gamma_t$, where $\gamma_t > 0$ implies better than