ECE 421 Homework Problems – Tutorial #2 Theme: Perceptron Learning Algorithm and Linear Regression solution


Question 1 (Perceptron Learning Algorithm)
Given a dataset $D = \{(x_n, y_n)\}_{n=1}^{N}$, where $x_n \in \mathbb{R}^d$ and $y_n \in \{+1, -1\}$, we wish to train a perceptron model
$$h(x) = \operatorname{sign}\!\Big(b + \sum_{i=1}^{d} w_i x_i\Big) = \operatorname{sign}(w^\top x), \qquad (1)$$
that correctly classifies all examples in $D$. Consider the perceptron weight update rule (1.3) on Page 7 of LFD, i.e.,
$$w(t+1) = w(t) + y(t)\,x(t). \qquad (2)$$
This weight update rule moves the weights in the direction of classifying examples correctly. To see
this, show the following.
(a) If $x(t)$ is misclassified by $w(t)$, show that $y(t)\,w^\top(t)\,x(t) < 0$.
(b) Use the equation for $w(t+1)$ to show that $y(t)\,w^\top(t+1)\,x(t) > y(t)\,w^\top(t)\,x(t)$.
(c) Argue that the weight update from $w(t)$ to $w(t+1)$ is a move "in the right direction".
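
As a numerical illustration of the quantities in parts (a) and (b), the following Python sketch (not part of the original assignment; the weights and the data point are made up) applies update rule (2) to a misclassified point and prints $y(t)\,w^\top(t)\,x(t)$ before and after the update.

```python
import numpy as np

# Hypothetical example: one misclassified point, with the bias b absorbed
# into the weight vector as w[0] and a leading 1 in x.
w_t = np.array([0.0, -1.0, 0.5])   # current weights w(t), chosen arbitrarily
x_t = np.array([1.0, 2.0, 1.0])    # input x(t) with leading 1 for the bias
y_t = +1.0                         # true label y(t)

# Part (a): the point is misclassified, so y(t) * w(t)^T x(t) < 0.
score_before = y_t * w_t @ x_t
print("y(t) w(t)^T x(t)   =", score_before)   # negative

# Update rule (2): w(t+1) = w(t) + y(t) x(t).
w_next = w_t + y_t * x_t

# Part (b): the same quantity strictly increases after the update.
score_after = y_t * w_next @ x_t
print("y(t) w(t+1)^T x(t) =", score_after)    # larger than score_before
```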
Remark: Problem 1.3 on page 33 walks through the steps toward a rigorous proof of the convergence of the perceptron algorithm. You may wish to look ahead and see whether you can partially solve that problem on your own; the solution will be explained in detail in the tutorials.
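
For context, here is a minimal Python sketch of the full perceptron learning algorithm built around update (2). The toy dataset, the iteration cap, and the choice of which misclassified point to update are assumptions for illustration only, not part of the assignment or of LFD's presentation.

```python
import numpy as np

def pla(X, y, max_iters=1000):
    """Perceptron learning algorithm driven by update rule (2).

    X is an (N, d) array of inputs and y an (N,) array of labels in {+1, -1}.
    A leading column of ones is prepended so the bias b appears as w[0].
    """
    X = np.hstack([np.ones((X.shape[0], 1)), X])  # absorb the bias term
    w = np.zeros(X.shape[1])
    for _ in range(max_iters):
        misclassified = np.where(np.sign(X @ w) != y)[0]
        if misclassified.size == 0:       # every example classified correctly
            return w
        t = misclassified[0]              # pick any misclassified example x(t)
        w = w + y[t] * X[t]               # update (2): w(t+1) = w(t) + y(t) x(t)
    return w

# Toy, linearly separable data (made up for illustration only).
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, 1.0]])
y = np.array([+1.0, +1.0, -1.0, -1.0])
print("learned weights:", pla(X, y))
```

On linearly separable data such as this toy set, the loop terminates once no point is misclassified, which is exactly the convergence property that Problem 1.3 asks you to prove.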