# Assignment #2: COMP4434 Big Data Analytics solution


## Question 1 [10 marks]

(a). [5 points] Consider using linear regression for binary classification on the labels {0, 1}. Here, we use a linear model

$$h_\theta(x) = \theta_1 x + \theta_0$$

and the squared error loss

$$L = \frac{1}{2}\left(h_\theta(x) - y\right)^2.$$

The prediction threshold is set as 0.5, which means the prediction result is 1 if $h_\theta(x) \geq 0.5$ and 0 if $h_\theta(x) < 0.5$.

However, this loss has the problem that it penalizes confident correct predictions, i.e., when $h_\theta(x)$ is greater than 1 (for $y = 1$) or less than 0 (for $y = 0$). Some students try to fix this problem by using an absolute error loss $L = |h_\theta(x) - y|$.

The question is: will it fix the problem? Please answer the question and explain it. Furthermore, some other students try designing another loss function as follows:

$$L = \begin{cases} \max(0, h_\theta(x)), & y = 0 \\ \cdots, & y = 1 \end{cases}$$

Although it is not complete yet, if it is correct in principle, please complete it and explain how it can fix the problem. Otherwise, please explain the reason.
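For the branch that is given, a small check shows the intended behavior: when $y = 0$, a confident correct prediction ($h_\theta(x) \leq 0$) incurs zero loss (a minimal sketch; the $y = 1$ branch is deliberately left incomplete, as in the question):

```python
def loss_y0(h_x):
    # Given branch of the proposed loss for y = 0: L = max(0, h(x)).
    return max(0.0, h_x)

print(loss_y0(-2.0))  # confident correct prediction for y = 0 -> 0.0 (no penalty)
print(loss_y0(0.3))   # wrong-side prediction -> 0.3 (penalized)
```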

(b). [5 points] Consider the logistic regression model $h_\theta(x) = \sigma(\theta^T x)$, trained using the binary cross-entropy loss function, where

$$\sigma(z) = \frac{1}{1 + e^{-z}}$$

is the sigmoid function. Some students try modifying the original sigmoid function into the following one:

$$\sigma(z) = \frac{e^{-z}}{1 + e^{-z}}.$$

The model would still be trained using the binary cross entropy loss. How would the model prediction rule, as well as the learnt model parameters $\theta$, differ from the original model?
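A one-line computation relates the modified function to the original sigmoid (a minimal sketch):

```python
import math

def sigmoid(z):
    # Original sigmoid: 1 / (1 + e^{-z})
    return 1.0 / (1.0 + math.exp(-z))

def modified_sigmoid(z):
    # Modified function: e^{-z} / (1 + e^{-z})
    return math.exp(-z) / (1.0 + math.exp(-z))

# Identity: e^{-z} / (1 + e^{-z}) = 1 - sigmoid(z) = sigmoid(-z),
# i.e. the modified function is the original sigmoid with the two
# classes' roles swapped.
print(modified_sigmoid(0.0))  # -> 0.5, same as sigmoid(0.0)
```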

## Question 2 [20 marks]

Consider using logistic regression for classification problems. Four 3-dimensional data points $(x_1, x_2, x_3)^T$ and the corresponding labels $y_i$ are given as follows.

| Data point | $x_1$ | $x_2$ | $x_3$ | $y$ |
|------------|--------|--------|--------|-----|
| D1 | -0.120 | 0.300 | -0.010 | 1 |
| D2 | 0.200 | -0.030 | -0.350 | -1 |
| D3 | -0.370 | 0.250 | 0.070 | -1 |
| D4 | -0.100 | 0.140 | -0.520 | 1 |

The learning rate $\eta$ is set as 0.2 and the initial parameter $\theta^{[0]}$ is set as [-0.09, 0, -0.19, –
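Since the last entry of the initial parameter vector is truncated in the source, the sketch below uses a placeholder value for it; it also assumes the fourth component is the bias term and that the gradient is averaged over the batch. It is only an illustration of one gradient-descent step for logistic regression with labels in {1, −1}, not the assignment's official computation:

```python
import math

# Data from the table above; labels are in {1, -1}, so the per-point loss is
# L = ln(1 + exp(-y * theta^T x_b)), with x_b the input plus a bias feature.
X = [[-0.120,  0.300, -0.010],
     [ 0.200, -0.030, -0.350],
     [-0.370,  0.250,  0.070],
     [-0.100,  0.140, -0.520]]
y = [1, -1, -1, 1]
eta = 0.2
# Last entry (-0.21) is a PLACEHOLDER: the source truncates theta[0] here.
theta = [-0.09, 0.0, -0.19, -0.21]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Average the gradient of ln(1 + exp(-y * theta^T x_b)) over the four points.
grad = [0.0] * 4
for xi, yi in zip(X, y):
    xb = xi + [1.0]                     # append bias feature
    z = sum(t * f for t, f in zip(theta, xb))
    coeff = -yi * sigmoid(-yi * z)      # dL/dz for this point
    grad = [g + coeff * f for g, f in zip(grad, xb)]

# One batch gradient-descent update.
theta = [t - eta * g / len(X) for t, g in zip(theta, grad)]
```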