## Description

From Chapter 6, page 261 (use R or RStudio).

Q5. It is well known that ridge regression tends to give similar coefficient values to correlated variables, whereas the lasso may give quite different coefficient values to correlated variables. We will now explore this property in a very simple setting. Suppose that n = 2, p = 2, x₁₁ = x₁₂, and x₂₁ = x₂₂. Furthermore, suppose that y₁ + y₂ = 0, x₁₁ + x₂₁ = 0, and x₁₂ + x₂₂ = 0, so that the estimate for the intercept in a least squares, ridge regression, or lasso model is zero: β̂₀ = 0.
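Parts (a) and (c) below ask you to specialize the textbook's penalized objectives to this setting; for reference, equations 6.5 (ridge) and 6.7 (lasso) in Chapter 6 are:

```latex
% Ridge regression objective (ISLR eq. 6.5): RSS plus an L2 penalty
\sum_{i=1}^{n}\Bigl(y_i-\beta_0-\sum_{j=1}^{p}\beta_j x_{ij}\Bigr)^{2}
  +\lambda\sum_{j=1}^{p}\beta_j^{2}

% Lasso objective (ISLR eq. 6.7): RSS plus an L1 penalty
\sum_{i=1}^{n}\Bigl(y_i-\beta_0-\sum_{j=1}^{p}\beta_j x_{ij}\Bigr)^{2}
  +\lambda\sum_{j=1}^{p}\lvert\beta_j\rvert
```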

(a) (3 marks) Show that the ridge regression optimization problem in this setting (i.e., the quantity in equation 6.5 of Chapter 6) reduces to 2(y₁ − (β₁ + β₂)x₁₁)² + λ(β₁² + β₂²).

(b) (5 marks) Show that in the setting of (a), the ridge coefficient estimates satisfy β̂₁ = β̂₂.

(c) (3 marks) Show that the lasso optimization problem in this setting (i.e., the quantity in equation 6.7 of Chapter 6) reduces to 2(y₁ − (β₁ + β₂)x₁₁)² + λ(|β₁| + |β₂|).

(d) (5 marks) Show that in the setting of (c), the lasso coefficient estimates β̂₁ and β̂₂ are not unique; in other words, there are many possible solutions to the optimization problem.

Q8. In this exercise, we will generate simulated data, and will then use this data to perform best subset model selection. Use the rnorm() function to generate a predictor X of length n = 100, as well as a noise vector ε of length n = 100 such that ε = 0.1 * rnorm(n).

(a) (1 mark) Generate (use set.seed(19)) a response vector Y of length n = 100 according to the model

Y = β₀ + β₁X + β₂X² + β₃X³ + ε,

where β₀, β₁, β₂, and β₃ are constants with β₀ = 1.0, β₁ = −0.1, β₂ = 0.05, and β₃ = 0.75.
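The data generation above can be sketched in R as follows; the variable names are illustrative, and placing set.seed(19) before all random draws is one reasonable reading of the instructions:

```r
# Simulate predictor, noise, and cubic response (coefficients from part (a))
set.seed(19)
n <- 100
x <- rnorm(n)               # predictor X of length 100
eps <- 0.1 * rnorm(n)       # noise vector, scaled by 0.1
y <- 1.0 - 0.1 * x + 0.05 * x^2 + 0.75 * x^3 + eps
```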

(b) Use the regsubsets() function to perform best subset selection in order to choose the best model containing the predictors X, X², X³, …, X¹⁰, using the measures Cₚ, BIC, and adjusted R².

(i) (6 marks) Plot each measure against the number of predictors, all on the same page, using par(mfrow=c(2,2)).

(ii) (3 marks) Give the best model coefficients obtained from each of Cₚ, BIC, and adjusted R².

Note:

1. You will need to use the data.frame() function to create a single data set containing both X and Y.
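A starter sketch for part (b), assuming the `x` and `y` vectors from part (a); using `poly(x, 10, raw = TRUE)` to build the columns X, X², …, X¹⁰ is one convenient choice, not the only one:

```r
library(leaps)  # provides regsubsets()

# Single data frame containing the response and the 10 polynomial predictors
dat <- data.frame(y = y, poly(x, 10, raw = TRUE))

# Best subset selection over models with up to 10 predictors
fit <- regsubsets(y ~ ., data = dat, nvmax = 10)
s <- summary(fit)

# One panel per measure, all on the same page
par(mfrow = c(2, 2))
plot(s$rss,   type = "b", xlab = "Number of predictors", ylab = "RSS")
plot(s$cp,    type = "b", xlab = "Number of predictors", ylab = "Cp")
plot(s$bic,   type = "b", xlab = "Number of predictors", ylab = "BIC")
plot(s$adjr2, type = "b", xlab = "Number of predictors", ylab = "Adjusted R^2")
```

The best model under each measure is then read off with, e.g., `coef(fit, which.min(s$bic))`.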

(c) Now fit a ridge regression model to the simulated data, again using X, X², X³, …, X¹⁰ as predictors.

(i) (2 marks) Plot the extracted coefficients as a function of log(λ), with a legend at the top-right corner giving each curve's colour and its predictor name.

(ii) (4 marks) Plot the cross-validation error (set.seed(20)) as a function of log(λ) to find the optimal λ.

(iii) (1 mark) Give coefficient estimates for the optimal value of λ.
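A sketch of the ridge workflow with glmnet (again assuming `x` and `y` from part (a); the predictor matrix construction is an assumption):

```r
library(glmnet)  # glmnet() with alpha = 0 fits ridge regression

X <- poly(x, 10, raw = TRUE)        # columns X, X^2, ..., X^10

# (i) coefficient paths against log(lambda), with a top-right legend
ridge <- glmnet(X, y, alpha = 0)
plot(ridge, xvar = "lambda")
legend("topright", legend = colnames(X), col = 1:10, lty = 1, cex = 0.6)

# (ii) 10-fold cross-validation error against log(lambda)
set.seed(20)
cv <- cv.glmnet(X, y, alpha = 0)
plot(cv)

# (iii) coefficient estimates at the optimal lambda
coef(cv, s = "lambda.min")
```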

(d) Now fit a lasso model to the simulated data, again using X, X², X³, …, X¹⁰ as predictors.

(i) (2 marks) Plot the extracted coefficients as a function of log(λ), with a legend at the top-right corner giving each curve's colour and its predictor name.

(ii) (4 marks) Plot the cross-validation error (set.seed(21)) as a function of log(λ) to find the optimal λ.

(iii) (1 mark) Give coefficient estimates for the optimal value of λ.

Note:

1. Use cv.glmnet() for the cross-validation, keeping its default of 10-fold cross-validation.
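The lasso part mirrors the ridge sketch above, switching to alpha = 1 and the seed specified in (d)(ii); the predictor matrix construction is again an assumption:

```r
library(glmnet)  # alpha = 1 (the glmnet default) fits the lasso

X <- poly(x, 10, raw = TRUE)        # columns X, X^2, ..., X^10

# (i) coefficient paths against log(lambda), with a top-right legend
lasso <- glmnet(X, y, alpha = 1)
plot(lasso, xvar = "lambda")
legend("topright", legend = colnames(X), col = 1:10, lty = 1, cex = 0.6)

# (ii) cv.glmnet() uses 10-fold cross-validation by default
set.seed(21)
cv <- cv.glmnet(X, y, alpha = 1)
plot(cv)

# (iii) coefficient estimates at the optimal lambda; unlike ridge,
# the lasso typically sets some coefficients exactly to zero
coef(cv, s = "lambda.min")
```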