SI251 – Convex Optimization homework 4 solved


1 Proximal Operator

For each of the following convex functions, compute the proximal operator prox_f.

(1) (10 pts) f(x) = λ∥x∥₁, where x ∈ R^d and λ ∈ R_+ is the regularization parameter.

(2) (20 pts) f(X) = λ∥X∥∗, where X ∈ R^{d×m} is a matrix, ∥X∥∗ denotes the nuclear norm, and λ ∈ R_+ is the regularization parameter.
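For reference, both proximal operators admit well-known closed forms: entrywise soft-thresholding for the ℓ1-norm and singular-value soft-thresholding for the nuclear norm. The NumPy sketch below illustrates them; the function names, test sizes, and the ad-hoc sanity check are our own illustrative choices, and the written derivations are still what the problem asks for.

```python
import numpy as np

def prox_l1(v, lam):
    """prox of lam*||x||_1: entrywise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_nuclear(V, lam):
    """prox of lam*||X||_*: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(V, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

# Ad-hoc sanity check: prox_l1(v) minimizes lam*||x||_1 + 0.5*||x - v||^2,
# so random perturbations of it should never achieve a lower objective value.
rng = np.random.default_rng(0)
v, lam = rng.standard_normal(6), 0.5
obj = lambda x: lam * np.abs(x).sum() + 0.5 * np.sum((x - v) ** 2)
x_star = prox_l1(v, lam)
print(all(obj(x_star) <= obj(x_star + 0.1 * rng.standard_normal(6))
          for _ in range(1000)))   # expected: True
```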
2 Alternating Direction Method of Multipliers

(35 pts) Consider the following problem:

    minimize   − log det X + Tr(XC) + ρ∥X∥₁
    subject to X ⪰ 0                                               (1)

In (1), ∥·∥₁ is the entrywise ℓ1-norm. This problem arises in the estimation of sparse undirected graphical models: C is the empirical covariance matrix of the observed data, and the goal is to estimate a covariance matrix with a sparse inverse. In order to apply ADMM, we rewrite (1) as

    minimize   − log det X + Tr(XC) + I_{X⪰0}(X) + ρ∥Y∥₁
    subject to X = Y                                               (2)

where I_{X⪰0}(·) is the indicator function associated with the set X ⪰ 0. Please provide the ADMM update for each variable at the t-th iteration (the derivation process is required).
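As a hedged illustration of what the requested updates look like, here is a minimal NumPy sketch of one standard set of ADMM iterations for (2), written in the scaled-dual form popularized by Boyd et al. The penalty parameter beta, the iteration count, and the function names are our assumptions: the X-update sets the gradient of the X-subproblem to zero and solves it in the eigenbasis of beta(Y − U) − C, the Y-update is entrywise soft-thresholding, and the derivation itself is what the problem asks you to write out.

```python
import numpy as np

def soft_threshold(A, kappa):
    """Entrywise soft-thresholding operator S_kappa."""
    return np.sign(A) * np.maximum(np.abs(A) - kappa, 0.0)

def sparse_inverse_covariance_admm(C, rho, beta=1.0, n_iter=200):
    """ADMM sketch for (2); beta is an assumed penalty parameter and
    U is the scaled dual variable for the constraint X = Y."""
    n = C.shape[0]
    Y = np.eye(n)
    U = np.zeros((n, n))
    for _ in range(n_iter):
        # X-update: the stationarity condition beta*X - X^{-1} = beta*(Y - U) - C
        # is solved in the eigenbasis of the right-hand side; the resulting
        # eigenvalues are strictly positive, so X >> 0 holds automatically.
        lam, Q = np.linalg.eigh(beta * (Y - U) - C)
        x = (lam + np.sqrt(lam ** 2 + 4.0 * beta)) / (2.0 * beta)
        X = Q @ np.diag(x) @ Q.T
        # Y-update: prox of (rho/beta)*||.||_1, i.e. entrywise soft-thresholding.
        Y = soft_threshold(X + U, rho / beta)
        # Scaled dual update.
        U = U + X - Y
    return X, Y

# Illustrative usage on a random empirical covariance (values are arbitrary).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 8))
C_hat = A.T @ A / 50
X_est, Y_est = sparse_inverse_covariance_admm(C_hat, rho=0.1)
```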

3 Monotone Operators and Base Splitting Schemes

(35 pts) Prove the theorem below.

Theorem 1. For v ∈ R^n, the solution of the equation

    u* = (I − JW)^{−T} v                                           (3)

is given by

    u* = v + W^T ũ                                                 (4)

where I is the identity matrix and ũ is a zero of the operator splitting problem 0 ∈ (F + G)(ũ), with the operators defined as

    F(ũ) = (I − W^T)(ũ),   G(ũ) = Dũ − v                           (5)

where D is a diagonal matrix defined by J = (I + D)^{−1} (with J_ii > 0).

(Hint 1: please refer to Monotone Operators-note.pdf.)
(Hint 2: I = (I − JW)^{−T}(I − JW)^T.)
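A quick numerical check of the claim can be reassuring before attempting the proof. The NumPy sketch below (the dimension, the random W and D, and the seed are arbitrary choices of ours) computes u* directly from (3), computes the zero ũ of F + G by solving the linear system (I + D − W^T)ũ = v implied by (5), and confirms that v + W^T ũ matches.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Illustrative data: W with small norm, positive diagonal D (so J_ii > 0).
W = 0.3 * rng.standard_normal((n, n))
D = np.diag(rng.uniform(0.5, 2.0, size=n))
J = np.linalg.inv(np.eye(n) + D)          # J = (I + D)^{-1}
v = rng.standard_normal(n)

# Direct solution of (3): u* = (I - JW)^{-T} v.
u_direct = np.linalg.solve((np.eye(n) - J @ W).T, v)

# Zero of the splitting: 0 = F(u~) + G(u~) = (I - W^T) u~ + D u~ - v,
# i.e. (I + D - W^T) u~ = v.
u_tilde = np.linalg.solve(np.eye(n) + D - W.T, v)

# Claimed reconstruction (4): u* = v + W^T u~.
u_claim = v + W.T @ u_tilde

print(np.allclose(u_direct, u_claim))     # expected: True
```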