## Description

1. Consider the Bayesian model

   $$y \mid \theta_1, \theta_2 \sim N(\theta_1 + \theta_2, 1),$$
   $$\theta_i \overset{\text{iid}}{\sim} N(0, 1),$$

   for $i = 1, 2$. Suppose $y = 1$ is observed. Find the marginal posterior distributions of $\theta_1$ and $\theta_2$. (Hint: (i) regression. (ii) If $I_n$ is an $n \times n$ identity matrix and $J_n$ is an $n \times n$ matrix of 1's, then $(I_n + bJ_n)^{-1} = I_n - \frac{b}{1+nb}J_n$. (iii) If $\theta$ follows a multivariate normal distribution, then the marginal distribution of $\theta_i$ is a normal distribution with the corresponding mean and variance.)
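
As a quick sanity check of hint (ii), the identity can be verified directly using $J_n^2 = nJ_n$:

```latex
(I_n + bJ_n)\Bigl(I_n - \tfrac{b}{1+nb}J_n\Bigr)
  = I_n + bJ_n - \tfrac{b}{1+nb}J_n - \tfrac{nb^2}{1+nb}J_n
  = I_n + b\Bigl(1 - \tfrac{1+nb}{1+nb}\Bigr)J_n
  = I_n.
```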

2. Consider the coin example discussed in class and perform the following simulation. Simulate the weights of 10 coins from $\theta_i \sim N(5.67, .01^2)$ for $i = 1, \dots, 10$. Simulate 10 measurements from $y_i \mid \theta_i \sim N(\theta_i, .02^2)$. Compute the total error sums of squares $SSE_{EB} = \sum_{i=1}^{10} (\theta_i - \hat{\theta}_i^{EB})^2$ and $SSE_{MLE} = \sum_{i=1}^{10} (\theta_i - y_i)^2$. Repeat this 1000 times, plot the densities of the two quantities $SSE_{EB}$ and $SSE_{MLE}$, and comment. (Include your R code with the solutions.)
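
A minimal R sketch of the requested simulation. It assumes an empirical Bayes estimator that shrinks each $y_i$ toward the grand mean with shrinkage factor $\hat{B} = \sigma^2/\widehat{\mathrm{Var}}(y)$; adapt `B.hat` to whatever form of the estimator was derived in class:

```r
set.seed(1)
n <- 10; n.rep <- 1000
sigma <- .02                          # known measurement sd
sse.eb <- sse.mle <- numeric(n.rep)

for (r in 1:n.rep) {
  theta <- rnorm(n, 5.67, .01)        # true coin weights
  y     <- rnorm(n, theta, sigma)     # one measurement per coin
  # EB: shrink each y_i toward the grand mean; shrinkage factor
  # estimated from the marginal variance of y (assumed form)
  B.hat    <- min(1, sigma^2 / var(y))
  theta.eb <- B.hat * mean(y) + (1 - B.hat) * y
  sse.eb[r]  <- sum((theta - theta.eb)^2)
  sse.mle[r] <- sum((theta - y)^2)
}

plot(density(sse.mle), main = "SSE: EB vs MLE")
lines(density(sse.eb), lty = 2)
```

Overlaying the two densities makes the comparison of the EB and MLE total errors immediate.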

3. Let

   $$y_i \mid \theta_i \overset{\text{ind}}{\sim} \mathrm{Poisson}(\theta_i),$$
   $$\theta_i \overset{\text{iid}}{\sim} \mathrm{Exp}(\lambda),$$

   for $i = 1, \dots, n$ (i.e., $p(\theta) = \lambda e^{-\lambda\theta}$). Find the empirical Bayes estimator of $\theta_i$, $i = 1, \dots, n$.
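
One standard way to begin (a sketch of the first step only): integrate out $\theta_i$ to obtain the marginal likelihood of $\lambda$,

```latex
m(y_i \mid \lambda)
  = \int_0^\infty \frac{e^{-\theta}\theta^{y_i}}{y_i!}\,
      \lambda e^{-\lambda\theta}\,d\theta
  = \frac{\lambda}{(1+\lambda)^{y_i+1}},
```

so marginally $y_i$ is Geometric. Maximizing $\prod_i m(y_i \mid \lambda)$ gives $\hat\lambda$, and the EB estimator is $\hat\theta_i^{EB} = E(\theta_i \mid y_i, \hat\lambda)$.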

4. Consider the Bayesian model:

   $$x_i \mid \phi_i \overset{\text{ind}}{\sim} N(0, \phi_i),$$
   $$\frac{1}{\phi_i} \overset{\text{iid}}{\sim} \mathrm{Exp}(\lambda),$$

   for $i = 1, \dots, n$. Find the empirical Bayes estimator of $\phi_i$, $i = 1, \dots, n$. Evaluate the expressions as far as possible.
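
A possible first step (sketch only): work with the precision $\tau_i = 1/\phi_i \sim \mathrm{Exp}(\lambda)$ and integrate it out to get the marginal likelihood of $\lambda$,

```latex
m(x_i \mid \lambda)
  = \int_0^\infty \sqrt{\frac{\tau}{2\pi}}\,
      e^{-\tau x_i^2/2}\,\lambda e^{-\lambda\tau}\,d\tau
  = \frac{\lambda\,\Gamma(3/2)}{\sqrt{2\pi}\,\bigl(x_i^2/2+\lambda\bigr)^{3/2}};
```

maximizing $\prod_i m(x_i \mid \lambda)$ gives $\hat\lambda$, which is then substituted into the posterior mean of $\phi_i$.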