ISyE 6412A Theoretical Statistics HW9 solution

1. (7.44). Let $X_1, \ldots, X_n$ be iid $N(\theta, 1)$. Show that the best unbiased estimator of $\theta^2$ is $\bar{X}_n^2 - (1/n)$. Calculate its variance, and show that it is greater than the Cramér-Rao Lower Bound.

Hints: When computing the variance $\mathrm{Var}(\delta) = E(\delta^2) - [E(\delta)]^2$, you can write $\bar{X}_n = a + bZ$ with $Z \sim N(0, 1)$ and suitable constants $a, b$, and then use the fact that for $Z \sim N(0, 1)$ we have $E(Z) = 0$, $E(Z^2) = 1$, $E(Z^3) = 0$, and $E(Z^4) = 3$.
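As a numerical sanity check (not a substitute for the derivation), a minimal Monte Carlo sketch in Python: it simulates $\delta = \bar{X}_n^2 - 1/n$ and compares its empirical mean and variance with $\theta^2$ and with the Cramér-Rao Lower Bound $4\theta^2/n$. The values of $\theta$, $n$, and the replication count are arbitrary choices.

```python
import numpy as np

# Monte Carlo check for Problem 1: delta = Xbar^2 - 1/n should average to
# theta^2, with variance strictly above the CRLB 4*theta^2/n (the exact
# variance works out to 4*theta^2/n + 2/n^2).
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 20, 200_000

xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)  # sample means
delta = xbar**2 - 1.0 / n

print("mean(delta):", delta.mean(), "  target theta^2:", theta**2)
print("var(delta) :", delta.var(), "  CRLB 4*theta^2/n:", 4 * theta**2 / n)
```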

2. (7.38). For each of the following distributions, let $X_1, \ldots, X_n$ be a random sample. Is there a function of $\theta$, say $g(\theta)$, for which there exists an unbiased estimator whose variance attains the Cramér-Rao Lower Bound? If so, find it. If not, show why not.
(a) $f_\theta(x) = \theta x^{\theta - 1}$, $0 < x < 1$, $\theta > 0$;
(b) $f_\theta(x) = \dfrac{\log \theta}{\theta - 1}\, \theta^x$, $0 < x < 1$, $\theta > 1$.
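For part (a), one candidate worth probing numerically is $T = -\frac{1}{n}\sum_{i=1}^n \log X_i$ as an unbiased estimator of $g(\theta) = 1/\theta$; whether this is the intended answer is left to the derivation, so treat the estimator and the bound $1/(n\theta^2)$ below as assumptions of this sketch.

```python
import numpy as np

# Probe for Problem 2(a): if f_theta(x) = theta * x^(theta-1) on (0,1), then
# -log(X) is Exponential with rate theta, so T = -(1/n) * sum(log X_i) has
# mean 1/theta and variance 1/(n*theta^2), which equals the CRLB for 1/theta.
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 25, 200_000

x = rng.beta(theta, 1.0, size=(reps, n))  # Beta(theta, 1) has this density
T = -np.log(x).mean(axis=1)

print("mean(T):", T.mean(), "  target 1/theta:", 1 / theta)
print("var(T) :", T.var(), "  CRLB 1/(n*theta^2):", 1 / (n * theta**2))
```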

3. (Modified from Problem 7.10). The random variables $X_1, \ldots, X_n$ are iid with probability density function [motivated from a “practical point of view” at the end of this problem]
$$f_{\theta_1, \theta_2}(x) = \begin{cases} \theta_2^{-\theta_1}\, \theta_1 x^{\theta_1 - 1}, & \text{if } 0 < x \le \theta_2; \\ 0, & \text{otherwise,} \end{cases}$$
where $\theta_1 > 0$, $\theta_2 > 0$, and $\Omega$ will be completely specified later.

(a) Assume θ1 is known (positive) and Ω = {θ2 : θ2 > 0}. Find the MLE of θ2.

(b) Assume θ2 is known (positive) and Ω = {θ1 : 0 < θ1 < ∞}. Find the MLE of θ1.

(c) Show that the estimator in (a) is biased, but in case (b) the MLE of 1/θ1 is unbiased.
[Hints: $-\int_0^1 x^{\alpha - 1} (\log x)\, dx = \alpha^{-2}$; incidentally, the MLE of $\theta_1$ itself is biased.]

(d) Assume both θ1 and θ2 are unknown, and Ω = {(θ1, θ2) : 0 < θ1 < ∞, 0 < θ2 < ∞}.
i. Find a two-dimensional sufficient statistic for (θ1, θ2).
ii. Find the MLEs of θ1 and θ2.
iii. Find the MLE of $\varphi(\theta_1, \theta_2) = P_{\theta_1, \theta_2}(X_1 > 1)$.

iv. The length (in millimeters) of cuckoos’ eggs found in hedge sparrow nests can be modelled
with this distribution. For the data
22.0, 23.9, 20.9, 23.8, 25.0, 24.0, 21.7, 23.8, 22.8, 23.1, 23.1, 23.5, 23.0, 23.0,
compute the values of the MLEs in parts (ii) and (iii) (a numerical sketch follows the bracketed note below).

[Model that could yield such a problem: There are iid random variables $Y_j$, uniformly distributed from 0 to $\theta_2$. You send an observer out on each of $n$ successive days to observe some $Y_j$'s. He does not record the $Y_j$'s. Instead, knowing that “the maximum of the $Y_j$'s is sufficient and an MLE,” he decides to observe a certain number, $\theta_1$, of the $Y_j$'s each day and computes the maximum of these $\theta_1$ observations.

He reports to you the value of $X_i$, the maximum he computes on the $i$-th day. Unfortunately, he forgets to tell you the $\theta_1$ he used. Then the $X_i$ have the density function stated for this problem, where we have simplified matters by allowing $\theta_1$ to be any positive value instead of restricting it to integers.]
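For part (d)iv, assuming the MLEs from parts (i)-(ii) take the forms $\hat{\theta}_2 = \max_i X_i$ and $\hat{\theta}_1 = n / \sum_i \log(\hat{\theta}_2 / X_i)$, with $\hat{\varphi} = 1 - \hat{\theta}_2^{-\hat{\theta}_1}$ by MLE invariance (all three forms are assumptions of this sketch, to be verified in the derivation), the egg-length data can be evaluated as follows:

```python
import numpy as np

# Problem 3(d)iv: evaluate the (assumed) MLE formulas on the cuckoo-egg data.
x = np.array([22.0, 23.9, 20.9, 23.8, 25.0, 24.0, 21.7,
              23.8, 22.8, 23.1, 23.1, 23.5, 23.0, 23.0])
n = len(x)

theta2_hat = x.max()                             # sample maximum, 25.0
theta1_hat = n / np.sum(np.log(theta2_hat / x))  # roughly 12.6
phi_hat = 1.0 - theta2_hat ** (-theta1_hat)      # P(X1 > 1), essentially 1

print("theta1_hat:", theta1_hat)
print("theta2_hat:", theta2_hat)
print("phi_hat   :", phi_hat)
```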

4. Recall that in problem 6.3 of our text (i.e., problem #6 of HW #7 with special cases in problem #5
of HW #7 and problem #2 of HW #8), X1, . . . , Xn are assumed to be a random sample from the pdf
$$f(x \mid \mu, \sigma) = \frac{1}{\sigma}\, e^{-(x - \mu)/\sigma}, \quad \mu \le x < \infty,\ 0 < \sigma < \infty.$$

In each of the following three scenarios, estimate the parameter(s) using both the maximum likelihood
estimator (MLE) and the best unbiased estimator:
(a) Assume that σ is known. Find both the MLE and the best unbiased estimator of µ.
(b) Assume that µ is known. Find both the MLE and the best unbiased estimator of σ.
(c) Assume that both µ and σ are unknown. Find both the MLEs and the best unbiased estimators of µ and σ.
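A minimal simulation sketch for scenario (a), assuming the facts the exercise asks you to establish: the MLE of $\mu$ is $X_{(1)} = \min_i X_i$, its mean is $\mu + \sigma/n$, and subtracting $\sigma/n$ therefore removes the bias.

```python
import numpy as np

# Problem 4(a) check under known sigma: the minimum of a shifted-exponential
# sample has mean mu + sigma/n, so X_(1) - sigma/n should be unbiased for mu.
# (Both claims are assumptions of this sketch.)
rng = np.random.default_rng(2)
mu, sigma, n, reps = 5.0, 2.0, 10, 200_000

x = mu + rng.exponential(sigma, size=(reps, n))  # density (1/sigma)e^{-(x-mu)/sigma}
x1 = x.min(axis=1)

print("E[X_(1)]           :", x1.mean(), "  vs mu + sigma/n =", mu + sigma / n)
print("E[X_(1) - sigma/n] :", (x1 - sigma / n).mean(), "  vs mu =", mu)
```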

5. (Modified from Problem 7.9). Let $X_1, \ldots, X_n$ be iid with pdf
$$f_\theta(x) = \frac{1}{\theta}, \quad 0 \le x \le \theta,\ \theta > 0.$$
(a) Estimate θ using both the method of moments and maximum likelihood.
(b) Calculate the means and variances of the two estimators in part (a). Which one should be preferred
and why?

(c) One can improve the MLE $\hat{\theta}_{\mathrm{MLE}}$ to an unbiased estimator of the form $\delta_c = c\, \hat{\theta}_{\mathrm{MLE}}$. Find a constant $c$ such that $E_\theta(\delta_c) = \theta$, i.e., $\delta_c = c\, \hat{\theta}_{\mathrm{MLE}}$ is an unbiased estimator of $\theta$. Is it the best unbiased estimator of $\theta$?

(d) The best estimator of the form $\delta_c = c\, \hat{\theta}_{\mathrm{MLE}}$ is the one that uniformly minimizes the risk function $R_{\delta_c}(\theta) = E_\theta(\delta_c - \theta)^2$. Find such a constant $c$.
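A simulation sketch comparing the estimators across parts (a)-(d). The constants $c = (n+1)/n$ (unbiasedness) and $c = (n+2)/(n+1)$ (minimum risk among multiples of the MLE) are what the algebra in (c)-(d) should produce; treat them here as assumptions to check.

```python
import numpy as np

# Problem 5: method of moments (2*Xbar) vs MLE (sample max) for Uniform(0, theta),
# plus the rescaled versions c * max with the constants quoted above.
rng = np.random.default_rng(3)
theta, n, reps = 4.0, 12, 200_000

x = rng.uniform(0.0, theta, size=(reps, n))
mom, mle = 2 * x.mean(axis=1), x.max(axis=1)

for name, est in [("MoM 2*Xbar", mom),
                  ("MLE max", mle),
                  ("(n+1)/n * max", (n + 1) / n * mle),
                  ("(n+2)/(n+1) * max", (n + 2) / (n + 1) * mle)]:
    print(f"{name:18s}  mean={est.mean():.4f}  MSE={np.mean((est - theta)**2):.5f}")
```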

6. (This is to show that sometimes the MLE has poor performance). Suppose that $X_1, \ldots, X_n$ are iid with density
$$f_\theta(x) = \begin{cases} \dfrac{2\theta^2}{(x + \theta)^3}, & \text{if } x > 0; \\ 0, & \text{if } x \le 0, \end{cases}$$
where $\Omega = \{\theta : \theta > 0\}$.

(a) If $n = 1$, show that an MLE of $\theta$ is $\hat{\theta}_a = 2X_1$.
(b) Show that $\hat{\theta}_a$ in part (a) is not an unbiased estimator of $\theta$.
[Verify or believe: $\int_0^\infty \frac{x}{(x + 1)^3}\, dx = \int_1^\infty \frac{u - 1}{u^3}\, du = \frac{1}{2}$ with $u = x + 1$.]

(c) Under the squared error loss function $L(\theta, d) = (\theta - d)^2$, show that $\hat{\theta}_a$ in part (a) is much worse than the constant estimator $\hat{\theta}^* \equiv 17$. [Hints: $\int_0^\infty \frac{x^2}{(1 + x)^3}\, dx = +\infty$.]
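To see the poor performance concretely, one can simulate. Assuming the density displayed above (with the $2\theta^2$ normalization), the cdf is $F_\theta(x) = 1 - \theta^2/(x+\theta)^2$ for $x > 0$, so inversion gives $X = \theta(U^{-1/2} - 1)$ with $U \sim \mathrm{Uniform}(0, 1)$. Because $E(X^2) = +\infty$, the empirical risk of $\hat{\theta}_a = 2X_1$ keeps drifting upward as the number of replications grows, while $\hat{\theta}^* \equiv 17$ has fixed risk $(17 - \theta)^2$.

```python
import numpy as np

# Problem 6(c) illustration: sample X by inversion, X = theta * (U**-0.5 - 1).
# E(X^2) is infinite, so the empirical MSE of 2*X1 never stabilizes, whereas
# the constant estimator 17 has risk (17 - theta)^2 = 256 when theta = 1.
rng = np.random.default_rng(4)
theta = 1.0

for reps in [10**4, 10**5, 10**6, 10**7]:
    x1 = theta * (rng.uniform(size=reps) ** -0.5 - 1.0)
    mse = np.mean((2 * x1 - theta) ** 2)  # tends to keep growing with reps
    print(f"reps={reps:>9d}  empirical MSE of 2*X1: {mse:12.2f}")
```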
(d) If $n = 2$, show that an MLE of $\theta$ is $\hat{\theta}_b = \frac{1}{4}\left[X_1 + X_2 + \sqrt{X_1^2 + 34 X_1 X_2 + X_2^2}\,\right]$.
[If you want, you can consider the general n by yourself. For general n, describe the computation of
the MLE in terms of solving a polynomial equation of some degree, checking whether a local maximum
is a global maximum, etc.]
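A quick numerical cross-check of the $n = 2$ formula: maximize the log-likelihood on a grid and compare with the closed form. The two observations are arbitrary choices, and the log-likelihood below drops constants that do not depend on $\theta$.

```python
import numpy as np

# Problem 6(d) cross-check for n = 2, using the density stated for this problem:
# log-likelihood (up to constants) is 4*log(theta) - 3*log(x1+theta) - 3*log(x2+theta).
x1, x2 = 1.3, 4.2  # arbitrary positive observations (assumption of this sketch)

theta = np.linspace(1e-3, 50.0, 500_001)
loglik = 4 * np.log(theta) - 3 * np.log(x1 + theta) - 3 * np.log(x2 + theta)

closed = (x1 + x2 + np.sqrt(x1**2 + 34 * x1 * x2 + x2**2)) / 4
print("closed form:", closed, "  grid argmax:", theta[np.argmax(loglik)])
```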