Probability & Statistics for EECS Homework 09 solved

1. Show the proof of general Bayes’ Rule (four cases).
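For reference, the "four cases" presumably refer to whether each of the two random variables is discrete or continuous (this is an assumption about the course's intent). The four statements to prove would then be, writing conditional PMFs as P and conditional PDFs as f:

```latex
% X, Y both discrete:
P(Y = y \mid X = x) = \frac{P(X = x \mid Y = y)\,P(Y = y)}{P(X = x)}
% X discrete, Y continuous:
f_Y(y \mid X = x) = \frac{P(X = x \mid Y = y)\,f_Y(y)}{P(X = x)}
% X continuous, Y discrete:
P(Y = y \mid X = x) = \frac{f_X(x \mid Y = y)\,P(Y = y)}{f_X(x)}
% X, Y both continuous:
f_{Y \mid X}(y \mid x) = \frac{f_{X \mid Y}(x \mid y)\,f_Y(y)}{f_X(x)}
```
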
2. Let X and Y be i.i.d. Geom(p), and N = X + Y.
(a) Find the joint PMF of X, Y, N.
(b) Find the joint PMF of X and N.
(c) Find the conditional PMF of X given N = n, and give a simple description in
words of what the result says.
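Part (c) can be sanity-checked by simulation before (or after) you do the algebra. A minimal sketch, assuming the convention that Geom(p) counts failures before the first success (support 0, 1, 2, ...), with p = 0.3 and n = 4 as arbitrary illustration values:

```python
import random

random.seed(0)
p = 0.3
n_target = 4

def geom(p):
    """Geom(p): number of failures before the first success (support 0, 1, 2, ...)."""
    k = 0
    while random.random() > p:
        k += 1
    return k

# Tabulate the empirical conditional PMF of X given N = n_target.
counts = [0] * (n_target + 1)
total = 0
for _ in range(200_000):
    x, y = geom(p), geom(p)
    if x + y == n_target:
        counts[x] += 1
        total += 1

print([round(c / total, 3) for c in counts])  # empirical P(X = k | N = 4), k = 0..4
```

Comparing the five empirical probabilities should suggest the "simple description in words" the problem asks for.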

3. Let X ∼ Expo(λ), and let c be a positive constant.
(a) If you remember the memoryless property, you already know that the conditional
distribution of X given X > c is the same as the distribution of c + X (think
of waiting c minutes for a “success” and then having a fresh Expo(λ) additional
waiting time). Derive this in another way, by finding the conditional CDF of X
given X > c and the conditional PDF of X given X > c.
(b) Find the conditional CDF and conditional PDF of X given X < c.
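Before deriving the conditional CDF in 3(a), the claimed equality of distributions can be checked numerically. A minimal sketch (λ = 2 and c = 0.5 are arbitrary illustration values): compare the sample mean of X conditioned on X > c against the sample mean of c plus a fresh Expo(λ) draw; if the two distributions agree, the means should match.

```python
import random

random.seed(0)
lam, c = 2.0, 0.5

# Draw many Expo(lam) values and keep only those exceeding c.
cond = [x for x in (random.expovariate(lam) for _ in range(400_000)) if x > c]
# Independent comparison sample: c plus a fresh Expo(lam) draw.
shifted = [c + random.expovariate(lam) for _ in range(len(cond))]

print(sum(cond) / len(cond), sum(shifted) / len(shifted))  # should be close
```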

4. Let U1, U2, U3 be i.i.d. Unif(0, 1), and let L = min(U1, U2, U3), M = max(U1, U2, U3).
(a) Find the marginal CDF and marginal PDF of M, and the joint CDF and joint
PDF of L, M.
(b) Find the conditional PDF of M given L.
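A quick way to check the marginal CDF you derive in 4(a) is to estimate P(M ≤ t) by simulation at a few values of t and compare with your formula. A minimal sketch (t = 0.8 is an arbitrary check point):

```python
import random

random.seed(0)
n = 200_000
# For each trial, record (min, max) of three i.i.d. Unif(0, 1) draws.
pairs = [(min(u), max(u))
         for u in ([random.random() for _ in range(3)] for _ in range(n))]

t = 0.8
p_hat = sum(m <= t for _, m in pairs) / n
print(p_hat)  # empirical CDF of M at t = 0.8; compare with your derived F_M(t)
```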

5. Let X and Y be i.i.d. Geom(p), L = min(X, Y), and M = max(X, Y).
(a) Find the joint PMF of L and M. Are they independent?
(b) Find the marginal distribution of L in two ways: using the joint PMF, and using
a story.
(c) Find E[M].
(d) Find the joint PMF of L and M − L. Are they independent?
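
For 5(d), a simulation can hint at the answer before you prove anything: estimate the probability of one joint event and compare it with the product of the corresponding marginal probabilities. A minimal sketch (p = 0.5 is an arbitrary illustration value, and {L = 0, M − L = 0} is just one test event, so agreement here is evidence, not proof):

```python
import random

random.seed(0)
p = 0.5

def geom(p):
    """Geom(p): number of failures before the first success (support 0, 1, 2, ...)."""
    k = 0
    while random.random() > p:
        k += 1
    return k

n = 200_000
pairs = [(min(x, y), max(x, y)) for x, y in ((geom(p), geom(p)) for _ in range(n))]

# Compare the joint event {L = 0, M - L = 0} with the product of its marginals.
p_l0 = sum(l == 0 for l, m in pairs) / n
p_d0 = sum(m - l == 0 for l, m in pairs) / n
p_joint = sum(l == 0 and m == 0 for l, m in pairs) / n
print(p_joint, p_l0 * p_d0)
```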