EECS 349 Homework 6 solution


Building a GMM classifier

Load "gmm_test.csv" and "gmm_train.csv". These files contain two labeled samples of data, X_test and X_train. The labels for each sample are contained in Y_test and Y_train, so Y_train(i) contains the label for X_train(i). Each sample was drawn from two different underlying populations, labeled as classes "1" and "2". In this homework, you will model each class in the training data with its own GMM, then design a classifier that uses your GMMs to classify X_test, and evaluate your classification results on the test data. You must submit your source code for "gmm_est.py" and "gmmclassify.py" as described below.

Note: You are welcome to make your functions work for multivariate (a.k.a. multidimensional) GMMs, but you are not required to do so.

1. (2 points) Implement the EM algorithm to estimate the parameters (means, variances, weights) of a 1-dimensional GMM (a sketch follows this list). We have provided starter code in gmm_est.py that looks like this:

    def gmm_est(X, mu_init, sigmasq_init, wt_init, its):
        """
        Input Parameters:
          - X           : N 1-dimensional data points (a 1-by-N np array)
          - mu_init     : initial means of K Gaussian components (a 1-by-K np array)
          - sigmasq_init: initial variances of K Gaussian components (a 1-by-K np array)
          - wt_init     : initial weights of K Gaussian components (a 1-by-K np array, sums to 1)
          - its         : number of iterations for the EM algorithm

        Returns:
          - mu      : means of the Gaussian components (a 1-by-K np array)
          - sigmasq : variances of the Gaussian components (a 1-by-K np array)
          - wt      : weights of the Gaussian components (a 1-by-K np array, sums to 1)
          - L       : log likelihood
        """

   At this point your code should not output anything; it should just be the implementation of the EM algorithm. In part 2, you will use the gmm_est() function you have written.

2. (2 points) Test your function by building a mixture model for each of the two classes in X_train. Choose an appropriate number of Gaussian components and initializations of the parameters. In your write-up, include a plot of the data log-likelihood values for the first 20 iterations. Do you think your program has converged? In your write-up, report the final values of the GMM parameters for class 1 and for class 2.
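The assignment only fixes the interface above, so the following is a minimal sketch of one way gmm_est() could be written. It assumes L is returned as the per-iteration data log-likelihood history (convenient for the plot in part 2); the update order and vectorization choices are this sketch's, not the assignment's.

    import numpy as np

    def gmm_est(X, mu_init, sigmasq_init, wt_init, its):
        X = np.asarray(X, dtype=float).ravel()            # N data points
        mu = np.asarray(mu_init, dtype=float).ravel().copy()
        sigmasq = np.asarray(sigmasq_init, dtype=float).ravel().copy()
        wt = np.asarray(wt_init, dtype=float).ravel().copy()
        N, K = X.size, mu.size
        L = []                                            # log-likelihood per iteration

        for _ in range(its):
            # E-step: responsibility of each component for each point, shape (N, K)
            dens = np.exp(-(X[:, None] - mu) ** 2 / (2.0 * sigmasq)) \
                   / np.sqrt(2.0 * np.pi * sigmasq)
            weighted = wt * dens                          # numerator of responsibilities
            totals = weighted.sum(axis=1, keepdims=True)  # mixture density at each point
            resp = weighted / totals

            # Data log-likelihood under the current parameters
            L.append(np.log(totals).sum())

            # M-step: re-estimate weights, means, and variances
            Nk = resp.sum(axis=0)                         # effective count per component
            wt = Nk / N
            mu = (resp * X[:, None]).sum(axis=0) / Nk
            sigmasq = (resp * (X[:, None] - mu) ** 2).sum(axis=0) / Nk

        return mu, sigmasq, wt, np.array(L)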
Your program should be runnable from the command line with the following command: python gmm_est.py
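Below is one possible driver for part 2, assumed to live in the same gmm_est.py file as the sketch above so that python gmm_est.py runs it. The CSV layout (two comma-separated columns, X then Y, no header), the choice of two components per class, and the moment-based initial values are all assumptions for illustration; adjust them to the actual data.

    import numpy as np
    import matplotlib.pyplot as plt

    def main():
        # Assumed layout: column 0 = X_train, column 1 = Y_train (adjust if the file differs)
        data = np.loadtxt('gmm_train.csv', delimiter=',')
        X_train, Y_train = data[:, 0], data[:, 1]

        for label in (1, 2):
            Xc = X_train[Y_train == label]
            # Placeholder initializations: two components straddling the class mean
            mu0 = np.array([Xc.mean() - Xc.std(), Xc.mean() + Xc.std()])
            sigmasq0 = np.array([Xc.var(), Xc.var()])
            wt0 = np.array([0.5, 0.5])

            mu, sigmasq, wt, L = gmm_est(Xc, mu0, sigmasq0, wt0, its=20)
            print('class %d: mu=%s sigmasq=%s wt=%s' % (label, mu, sigmasq, wt))
            plt.plot(range(1, len(L) + 1), L, label='class %d' % label)

        plt.xlabel('EM iteration')
        plt.ylabel('data log-likelihood')
        plt.legend()
        plt.show()

    if __name__ == '__main__':
        main()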