NPTEL Introduction to Machine Learning Week 11 Assignment Answers 2024
1. Which of the following is/are estimated by the Expectation Maximization (EM) algorithm for a Gaussian Mixture Model (GMM)?
- (a) K (number of components)
- (b) πk (mixing coefficient of each component)
- (c) μk (mean vector of each component)
- (d) Σk (covariance matrix of each component)
- (e) None of the above
Answer :- b, c, d
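For context, the standard EM updates for a GMM re-estimate all three parameter sets in every M step, while the number of components K is fixed in advance by the user (the update equations below follow the usual textbook formulation):

$$
N_k = \sum_{n=1}^{N} \gamma(z_{nk}), \qquad
\pi_k = \frac{N_k}{N}, \qquad
\mu_k = \frac{1}{N_k}\sum_{n=1}^{N} \gamma(z_{nk})\, x_n, \qquad
\Sigma_k = \frac{1}{N_k}\sum_{n=1}^{N} \gamma(z_{nk})\,(x_n - \mu_k)(x_n - \mu_k)^{\top}.
$$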
4. Select the correct statement(s) about the EM algorithm for GMMs.
- (a) In the m-th iteration, the γ(znk) values are computed using the parameter estimates computed in the same iteration.
- (b) In the m-th iteration, the γ(znk) values are computed using the parameter estimates computed in the (m−1)-th iteration.
- (c) The Σk parameter estimates are computed during the E step.
- (d) The πk parameter estimates are computed during the M step.
Answer :- b, d
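For context, the E step of the m-th iteration computes the responsibilities from the parameter estimates produced by the (m−1)-th M step:

$$
\gamma(z_{nk}) = \frac{\pi_k^{(m-1)}\, \mathcal{N}\!\big(x_n \mid \mu_k^{(m-1)}, \Sigma_k^{(m-1)}\big)}{\sum_{j=1}^{K} \pi_j^{(m-1)}\, \mathcal{N}\!\big(x_n \mid \mu_j^{(m-1)}, \Sigma_j^{(m-1)}\big)},
$$

and the parameters πk, μk, Σk are then re-estimated in the subsequent M step.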
5. For questions 5 to 7, use the following data consisting of 8 points (xi,yi).

Fit a GMM with 2 components to this data. What are the mixing coefficients of the learned components? (Note: use the sklearn implementation of GMM with random_state=0 and leave the other parameters at their defaults; a code sketch follows the answer below.)
- (a) (0.791, 0.209)
- (b) (0.538, 0.462)
- (c) (0.714, 0.286)
- (d) (0.625, 0.375)
Answer :- d
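Since the assignment's 8-point data table is not reproduced above, here is a minimal sketch of the prescribed procedure, assuming a placeholder array `X` (illustrative rows only, not the actual data); on the real table, `weights_` gives the mixing coefficients asked for:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Placeholder data: the assignment's 8-point table is not reproduced here,
# so these rows are illustrative only.
X = np.array([
    [1.0, 1.5], [0.9, 1.6], [1.8, 1.2], [1.2, 1.0],
    [7.8, 9.5], [8.8, 7.5], [7.6, 8.0], [8.2, 7.3],
])

# 2-component GMM with random_state=0, all other parameters at their defaults.
gmm_a = GaussianMixture(n_components=2, random_state=0).fit(X)

print(gmm_a.weights_)       # mixing coefficients pi_k
print(gmm_a.means_)         # component means mu_k
print(gmm_a.covariances_)   # component covariances Sigma_k
```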
6. Using the model trained in question 5, compute the log-likelihood of the following points. Which of these points has the highest likelihood of being sampled from the model?
- (a) (2.0, 0.5)
- (b) (-1.0, -0.5)
- (c) (7.5, 8.0)
- (d) (5.0, 5.5)
Answer :- d
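A minimal continuation of the sketch above: `score_samples` returns the per-point log-likelihood under the fitted model, and the candidate with the largest value is the one most likely to be sampled.

```python
# Continuing from the sketch above (gmm_a fitted on the placeholder data X).
candidates = np.array([[2.0, 0.5], [-1.0, -0.5], [7.5, 8.0], [5.0, 5.5]])
log_lik = gmm_a.score_samples(candidates)  # log p(x) for each candidate point
print(log_lik)  # the candidate with the largest value is the most likely draw
```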
7. Let Model A be the GMM with 2 components that was trained in question 5. Using the same data from question 5, estimate a GMM with 3 components (Model B). (Note: use the sklearn implementation of GMM with random_state=0 and all the other default parameters; a code sketch follows the answer below.)
Select the pair(s) of points that have the same label in Model A but different labels in Model B.
- (a) (1.0, 1.5) and (0.9, 1.6)
- (b) (1.8, 1.2) and (0.9, 1.6)
- (c) (7.8, 9.5) and (8.8, 7.5)
- (d) (7.8, 9.5) and (7.6, 8.0)
- (e) (8.2, 7.3) and (7.6, 8.0)
Answer :- a, c
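A minimal continuation of the same sketch for this question: Model B is a 3-component GMM fitted on the same (placeholder) data, and each candidate pair is checked for agreement of predicted labels under the two models.

```python
# Continuing from the sketches above (X and gmm_a are placeholders, not the
# assignment's actual data/model).
gmm_b = GaussianMixture(n_components=3, random_state=0).fit(X)

pairs = [
    ([1.0, 1.5], [0.9, 1.6]),
    ([1.8, 1.2], [0.9, 1.6]),
    ([7.8, 9.5], [8.8, 7.5]),
    ([7.8, 9.5], [7.6, 8.0]),
    ([8.2, 7.3], [7.6, 8.0]),
]
for p, q in pairs:
    labels_a = gmm_a.predict([p, q])  # cluster labels under Model A (2 components)
    labels_b = gmm_b.predict([p, q])  # cluster labels under Model B (3 components)
    print(p, q,
          "same label in A:", labels_a[0] == labels_a[1],
          "same label in B:", labels_b[0] == labels_b[1])
```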
8. Consider the following two statements.
Statement A: In a GMM with two or more components, the likelihood can attain arbitrarily high values.
Statement B: The likelihood increases monotonically with each iteration of EM.
- (a) Both the statements are correct, and Statement B is the correct explanation for Statement A.
- (b) Both the statements are correct, but Statement B is not the correct explanation for Statement A.
- (c) Statement A is correct and Statement B is incorrect.
- (d) Statement A is incorrect and Statement B is correct.
- (e) Both the statements are incorrect.
Answer :- b
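To see why Statement A holds, consider the well-known GMM singularity: if one component's mean coincides with a data point and its covariance shrinks to zero, that component's density at the point grows without bound,

$$
\mathcal{N}\!\big(x_n \mid \mu_k = x_n,\ \Sigma_k = \sigma^2 I\big) = \frac{1}{(2\pi\sigma^2)^{d/2}} \;\longrightarrow\; \infty \quad \text{as } \sigma \to 0,
$$

so the likelihood is unbounded. Statement B is a separate property of EM (the likelihood never decreases from one iteration to the next) and does not explain this unboundedness, which is why option (b) is the answer.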