Introduction to Machine Learning Week 4 NPTEL Assignment Answers 2025


1. For a two-class classification problem, we use an SVM classifier and obtain the following separating hyperplane. We have marked 4 instances of the training data. Identify the point whose removal will have the most impact on the shape of the decision boundary.

  • 1
  • 2
  • 3
  • 4
Answer :- a

2. Consider a soft-margin SVM with a linear kernel, trained on n points. The number of support vectors returned is k. If we add one extra point to the dataset and retrain the classifier, what is the maximum possible number of support vectors that can be returned (assuming the tuning parameter C can take any value)?

  • k
  • n
  • n + 1
  • k + 1
Answer :- c
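The answer n + 1 can be seen empirically: with a very small C the margin becomes very wide and many (in the extreme, all) training points end up as support vectors. A minimal sketch, assuming a synthetic 2-D dataset and an illustrative C value (neither is from the question):

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic, roughly linearly separable data (placeholder, not the question's data)
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# With a tiny C, the soft-margin penalty is weak, the margin widens,
# and many points fall inside it and become support vectors.
clf = SVC(kernel="linear", C=1e-4).fit(X, y)
print("support vectors:", len(clf.support_), "out of", len(X))
```

Re-running with a large C (e.g. `C=1e4`) shrinks the margin and typically leaves only a handful of support vectors, which is why the maximum over all C is the whole dataset, n + 1 after adding a point.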

3. Consider the data set given below.

Claim: The PLA (Perceptron Learning Algorithm) can learn a classifier that achieves zero misclassification error on the training data. This claim is:

  • True
  • False
  • Depends on the initial weights
  • True, only if we normalize the feature vectors before applying PLA.
Answer :- b

4. Consider the following dataset:

(Note: x is the feature and y is the output)
Which of these is not a support vector when using a Support Vector Classifier with a polynomial kernel with degree = 3, C = 1, and gamma = 0.1?
(We recommend using sklearn to solve this question.)

  • 3
  • 1
  • 9
  • 10
Answer :- a
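As the question suggests, sklearn makes this easy to check. A minimal sketch of the approach; the feature and label arrays below are placeholders (the assignment's actual table is not reproduced here), so substitute the real data before reading off the answer:

```python
import numpy as np
from sklearn.svm import SVC

# Placeholder data -- replace with the (x, y) table from the question.
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1, 0, 0, 1, 1])

# Polynomial kernel with the hyperparameters stated in the question.
clf = SVC(kernel="poly", degree=3, C=1, gamma=0.1).fit(X, y)

# Indices (0-based) of the training points chosen as support vectors;
# any x value whose index is absent from this list is not a support vector.
print("support vector indices:", clf.support_)
```

Comparing `clf.support_` against the candidate points in the options identifies which one is not a support vector.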

5. Which of the following is/are true about the Perceptron classifier?

  • It can learn an OR function
  • It can learn an AND function
  • The obtained separating hyperplane depends on the order in which the points are presented in the training process.
  • For a linearly separable problem, there exists some initialization of the weights which might lead to non-convergent cases.
Answer :- a, b, c
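The first three options can be verified directly: AND and OR are linearly separable Boolean functions, so the perceptron convergence theorem guarantees the classic update rule learns them, and the final weights depend on the order in which points are presented. A minimal sketch (the training loop below is a standard textbook perceptron, not code from the course):

```python
import numpy as np

def train_perceptron(X, y, epochs=20):
    """Classic perceptron rule: w += (y - prediction) * x on each mistake."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):              # presentation order affects the final w
            pred = 1 if w @ xi > 0 else 0
            w += (yi - pred) * xi
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
results = {}
for name, targets in [("OR", np.array([0, 1, 1, 1])),
                      ("AND", np.array([0, 0, 0, 1]))]:
    w = train_perceptron(X, targets)
    Xb = np.hstack([X, np.ones((4, 1))])
    results[name] = (Xb @ w > 0).astype(int)
    print(name, results[name])
```

Both functions are learned with zero training error; XOR, by contrast, is not linearly separable and would never converge under this rule.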

8. Suppose you have trained an SVM that is not performing well, and so you construct additional features from the existing features for the model. Which of the following statements may be true?

  • We are lowering the bias.
  • We are lowering the variance.
  • We are increasing the bias.
  • We are increasing the variance.
Answer :- a, d