Introduction to Machine Learning Week 4 NPTEL Assignment Answers 2025

Need help with this week’s assignment? Get detailed and trusted solutions for Introduction to Machine Learning Week 4 NPTEL Assignment Answers. Our expert-curated answers help you solve your assignments faster while deepening your conceptual clarity.

✅ Subject: Introduction to Machine Learning (nptel ml Answers)
📅 Week: 4
🎯 Session: NPTEL 2025 July-October
🔗 Course Link: Click Here
🔍 Reliability: Verified and expert-reviewed answers
📌 Trusted By: 5000+ Students

For complete and in-depth solutions to all weekly assignments, check out 👉 NPTEL Introduction to Machine Learning Week 4 Assignment Answers

🚀 Stay ahead in your NPTEL journey with fresh, updated solutions every week!

NPTEL Introduction to Machine Learning Week 4 Assignment Answers 2025

1.

  • The gradient of the hyperplane
  • The signed distance to the hyperplane
  • The normal vector to the hyperplane
  • The misclassification error
Answer :

2. Why do we normalize by ∥β∥ (the magnitude of the weight vector) in the SVM objective function?

  • To ensure the margin is independent of the scale of β
  • To minimize the computational complexity of the algorithm
  • To prevent overfitting
  • To ensure the bias term is always positive
Answer :

3. Which of the following is NOT one of the KKT conditions for optimization problems with inequality constraints?

Answer :

4. Consider the 1-dimensional dataset:

Answer :

5. Consider a polynomial kernel of degree d operating on p-dimensional input vectors. What is the dimension of the feature space induced by this kernel?

  • p×d
  • (p+1)×d
  • C(p+d, d), i.e. the binomial coefficient "p+d choose d"
  • p^d
Answer :
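A quick sanity check of the feature-space dimension: sklearn's PolynomialFeatures expands p input features into all monomials up to degree d, producing C(p+d, d) output features (including the constant term), which matches the binomial-coefficient option. The values of p and d below are arbitrary examples, not taken from the question.

```python
from math import comb

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

p, d = 3, 2  # example input dimension and polynomial degree
X = np.zeros((1, p))

# Number of monomials of degree <= d in p variables, including the bias term
n_features = PolynomialFeatures(degree=d).fit_transform(X).shape[1]
print(n_features, comb(p + d, d))  # both are 10 for p=3, d=2
```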

6. State True or False: For any given linearly separable dataset and any initialization, both SVM and Perceptron will converge to the same solution.

  • True
  • False
Answer :

7. Train a linear Perceptron classifier on the modified iris dataset. We recommend using sklearn. Use only the first two features for your model and report the best classification accuracy for the L1 and L2 penalty terms.

  • 0.91, 0.64
  • 0.88, 0.71
  • 0.71, 0.65
  • 0.78, 0.64
Answer :
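As a minimal sketch of the workflow: the course's "modified iris dataset" is not reproduced here, so this example substitutes sklearn's built-in iris data with an arbitrary train/test split, and the resulting accuracies will differ from the option values above.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # first two features only, as the question specifies
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit one Perceptron per penalty term and record test accuracy
acc = {}
for penalty in ("l1", "l2"):
    clf = Perceptron(penalty=penalty, alpha=1e-4, random_state=0)
    clf.fit(X_tr, y_tr)
    acc[penalty] = clf.score(X_te, y_te)
print(acc)
```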

8. Train an SVM classifier on the modified iris dataset. We recommend using sklearn. Use only the first three features. We encourage you to explore the impact of varying the model's hyperparameters; in particular, try different kernels and their associated settings. As part of the assignment, train models with the following hyperparameters: RBF kernel, gamma = 0.5, one-vs-rest classifier, no feature normalization. Try C = 0.01, 1, 10. For this set of hyperparameters, report the best classification accuracy.

  • 0.98
  • 0.88
  • 0.99
  • 0.92
Answer :
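A hedged sketch of the sweep over C: again, the modified dataset is not public, so sklearn's built-in iris data and an arbitrary split stand in, and the best accuracy printed will not necessarily match the options above.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X = X[:, :3]  # first three features, per the question; no normalization
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# RBF kernel, gamma=0.5, one-vs-rest, sweeping over the listed C values
best = 0.0
for C in (0.01, 1, 10):
    clf = SVC(kernel="rbf", gamma=0.5, C=C, decision_function_shape="ovr")
    clf.fit(X_tr, y_tr)
    best = max(best, clf.score(X_te, y_te))
print(best)
```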

NPTEL Introduction to Machine Learning Week 4 Assignment Answers 2024

1. For a two-class classification problem, we use an SVM classifier and obtain the following separating hyperplane. We have marked 4 instances of the training data. Identify the point which will have the most impact on the shape of the boundary on its removal.

  • 1
  • 2
  • 3
  • 4
Answer :- a

2. Consider a soft-margin SVM with a linear kernel and no slack variables, trained on n points. The number of support vectors returned is k. By adding one extra point to the dataset and retraining the classifier, what is the maximum possible number of support vectors that can be returned (for a suitable choice of the tuning parameter C)?

  • k
  • n
  • n + 1
  • k + 1
Answer :- c

3. Consider the data set given below.

Claim: The PLA (Perceptron Learning Algorithm) can learn a classifier that achieves zero misclassification error on the training data. This claim is:

  • True
  • False
  • Depends on the initial weights
  • True, only if we normalize the feature vectors before applying PLA.
Answer :- b

4. Consider the following dataset:

(Note: x is the feature and y is the output)
Which of these is not a support vector when using a Support Vector Classifier with a polynomial kernel with degree = 3, C = 1, and gamma = 0.1?
(We recommend using sklearn to solve this question.)

  • 3
  • 1
  • 9
  • 10
Answer :- a
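The question's dataset is not reproduced above, so the following is only a sketch of how to read off support vectors with sklearn, using hypothetical 1-D toy data and illustrative labels in place of the real table.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical stand-in for the question's 1-D dataset; x values 1..10
# with illustrative labels (not the actual data from the assignment).
X = np.arange(1, 11, dtype=float).reshape(-1, 1)
y = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0, 0])

# Polynomial kernel with the hyperparameters the question specifies
clf = SVC(kernel="poly", degree=3, C=1, gamma=0.1)
clf.fit(X, y)
print(clf.support_ + 1)  # 1-based positions of the support vectors
```

Any x value whose index does not appear in `clf.support_` is not a support vector.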

5. Which of the following is/are true about the Perceptron classifier?

  • It can learn an OR function
  • It can learn an AND function
  • The obtained separating hyperplane depends on the order in which the points are presented in the training process.
  • For a linearly separable problem, there exists some initialization of the weights which might lead to non-convergent cases.
Answer :- a, b, c

8. Suppose you have trained an SVM which is not performing well, and hence you have constructed more features from existing features for the model. Which of the following statements may be true?

  • We are lowering the bias.
  • We are lowering the variance.
  • We are increasing the bias.
  • We are increasing the variance.
Answer :- a, d