Looking for accurate and well-explained NPTEL assignment solutions for Natural Language Processing – Week 2? You’re in the right place! Below you’ll find 100% reliable answers with explanations to boost your understanding and help you perform confidently in your coursework.
NPTEL Natural Language Processing Week 2 Assignment Answers 2025
1. According to Zipf’s law, which statement(s) is/are correct?
(i) A small number of words occur with high frequency.
(ii) A large number of words occur with low frequency.
Options:
a. Both (i) and (ii) are correct
b. Only (ii) is correct
c. Only (i) is correct
d. Neither (i) nor (ii) is correct
Answer: ✅ a
Explanation:
Zipf’s Law states that a word’s frequency is roughly inversely proportional to its frequency rank: a few words occur very often, while the vast majority occur rarely. Hence, both (i) and (ii) are correct.
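You can see this skew even in a tiny snippet of text (real corpora show it far more clearly); the snippet below is made up purely for illustration:

```python
from collections import Counter

# Zipf's law in miniature: a handful of words dominate the counts while
# most words occur only once. (Large corpora show the skew much more clearly.)
text = ("the cat sat on the mat and the dog sat near the cat "
        "while the bird flew over the quiet garden")
counts = Counter(text.split())
for rank, (word, freq) in enumerate(counts.most_common(5), start=1):
    print(rank, word, freq)  # "the" tops the list; most words appear once
```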
2. What is the total count of unique bi-grams in the given corpus?
Corpus:
- today is Sneha’s birthday
- she likes ice cream
- she is also fond of cream cake
- we will celebrate her birthday with ice cream cake
Options:
a. 24
b. 28
c. 27
d. 23
Answer: ✅ a
Explanation:
Padding each sentence with &lt;s&gt; and &lt;/s&gt; boundary markers gives 28 bi-grams in total. Four of them are duplicates (&lt;s&gt; she, ice cream, cream cake, and cake &lt;/s&gt; each occur twice), leaving 24 unique bi-grams.
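The count can be verified with a short script. Note that reaching 24 requires padding each sentence with &lt;s&gt;/&lt;/s&gt; boundary markers — an assumption about how the assignment tokenizes sentences, but the only reading that matches the stated answer:

```python
# Count the bi-grams in the Q2 corpus, padding each sentence with
# <s>/</s> boundary markers (needed to reach the stated answer of 24).
corpus = [
    "today is Sneha's birthday",
    "she likes ice cream",
    "she is also fond of cream cake",
    "we will celebrate her birthday with ice cream cake",
]

total = 0
unique = set()
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for pair in zip(tokens, tokens[1:]):
        total += 1
        unique.add(pair)

print(total)        # 28 bi-grams in total
print(len(unique))  # 24 unique bi-grams
```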
3. A 3-gram model is a ___________ order Markov Model.
Options:
a. Two
b. Five
c. Four
d. Three
Answer: ✅ a
Explanation:
A 3-gram (trigram) model conditions each word on the previous two words, so it is a second-order Markov model.
4. Which of these is/are valid Markov assumptions?
Options:
a. Probability of a word depends only on the current word.
b. Probability depends only on the previous word.
c. Probability depends only on the next word.
d. Probability depends only on the current and previous word.
Answer: ✅ a, c, d
Explanation:
A Markov assumption restricts a word’s probability to depend only on a small, fixed window of context rather than the entire history, which keeps the model tractable.
5. For the word ‘mash’, which strings have Levenshtein distance 1?
Options:
a. smash, mas, lash, mushy, hash
b. bash, stash, lush, flash, dash
c. smash, mas, lash, mush, ash
d. None of the above
Answer: ✅ c
Explanation:
Levenshtein distance 1 means exactly one edit operation (insertion, deletion, or substitution). Every string in (c) is one edit away from ‘mash’: smash (insert s), mas (delete h), lash (substitute m→l), mush (substitute a→u), ash (delete m).
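This is easy to confirm with the standard dynamic-programming edit-distance routine; the sketch below is a generic implementation, not code from the assignment:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic DP edit distance with insert/delete/substitute, each cost 1."""
    prev = list(range(len(b) + 1))  # distances from "" to prefixes of b
    for i, ca in enumerate(a, start=1):
        curr = [i]  # distance from a[:i] to ""
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                  # delete ca
                            curr[j - 1] + 1,              # insert cb
                            prev[j - 1] + (ca != cb)))    # substitute
        prev = curr
    return prev[-1]

for word in ["smash", "mas", "lash", "mush", "ash"]:
    print(word, levenshtein("mash", word))  # each is distance 1
```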
6.
Answer: ✅ d
7.
Answer: ✅ a
8. Calculate P(they play in a big garden) using a bi-gram model.
Options:
a. 1/8
b. 1/12
c. 1/24
d. None of the above
Answer: ✅ b
Explanation:
By the chain rule with the bi-gram (first-order Markov) assumption, the sentence probability is the product of the individual bi-gram probabilities, each estimated from corpus counts; this product works out to 1/12.
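Since this post does not reproduce the corpus used for Questions 6–8, here is a sketch of how a bi-gram chain-rule probability is computed on a made-up mini-corpus (the counts and the resulting value are illustrative only):

```python
from collections import Counter

# Chain-rule bi-gram estimate: P(sentence) = product of P(w_i | w_{i-1}).
# The training corpus below is hypothetical, for illustration only.
corpus = [
    "<s> they play in a big garden </s>",
    "<s> they play in a small garden </s>",
    "<s> children play in a big park </s>",
]
bigram_counts = Counter()
unigram_counts = Counter()
for line in corpus:
    tokens = line.split()
    unigram_counts.update(tokens)
    bigram_counts.update(zip(tokens, tokens[1:]))

def p_bigram(prev: str, word: str) -> float:
    """Maximum-likelihood estimate of P(word | prev)."""
    return bigram_counts[(prev, word)] / unigram_counts[prev]

sentence = "<s> they play in a big garden </s>".split()
p = 1.0
for prev, word in zip(sentence, sentence[1:]):
    p *= p_bigram(prev, word)
print(p)  # 2/9 on this toy corpus
```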
9. Calculate perplexity of: &lt;s&gt; they play in a big garden &lt;/s&gt;
Options:
a. 2.289
b. 1.426
c. 1.574
d. 2.178
Answer: ✅ b
Explanation:
Perplexity is the inverse probability of the test sentence, normalized by its length: PP(W) = P(w₁ … w_N)^(−1/N). With P = 1/12 from Question 8 and N = 7 tokens, PP = 12^(1/7) ≈ 1.426.
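The arithmetic can be checked directly, assuming (as in Question 8) P = 1/12 and N = 7 tokens including the closing sentence marker:

```python
# Perplexity = inverse sentence probability, normalized by length:
#   PP(W) = P(w1 ... wN) ** (-1 / N)
# Assuming P = 1/12 from Question 8 and N = 7 tokens
# (they, play, in, a, big, garden, </s>):
p_sentence = 1 / 12
N = 7
perplexity = p_sentence ** (-1 / N)
print(round(perplexity, 3))  # 1.426
```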
10. Using add-one smoothing, calculate P(they play in a beautiful garden).
Options:
a. 4.472 × 10⁻⁶
b. 2.236 × 10⁻⁶
c. 3.135 × 10⁻⁶
d. None of the above
Answer: ✅ b
Explanation:
Add-one (Laplace) smoothing adds 1 to every bi-gram count and the vocabulary size V to every denominator: P(wᵢ | wᵢ₋₁) = (C(wᵢ₋₁ wᵢ) + 1) / (C(wᵢ₋₁) + V), so unseen bi-grams get small non-zero probabilities. Multiplying the smoothed bi-gram probabilities gives ≈ 2.236 × 10⁻⁶.
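The smoothing formula itself can be sketched as follows; the counts in the example call are made up for illustration, not taken from the assignment’s corpus:

```python
# Add-one (Laplace) smoothed bi-gram probability:
#   P(w_i | w_{i-1}) = (C(w_{i-1} w_i) + 1) / (C(w_{i-1}) + V)
def laplace_bigram(c_bigram: int, c_prev: int, vocab_size: int) -> float:
    return (c_bigram + 1) / (c_prev + vocab_size)

# Illustrative call with hypothetical counts: an unseen bi-gram
# (C(bigram) = 0) after a word seen 3 times, with vocabulary size V = 12.
print(laplace_bigram(0, 3, 12))  # 1/15, small but non-zero
```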
✅ Conclusion:
These well-explained answers for NLP Week 2 will help clarify your concepts around bi-grams, Markov models, Levenshtein distance, smoothing techniques, and perplexity.
👉 For all weeks and subjects, visit the Answer GPT homepage for expert-verified NPTEL answers and support.