NPTEL Natural Language Processing Week 4 Assignment Answers 2024
1. Baum-Welch algorithm is an example of
Options:
A) Forward-backward algorithm
B) Special case of the Expectation-maximisation algorithm
C) Both A and B
D) None
Answer :- C ✅
Explanation:
- Baum-Welch is a specific implementation of the Expectation-Maximization (EM) algorithm for Hidden Markov Models (HMMs).
- It uses the Forward-Backward algorithm in the expectation step.
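The relationship can be sketched in code: a minimal forward-backward pass for a toy 2-state HMM (all parameter values below are made up for illustration), producing the posterior state probabilities that Baum-Welch uses as its E-step.

```python
import numpy as np

# Hypothetical 2-state HMM; parameters are illustrative only.
pi = np.array([0.6, 0.4])                 # initial state distribution
A  = np.array([[0.8, 0.2],                # transition probabilities
               [0.3, 0.7]])
B  = np.array([[0.9, 0.1],                # emission probabilities
               [0.2, 0.8]])
obs = [0, 0, 1]                           # observation sequence

T, K = len(obs), len(pi)

# Forward pass: alpha[t, k] = P(o_1..o_t, state_t = k)
alpha = np.zeros((T, K))
alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

# Backward pass: beta[t, k] = P(o_{t+1}..o_T | state_t = k)
beta = np.zeros((T, K))
beta[-1] = 1.0
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

# E-step quantity: posterior state probabilities gamma[t, k]
gamma = alpha * beta
gamma /= gamma.sum(axis=1, keepdims=True)
print(gamma)  # each row sums to 1; these expected counts drive the M-step
```

In a full Baum-Welch run, the M-step would re-estimate `pi`, `A`, and `B` from these posteriors and the loop would repeat until the likelihood converges.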
2.
Answer :- A ✅
3. In question 2, the expected number of consecutive days of sunny weather is:
Options:
A) 2
B) 3
C) 4
D) 5
Answer :- D ✅
Explanation:
- Expected number of consecutive sunny days = 1 / (1 − P(S→S))
- If the transition probability from Sunny → Sunny is 0.8, then
  1 / (1 − 0.8) = 5
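The reasoning above can be checked numerically: a run of sunny days ends the first time the chain leaves Sunny, so the run length is geometric with E[length] = 1 / (1 − P(S→S)).

```python
p = 0.8                         # P(Sunny -> Sunny), from the question
expected = 1 / (1 - p)          # closed form: 5.0

# Cross-check against the series  E = sum_{k>=1} k * p^(k-1) * (1 - p)
series = sum(k * p**(k - 1) * (1 - p) for k in range(1, 5000))
print(expected, series)
```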
4. Entropy calculation with known word probabilities (log base 10):
Vocabulary size = 1200,
200 stop words with p = 0.001 each
Options:
A) 2.079
B) 4.5084
C) 2.984
D) 3.0775
Answer :- D ✅
Explanation:
- Entropy: H = −∑ p(x) log₁₀ p(x)
- The 200 stop words contribute 200 × (−0.001 × log₁₀ 0.001) = 0.6.
- The remaining probability mass, 1 − 200 × 0.001 = 0.8, is spread uniformly over the other 1000 words (p = 0.0008 each), which maximises entropy under the constraint: 1000 × (−0.0008 × log₁₀ 0.0008) ≈ 2.4775.
- Total entropy ≈ 0.6 + 2.4775 = 3.0775
- (For comparison, a fully uniform distribution over 1200 words would give log₁₀ 1200 ≈ 3.079.)
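The arithmetic can be verified directly: fix the 200 stop-word probabilities and spread the remaining mass uniformly over the other 1000 words.

```python
import math

n_stop, p_stop = 200, 0.001
n_rest = 1200 - n_stop
p_rest = (1 - n_stop * p_stop) / n_rest   # 0.8 / 1000 = 0.0008

# H = -sum p(x) log10 p(x), split into the two groups of words
H = (n_stop * -p_stop * math.log10(p_stop)
     + n_rest * -p_rest * math.log10(p_rest))
print(round(H, 4))  # 3.0775
```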
5. How many possible hidden state sequences for the sentence “Sachin Tendulkar is a great player”?
Tags given per word:
- Sachin: 4 options
- Tendulkar: 4
- is: 1
- a: 1
- great: 1
- player: 3
Options:
A) 4 × 3 × 3
B) 4³ × 3
C) 2⁴ × 2³ × 2³
D) 3 × 4²
Answer :- D ✅
Explanation:
- Total combinations = 4 × 4 × 1 × 1 × 1 × 3 = 48 = 3 × 4²
- Option D matches this count.
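Since the tag choices are independent per word, the count is just the product of the per-word option counts:

```python
from math import prod

# Candidate tag counts per word, as given in the question.
tag_counts = {"Sachin": 4, "Tendulkar": 4, "is": 1,
              "a": 1, "great": 1, "player": 3}
total = prod(tag_counts.values())
print(total)  # 48, i.e. 3 * 4**2
```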
6. What are the space and time complexity of the Viterbi algorithm (K = states, N = time steps)?
Options:
A) KN, K²N
B) K^N, KN
C) K^N, K·N
D) KN, KN
Answer :- A ✅
Explanation:
- Space complexity: O(K × N) for storing the trellis and backpointers
- Time complexity: O(K² × N), since each of the N time steps evaluates all K × K state transitions
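These complexities are visible in a minimal Viterbi decoder (log-space, with made-up HMM parameters): the trellis and backpointer tables each take O(K × N) space, and the nested loops do O(K² × N) work.

```python
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """Return the most likely state sequence for `obs`."""
    K, N = log_A.shape[0], len(obs)
    delta = np.full((N, K), -np.inf)      # best log-prob ending in state k: O(K*N)
    back = np.zeros((N, K), dtype=int)    # backpointers: O(K*N)
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, N):                 # N time steps ...
        for k in range(K):                # ... times K current states ...
            scores = delta[t - 1] + log_A[:, k]   # ... times K previous states
            back[t, k] = np.argmax(scores)
            delta[t, k] = scores[back[t, k]] + log_B[k, obs[t]]
    # Trace the best path backwards through the pointers
    path = [int(np.argmax(delta[-1]))]
    for t in range(N - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Illustrative 2-state parameters (not from the question).
pi = np.log([0.6, 0.4])
A = np.log([[0.8, 0.2], [0.3, 0.7]])
B = np.log([[0.9, 0.1], [0.2, 0.8]])
print(viterbi(pi, A, B, [0, 0, 1]))  # -> [0, 0, 1]
```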