Thursday, December 17, 2020

Natural Language Processing (NLP) Multiple Choice Questions with answers 18

Top 3 MCQ on NLP, NLP quiz questions with answers, NLP MCQ questions, Solved questions in natural language processing, NLP practitioner exam questions, Un-smoothed MLE, TFIDF


Multiple Choice Questions in NLP - SET 18

 

1. Which of the following smoothing techniques is most complex?

a) Add-1 smoothing

b) Add-k smoothing

c) Witten-Bell smoothing

d) Good-Turing smoothing

Answer: (d) Good-Turing smoothing

Good-Turing smoothing – The basic idea is to use the total frequency of events that occur only once to estimate how much probability mass to shift to unseen events. In other words, the count of things seen exactly once is used to help estimate the count of things never seen.

Witten-Bell smoothing - The probability of seeing a zero-frequency N-gram can be modeled by the probability of seeing an N-gram for the first time.
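To make the Good-Turing idea concrete, here is a minimal Python sketch of just the "reserve mass for unseen events" step (the helper name good_turing_unseen_mass and the toy corpus are invented for illustration; a full Good-Turing estimator also re-adjusts the counts of seen events):

from collections import Counter

def good_turing_unseen_mass(tokens):
    """Estimate the probability mass reserved for unseen events:
    P(unseen) ~ N1 / N, where N1 is the number of types seen exactly once
    and N is the total number of observed tokens (simplified sketch)."""
    counts = Counter(tokens)
    n1 = sum(1 for c in counts.values() if c == 1)  # types seen exactly once
    n_total = sum(counts.values())                  # total tokens observed
    return n1 / n_total

# Example: "mango" and "kiwi" each appear once, so N1 = 2 and N = 6
corpus = ["apple", "apple", "banana", "banana", "mango", "kiwi"]
print(good_turing_unseen_mass(corpus))  # 2/6 ≈ 0.333 reserved for unseen events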

 

2. Which of the following smoothing techniques assigns too much probability to unseen events?

a) Add-1 smoothing

b) Add-k smoothing

c) Witten-Bell smoothing

d) Good-Turing

Answer: (a) Add-1 smoothing

Add-1 smoothing assumes every (seen or unseen) event occurred once more than it did in the training data. Add-1 moves too much probability mass from seen to unseen events.

In effect, add-one smoothing assumes that novel events are far more likely than they really are, at the expense of the words we have actually seen.
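The following small Python sketch (the word counts, the vocabulary size, and the add_one_prob helper are made up for illustration, and a unigram model is assumed) shows how add-1 smoothing with a large vocabulary shifts almost all of the probability mass to unseen words:

from collections import Counter

def add_one_prob(word, counts, vocab_size):
    """Add-1 (Laplace) estimate: every word type's count is incremented by 1,
    so the denominator grows by the vocabulary size V."""
    total = sum(counts.values())
    return (counts[word] + 1) / (total + vocab_size)

# Tiny training corpus of 10 tokens, but a vocabulary of 10,000 word types
counts = Counter({"the": 5, "cat": 3, "sat": 2})
V = 10_000

print(add_one_prob("the", counts, V))     # drops from 0.5 (MLE) to ~0.0006
print(add_one_prob("unseen", counts, V))  # each unseen word gets ~0.0001
# The unseen words together receive (V - 3) / (10 + V) ≈ 0.9987 of the mass,
# which is the "too much probability to unseen events" problem.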

 

3. In the add-k smoothing method, what happens to the perplexity when a small value of k is used?

a) High perplexity

b) Zero perplexity

c) Low perplexity

d) Perplexity is not affected

Answer: (a) High perplexity

In add-k smoothing, when k is small, unseen words receive very small probabilities, which leads to high perplexity on test data containing unseen words.

Perplexity - The perplexity of a language model on a test set is the inverse probability of the test set, normalized by the number of words: PP(W) = P(w1 w2 ... wN)^(-1/N). It is used for evaluating language models.
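As a rough illustration (the training counts, vocabulary size, and test words below are invented, and a unigram model is assumed for simplicity), this Python sketch computes add-k probabilities and the resulting perplexity for a few values of k; as k shrinks, the unseen test word's probability collapses and the perplexity climbs:

import math
from collections import Counter

def add_k_prob(word, counts, vocab_size, k):
    """Add-k (unigram) estimate: add a fractional count k to every word type."""
    total = sum(counts.values())
    return (counts[word] + k) / (total + k * vocab_size)

def perplexity(test_tokens, counts, vocab_size, k):
    """Perplexity = inverse probability of the test set,
    normalized by the number of words (computed in log space)."""
    log_prob = sum(math.log(add_k_prob(w, counts, vocab_size, k))
                   for w in test_tokens)
    return math.exp(-log_prob / len(test_tokens))

# Invented training counts and a vocabulary of 100 word types
counts = Counter({"the": 5, "cat": 3, "sat": 2})
V = 100
test = ["the", "dog"]  # "dog" is unseen in training

for k in (0.1, 0.01, 0.001):
    print(k, round(perplexity(test, counts, V, k), 1))
# Approximate output: k=0.1 -> ~28, k=0.01 -> ~49, k=0.001 -> ~143,
# i.e. smaller k gives higher perplexity on test data with unseen words.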

 

*************



Top interview questions in NLP

NLP quiz questions with answers explained

Online NLP quiz with solutions

Questions and answers in natural language processing

Language model smoothing

How does the k value affect perplexity in add-k smoothing
