
Sunday, October 25, 2020

Natural Language Processing (NLP) Multiple Choice Questions with answers 16

Top 5 MCQ on NLP, NLP quiz questions with answers, NLP MCQ questions, Solved questions in natural language processing, NLP practitioner exam questions, Add-1 smoothing, MLE, inverse document frequency


Multiple Choice Questions in NLP

 

1. Assume that ‘study’, ‘computer’ and ‘abroad’ are the only informative words for classifying whether a mail is spam or not. Which of the following represents the add-one smoothed estimate of P(study|spam)? Use the following table, where 1 indicates that the word occurs in the mail and 0 that it does not:

‘study’   ‘computer’   ‘abroad’   Class
   1           0            0     Not spam
   0           1            1     Not spam
   1           0            0     Not spam
   1           1            0     Not spam
   0           0            0     Spam
   0           0            0     Spam
   0           0            1     Spam
   0           1            0     Spam
   0           0            0     Spam
   0           0            0     Spam

a) 0/6

b) 0/8

c) 1/6

d) 1/8

Click here to view answer
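
One way to check this by hand is to count, among the spam mails, how many contain ‘study’ and then apply add-one smoothing. The short Python sketch below assumes the common add-one convention for binary (present/absent) features of adding 1 to the count and 2 to the number of spam mails (one for each outcome); the variable names are only illustrative.

# Rows of the table above: (study, computer, abroad, label)
mails = [
    (1, 0, 0, "not spam"), (0, 1, 1, "not spam"),
    (1, 0, 0, "not spam"), (1, 1, 0, "not spam"),
    (0, 0, 0, "spam"), (0, 0, 0, "spam"),
    (0, 0, 1, "spam"), (0, 1, 0, "spam"),
    (0, 0, 0, "spam"), (0, 0, 0, "spam"),
]

spam = [m for m in mails if m[3] == "spam"]
study_in_spam = sum(m[0] for m in spam)        # no spam mail contains 'study'

# Add-one (Laplace) smoothing: add 1 to the count and 2 to the denominator
# (the feature has two outcomes: word present / word absent).
p_study_given_spam = (study_in_spam + 1) / (len(spam) + 2)
print(p_study_given_spam)                      # (0+1)/(6+2) = 1/8 = 0.125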


 

2. What is the probability P (‘computer in abroad’ | spam) as per the data in the table given in question 1?

a) 1/6

b) 2/6

c) 1/36

d) 1/18

Click here to view answer
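
Since only ‘study’, ‘computer’ and ‘abroad’ are informative, ‘in’ can be ignored, and under the naive Bayes independence assumption the probability factorizes into P(computer|spam) × P(abroad|spam). A sketch with unsmoothed (MLE) counts, reusing the table rows from question 1:

mails = [
    (1, 0, 0, "not spam"), (0, 1, 1, "not spam"),
    (1, 0, 0, "not spam"), (1, 1, 0, "not spam"),
    (0, 0, 0, "spam"), (0, 0, 0, "spam"),
    (0, 0, 1, "spam"), (0, 1, 0, "spam"),
    (0, 0, 0, "spam"), (0, 0, 0, "spam"),
]
spam = [m for m in mails if m[3] == "spam"]

# Unsmoothed (MLE) conditional probabilities of each informative word given spam
p_computer = sum(m[1] for m in spam) / len(spam)   # 1/6
p_abroad   = sum(m[2] for m in spam) / len(spam)   # 1/6

# Naive Bayes independence assumption: multiply the per-word probabilities
print(p_computer * p_abroad)                       # 1/36 ≈ 0.0278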


 

3. What is the unsmoothed maximum likelihood estimate of P(Spam) for the data given in question 1?

a) 1

b) 6/10

c) 4/6

d) 3/5

Click here to view answer
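
The unsmoothed maximum likelihood estimate of the prior P(Spam) is simply the fraction of mails labelled spam; a one-line check on the class column of the table above:

labels = ["not spam"] * 4 + ["spam"] * 6      # class column from question 1

# MLE prior: number of spam mails divided by total mails
print(labels.count("spam") / len(labels))     # 6/10 = 3/5 = 0.6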


 

4. Which of the following increases the weight of rarely occurring terms in the document set?

a) Term frequency

b) Word frequency

c) Inverse document frequency

d) Bi-gram frequency

Click here to view answer
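
Inverse document frequency grows as a term appears in fewer documents, which is exactly what boosts rare terms in a tf-idf weighting. A minimal sketch using the common idf(t) = log(N / df(t)) form (the toy documents below are made up for illustration):

import math

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "quantum entanglement of photons",
]

def idf(term, docs):
    # log(N / df): the fewer documents contain the term, the larger the weight
    df = sum(1 for d in docs if term in d.split())
    return math.log(len(docs) / df)

print(idf("the", docs))       # common term -> low weight,  log(3/2) ≈ 0.405
print(idf("quantum", docs))   # rare term   -> high weight, log(3/1) ≈ 1.099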


 

5. The act of converting a text document into a set of individual words is referred to as ______ .

a) Tokenization

b) Stemming

c) Lemmatization

d) All of the above

Click here to view answer
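
For contrast with stemming and lemmatization (which normalize word forms rather than split text), a minimal regex-based tokenizer might look like the sketch below; real NLP toolkits use more elaborate rules.

import re

text = "Tokenization converts a text document into individual words."

# A simple word tokenizer: sequences of letters, digits or apostrophes
tokens = re.findall(r"[A-Za-z0-9']+", text)
print(tokens)
# ['Tokenization', 'converts', 'a', 'text', 'document', 'into', 'individual', 'words']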


 

*************




Top interview questions in NLP

NLP quiz questions with answers explained

Online NLP quiz with solutions

question and answers in natural language processing

unsmoothed maximum likelihood estimation

What is inverse document frequency

how inverse document frequency helps in tf-idf calculation

Top 5 important questions with answers in natural language processing

 

 

Saturday, October 17, 2020

Explain add-1 (Laplace) smoothing with an example

Natural language processing keywords, what is add-1 smoothing, what is Laplace smoothing, explain add-1 smoothing with an example, unigram and bi-gram with add-1 laplace smoothing

Add-1 (Laplace) smoothing

We have used Maximum Likelihood Estimation (MLE) to train the parameters of an N-gram model. The problem with MLE is that it assigns zero probability to unknown (unseen) words. This happens because MLE estimates probabilities purely from a training corpus: if a word in the test set does not occur in the training set, its count is zero, which leads to a zero probability.

To eliminate these zero probabilities, we can apply smoothing. Smoothing takes some probability mass from the events seen in training and assigns it to unseen events. Add-1 smoothing (also called Laplace smoothing) is a simple smoothing technique that adds 1 to the count of every n-gram in the training set before normalizing the counts into probabilities.

Example:

Recall that the unigram and bigram probabilities of a word are calculated as follows:

P(w) = C(w) / N

P(wn | wn-1) = C(wn-1 wn) / C(wn-1)

Here, P(w) is the unigram probability, P(wn | wn-1) is the bigram probability, C(w) is the number of times w occurs in the training set, C(wn-1 wn) is the count of the bigram (wn-1 wn) in the training set, and N is the total number of word tokens in the training set.
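
As a concrete illustration, the sketch below computes these MLE estimates on a tiny made-up corpus; note how a bigram that never occurs in training (‘mat sat’) receives probability zero, which is exactly the problem smoothing addresses.

from collections import Counter

# Toy training corpus (made up for illustration)
tokens = "the cat sat on the mat".split()

unigram = Counter(tokens)                        # C(w)
bigram  = Counter(zip(tokens, tokens[1:]))       # C(wn-1 wn)
N = len(tokens)

# Unsmoothed MLE estimates
print(unigram["the"] / N)                        # P(the)     = 2/6
print(bigram[("the", "cat")] / unigram["the"])   # P(cat|the) = 1/2
print(bigram[("mat", "sat")] / unigram["mat"])   # P(sat|mat) = 0, unseen bigram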

Add-1 smoothing for unigrams

P_Laplace(w) = (C(w) + 1) / (N + |V|)

Here, N is the total number of word tokens in the training set and |V| is the size of the vocabulary, i.e., the number of unique words in the training set.

Since we add 1 to the count of every word in the numerator, we must also add the number of unique words |V| to the denominator so that the probabilities remain properly normalized (they still sum to 1).
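
A short sketch of the smoothed unigram estimate on a toy corpus; after adding |V| to the denominator the probabilities over the training vocabulary still sum to 1, and an unseen word now gets a small non-zero probability:

from collections import Counter

tokens = "the cat sat on the mat".split()    # toy training corpus
counts = Counter(tokens)
N = len(tokens)                              # total word tokens
V = len(counts)                              # vocabulary size |V|

def p_laplace(w):
    # (C(w) + 1) / (N + |V|)
    return (counts[w] + 1) / (N + V)

print(p_laplace("the"))                      # (2+1)/(6+5) = 3/11
print(p_laplace("dog"))                      # unseen word: (0+1)/(6+5) = 1/11
print(sum(p_laplace(w) for w in counts))     # sums to 1 over the training vocabulary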

Add-1 smoothing for bigrams

P_Laplace(wn | wn-1) = (C(wn-1 wn) + 1) / (C(wn-1) + |V|)
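
And the corresponding bigram version: the bigram ‘mat sat’, which had zero probability under MLE, now receives a small non-zero value (again only a sketch on the same toy corpus):

from collections import Counter

tokens = "the cat sat on the mat".split()
unigram = Counter(tokens)
bigram  = Counter(zip(tokens, tokens[1:]))
V = len(unigram)                                 # |V| = 5

def p_laplace(w_prev, w):
    # (C(wn-1 wn) + 1) / (C(wn-1) + |V|)
    return (bigram[(w_prev, w)] + 1) / (unigram[w_prev] + V)

print(p_laplace("the", "cat"))   # (1+1)/(2+5) = 2/7
print(p_laplace("mat", "sat"))   # (0+1)/(1+5) = 1/6, no longer zero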

 

*************************

 

Related articles:

 

 

Explain add-1 smoothing

What is laplace smoothing

How to apply laplace smoothing in NLP for smoothing

Unigram and bigram probability calculations with add-1 smoothing
