Advanced Database Management System - Tutorials and Notes: Solved Questions and Answers in NLP 01



Wednesday, 27 May 2020

Solved Questions and Answers in NLP 01

Natural language processing quiz questions with answers, NLP true false interview questions, NLP quiz questions for competitive exams



NLP TRUE/FALSE Quiz Questions - SET 01

1. An HMM for the POS tagging problem assumes that words are independent of each other.

(a) TRUE                                          (b) FALSE


Answer: (b) FALSE
An HMM does not assume the words are unconditionally independent; it assumes they are conditionally independent given the tag sequence: each word depends only on its own tag.
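The assumption can be made concrete with a small sketch. The probabilities below are hypothetical toy values, not trained estimates; the point is that the joint probability factors so that each word is generated from its own tag alone:

```python
# Toy HMM for POS tagging. All probabilities are hypothetical
# illustration values, not estimates from a real corpus.
emission = {("the", "DET"): 0.5, ("dog", "NOUN"): 0.4, ("barks", "VERB"): 0.3}
transition = {("<s>", "DET"): 0.6, ("DET", "NOUN"): 0.7, ("NOUN", "VERB"): 0.5}

def hmm_joint_prob(words, tags):
    """P(words, tags) = prod_i P(t_i | t_{i-1}) * P(w_i | t_i).

    Each word is generated from its own tag alone; given the tag
    sequence, the words are conditionally independent of each other.
    """
    p = 1.0
    prev = "<s>"
    for w, t in zip(words, tags):
        p *= transition[(prev, t)] * emission[(w, t)]
        prev = t
    return p

print(hmm_joint_prob(["the", "dog", "barks"], ["DET", "NOUN", "VERB"]))
```

Note that no emission term ever looks at a neighbouring word — only at its own tag.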

2. In machine translation, a parallel corpus is required to estimate the language model.
(a) TRUE                                          (b) FALSE


Answer: (b) FALSE
The parallel corpus is used to estimate the translation probabilities (the translation model). The language model is estimated from a monolingual corpus in the target language, so no parallel corpus is required for it.
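A minimal sketch of this division of labour, using hypothetical toy data (all sentence pairs are made up, and the translation estimate is a crude co-occurrence ratio rather than full IBM Model 1 EM training):

```python
from collections import Counter, defaultdict

# Hypothetical toy data: a tiny French-English parallel corpus and a
# separate monolingual English corpus (all sentences are made up).
parallel = [
    ("la maison".split(), "the house".split()),
    ("la fleur".split(), "the flower".split()),
]
mono = "the house is near the flower".split()

# The language model only needs monolingual target-language text.
lm_counts = Counter(mono)
def p_lm(w):
    return lm_counts[w] / len(mono)

# The translation model needs the parallel corpus. This is a crude
# co-occurrence estimate of t(e|f), not full IBM Model 1 training.
cooc = defaultdict(Counter)
for f_sent, e_sent in parallel:
    for f in f_sent:
        for e in e_sent:
            cooc[f][e] += 1
t = {f: {e: n / sum(es.values()) for e, n in es.items()}
     for f, es in cooc.items()}

print(t["la"]["the"], p_lm("the"))
```

Only `t` ever touches the aligned sentence pairs; `p_lm` is built from target-language text alone.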

3. Given a well-tuned unigram language model p(w|θ) trained on all the textbooks in the domain of “Natural Language Processing”, we can conclude that p(“Natural Language Processing”|θ) > p(“Language Processing Natural”|θ).
(a) TRUE                                          (b) FALSE


Answer: (b) FALSE
A unigram language model ignores word order. Both sentences contain exactly the same words, so both probabilities are the product of the same unigram probabilities and must be equal.
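This can be checked directly. A minimal sketch with a toy corpus (any text would do; the corpus here is made up):

```python
from collections import Counter
from math import prod, isclose

# Toy corpus; any text works -- the point is the word-order property.
corpus = "natural language processing is fun and natural".split()
counts = Counter(corpus)
total = sum(counts.values())

def unigram_prob(sentence):
    # P(w1..wn) = prod_i P(w_i): a bag-of-words model, order is ignored.
    return prod(counts[w] / total for w in sentence.split())

p1 = unigram_prob("natural language processing")
p2 = unigram_prob("language processing natural")
assert isclose(p1, p2)  # same multiset of words, identical probability
```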

4. Given a unigram language model and a bigram language model estimated on the same text collection without smoothing, perplexity of the unigram language model will be much larger than that of the bigram language model on this same training corpus.

(a) TRUE                                          (b) FALSE


Answer: (a) TRUE
Perplexity measures how well a model “fits” the data: it is the inverse of the probability the model assigns to the corpus, normalized by the number of words. On its own training corpus, an unsmoothed bigram model assigns each word a higher conditional probability than the unigram model does, because it exploits the preceding word as context.
The more information the n-gram gives us about the word sequence, the lower the perplexity, since perplexity is inversely related to the likelihood of the sequence according to the model.
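A small sketch makes the comparison concrete. The corpus below is a made-up toy example; both models are unsmoothed maximum-likelihood estimates, and perplexity is computed on the same text the models were trained on:

```python
from collections import Counter
from math import prod

# Made-up toy training corpus; both models are unsmoothed MLEs.
corpus = "the cat sat on the mat the cat ran".split()
N = len(corpus)

# Unigram model: P(w) = count(w) / N.
uni = Counter(corpus)

# Bigram model with a start symbol: P(w | prev) = count(prev, w) / count(prev).
tokens = ["<s>"] + corpus
bi = Counter(zip(tokens, tokens[1:]))
ctx = Counter(tokens[:-1])

# Perplexity on the training corpus itself: PP = P(corpus) ** (-1/N).
pp_uni = prod(uni[w] / N for w in corpus) ** (-1 / N)
pp_bi = prod(bi[(p, w)] / ctx[p] for p, w in zip(tokens, tokens[1:])) ** (-1 / N)

print(pp_uni, pp_bi)
assert pp_bi < pp_uni  # the bigram model fits the training data better
```

Without smoothing, every bigram seen in training has a conditional probability at least as high as its unigram counterpart here, so the bigram perplexity comes out lower.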

5. To make it computationally feasible, Naive Bayes assumes that features are independent from each other.
(a) TRUE                                          (b) FALSE


Answer: (b) FALSE
Naïve Bayes does not assume the features are unconditionally independent; it only assumes conditional independence, i.e. the features are independent of each other given the class label.
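A minimal sketch of the factorization, with hypothetical (made-up) class-conditional probabilities for a toy spam filter:

```python
from math import prod

# Hypothetical class-conditional feature probabilities and priors
# (made-up values for a toy spam filter).
p_feat = {
    "spam": {"free": 0.8, "viagra": 0.6},
    "ham":  {"free": 0.1, "viagra": 0.01},
}
prior = {"spam": 0.4, "ham": 0.6}

def nb_score(features, c):
    # Naive Bayes score: P(c) * prod_i P(x_i | c).
    # The product factorizes only because the features are assumed
    # independent *given the class c* -- not unconditionally independent.
    return prior[c] * prod(p_feat[c][f] for f in features)

scores = {c: nb_score(["free", "viagra"], c) for c in prior}
print(max(scores, key=scores.get))
```

The conditional-independence assumption is what turns an exponential joint table P(x₁,…,xₙ|c) into n small one-feature tables.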

*************************




