
## Multiple Choice Questions (MCQ) in Natural Language Processing (NLP) with answers

1. In an HMM, the possible transitions from state JJ are to states NN, VB, JJ and RB. The following state transition probabilities are known:
P(NN|JJ) = 1/4, P(VB|JJ) = 1/6, and P(JJ|JJ) = 1/3.
What is the transition probability value of P(RB|JJ)?
a) 1/4
b) 1/2
c) 1/5
d) 1/3
 Answer: (a) The outgoing transition probabilities from any state must sum to 1. Since the only remaining transition from JJ is to RB, P(RB|JJ) = 1 − (1/4 + 1/6 + 1/3) = 1 − 3/4 = 1/4.
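The calculation can be verified with a short script; using exact fractions avoids floating-point rounding:

```python
from fractions import Fraction

# Known transition probabilities out of state JJ (from the question)
known = {
    "NN": Fraction(1, 4),
    "VB": Fraction(1, 6),
    "JJ": Fraction(1, 3),
}

# Outgoing probabilities from a state must sum to 1,
# so the remaining mass belongs to the JJ -> RB transition.
p_rb = 1 - sum(known.values())
print(p_rb)  # 1/4
```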

2. Suppose we want to calculate the probability of the observation sequence {‘Dry’, ‘Rain’}. If the following are the possible hidden state sequences, then P(‘Dry’ ‘Rain’) = ---------.
{‘Low’, ‘Low’}, {‘Low’, ‘High’}, {‘High’, ‘Low’}, and {‘High’, ‘High’}

a) P(‘Dry’ ‘Rain’, ‘Low’ ‘Low’) * P(‘Dry’ ‘Rain’, ‘Low’ ‘High’) * P(‘Dry’ ‘Rain’, ‘High’ ‘Low’) * P(‘Dry’ ‘Rain’, ‘High’ ‘High’)
b) P(‘Dry’ ‘Rain’, ‘Low’ ‘Low’) + P(‘Dry’ ‘Rain’, ‘Low’ ‘High’) + P(‘Dry’ ‘Rain’, ‘High’ ‘Low’) + P(‘Dry’ ‘Rain’, ‘High’ ‘High’)
c) P(‘Dry’ ‘Rain’) * P( ‘Low’ ‘High’) * P( ‘High’ ‘Low’) * P(‘High’ ‘High’)
d) P(‘Dry’ ‘Rain’) + P( ‘Low’ ‘High’) + P( ‘High’ ‘Low’) + P(‘High’ ‘High’)
 Answer: (b) We are given the observation sequence {‘Dry’, ‘Rain’} without any particular hidden state sequence. To compute P(‘Dry’ ‘Rain’), we therefore compute the joint probability of the observations with every possible hidden state sequence, and sum these joint probabilities.
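This marginalization can be sketched by brute force. The HMM parameters below (initial, transition and emission probabilities) are not given in the question; they are assumed purely for demonstration:

```python
from itertools import product

# Illustrative HMM parameters (assumed, not from the question)
start = {"Low": 0.4, "High": 0.6}                    # initial state probabilities
trans = {"Low": {"Low": 0.3, "High": 0.7},           # P(next | current)
         "High": {"Low": 0.2, "High": 0.8}}
emit = {"Low": {"Dry": 0.4, "Rain": 0.6},            # P(observation | state)
        "High": {"Dry": 0.6, "Rain": 0.4}}

obs = ["Dry", "Rain"]

# P(obs) = sum over ALL hidden state sequences of the joint probability
total = 0.0
for s1, s2 in product(["Low", "High"], repeat=2):
    joint = start[s1] * emit[s1][obs[0]] * trans[s1][s2] * emit[s2][obs[1]]
    total += joint
print(round(total, 4))  # approximately 0.232 with these assumed numbers
```

The loop enumerates the four hidden state sequences listed in the question and sums their joint probabilities, i.e. option (b).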

3. [This question is in continuation with the previous one.] Which of the following best describes the probability of observation sequence {‘Dry’,’Rain’} given a hidden state ‘Low’ for the observation ‘Dry’?
a) P(‘Dry’ ‘Rain’, ‘Low’ ‘Low’) + P(‘Dry’ ‘Rain’, ‘Low’ ‘High’) + P(‘Dry’ ‘Rain’, ‘High’ ‘Low’) + P(‘Dry’ ‘Rain’, ‘High’ ‘High’)
b) P(‘Dry’ ‘Rain’, ‘Low’ ‘Low’) * P(‘Dry’ ‘Rain’, ‘Low’ ‘High’) * P(‘Dry’ ‘Rain’, ‘High’ ‘Low’) * P(‘Dry’ ‘Rain’, ‘High’ ‘High’)
c) P(‘Dry’ ‘Rain’, ‘Low’ ‘Low’) * P(‘Dry’ ‘Rain’, ‘Low’ ‘High’)
d) P(‘Dry’ ‘Rain’, ‘Low’ ‘Low’) + P(‘Dry’ ‘Rain’, ‘Low’ ‘High’)
 Answer: (d) If the observation sequence were given without any particular hidden state sequence, we would sum the joint probabilities over all possible hidden state sequences. Here, however, the hidden state for the observation ‘Dry’ is fixed as ‘Low’, so only the hidden state for ‘Rain’ can vary between ‘Low’ and ‘High’. Hence we sum just the two joint probabilities in (d).
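Fixing the first hidden state reduces the sum to two terms. The sketch below uses the same kind of assumed parameters as before (not given in the question), with the hidden state for ‘Dry’ pinned to ‘Low’:

```python
# Illustrative HMM parameters (assumed, not from the question)
start = {"Low": 0.4, "High": 0.6}
trans = {"Low": {"Low": 0.3, "High": 0.7},
         "High": {"Low": 0.2, "High": 0.8}}
emit = {"Low": {"Dry": 0.4, "Rain": 0.6},
        "High": {"Dry": 0.6, "Rain": 0.4}}

obs = ["Dry", "Rain"]
s1 = "Low"  # hidden state for 'Dry' is fixed by the question

# Only the hidden state for 'Rain' is summed over: two terms, as in option (d)
total = sum(start[s1] * emit[s1][obs[0]] * trans[s1][s2] * emit[s2][obs[1]]
            for s2 in ["Low", "High"])
print(round(total, 4))
```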

4. _________ is the type of morphology that changes the word category and affects the meaning.
a) Inflectional
b) Derivational
c) Cliticization
d) All of the above
 Answer: (b) Derivation creates different words from the same lemma. A word formed through derivational morphology may itself serve as the stem for another affix, and usually has a different meaning from the stem. Inflectional morphology, on the other hand, usually changes neither the POS category nor the meaning of the word.

5. ‘computer’ vs. ‘computational’ is an example of ______ morphology.
a) Inflectional
b) Derivational
c) Cliticization
d) None of the above
 Answer: (b) It is derivational morphology because the two are different words in different categories (computer – noun, computational – adjective) and with different meanings.

*************