Showing posts with label NLP Question Bank. Show all posts

Sunday, September 27, 2020

Natural Language Processing (NLP) Multiple Choice Questions with answers 15

Top 5 MCQ on NLP, NLP quiz questions with answers, NLP MCQ questions, Solved questions in natural language processing, NLP practitioner exam questions


Multiple Choice Questions in NLP

1. In a language, it is usual for a word to have more than one meaning even within its semantic class (polysemy). Which of the following tasks would help us choose the right meaning as per the context in which the word is used?

(a) Lemmatization

(b) Word Sense Disambiguation

(c) Stemming

(d) Discourse analysis

Answer: (b) Word Sense Disambiguation

Word Sense Disambiguation (WSD)

The ability to computationally determine which sense of a word is activated by its use in a particular context.

 

Polysemy

Polysemy is the capacity for a word or phrase to have multiple meanings, usually related by contiguity of meaning within a semantic field. For example, 'chicken' as a noun can be used in the following senses:

Sense 1: Chicken (noun) – meat of the chicken

Sense 2: Chicken (noun) – a bird
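The overlap idea behind Lesk-style WSD can be sketched in a few lines. This is a toy illustration, not a real system: the two senses of 'chicken' and their gloss words below are hand-written assumptions, and the sketch simply picks the sense whose gloss shares the most words with the surrounding context.

```python
# Toy Lesk-style word sense disambiguation (pure Python).
# The senses and gloss words are hypothetical, not a real dictionary.
SENSES = {
    "chicken/meat": {"meat", "food", "cooked", "eat", "dish"},
    "chicken/bird": {"bird", "farm", "feathers", "egg", "animal"},
}

def lesk_disambiguate(context_words, senses=SENSES):
    """Pick the sense whose gloss overlaps most with the context."""
    context = set(context_words)
    return max(senses, key=lambda s: len(senses[s] & context))

print(lesk_disambiguate(["we", "cooked", "chicken", "for", "the", "dish"]))
# -> chicken/meat (the context overlaps the 'meat' gloss)
```

Real WSD systems use full dictionary glosses (e.g. WordNet definitions) instead of hand-picked word sets, but the sense-selection step works the same way.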

 

2. Which of the following is an advantage of normalizing a word?

(a) It helps in reducing the randomness in the word

(b) It increases the false negatives

(c) It reduces the dimensionality of the input.

(d) All of the above

Answer: (a) and (c)

(a) It helps in reducing the randomness in the word

When we normalize text using any normalization technique, we reduce each word to its base form. A word may appear in different tenses according to the grammar; for example, 'working', 'worked', and 'works' all refer to the same root word 'work'. Converting these words to their root collapses three different forms of a word into one, which helps a lot in NLP.

 

(c) It reduces the dimensionality of the input

As mentioned above, normalization reduces the number of unique words extracted from a corpus. In this way, it helps reduce the dimensionality of the input in machine learning tasks.
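The dimensionality reduction can be seen directly by counting the vocabulary before and after normalization. This is a minimal sketch with a hand-written lemma map (real systems would use a lemmatizer or stemmer):

```python
# Sketch: mapping inflected forms to a root shrinks the vocabulary.
# The lemma map is toy, hand-written data, not a real lemmatizer.
LEMMA = {"working": "work", "worked": "work", "works": "work",
         "opened": "open", "opening": "open", "opens": "open"}

tokens = ["working", "worked", "works", "opened", "opening", "opens"]
raw_vocab = set(tokens)                          # 6 distinct surface forms
norm_vocab = {LEMMA.get(t, t) for t in tokens}   # collapses to 2 roots

print(len(raw_vocab), len(norm_vocab))  # 6 2
```

Six surface forms collapse to two roots, so a bag-of-words vector over this corpus shrinks from six dimensions to two.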

 

3. Which of the following techniques is most appropriate to the process of word normalization?

(a) Lemmatization

(b) Stemming

(c) Stop word removal

(d) Rooting

Answer: (a) Lemmatization

Lemmatization

A word in a language can be inflected into different words to express a grammatical function or attribute. For example,

‘glasses’ is the inflected form of the word ‘glass’, denoting the plural noun.

‘opened’, ‘opening’, and ‘opens’ are inflected forms of the word ‘open’, denoting grammatical variations.

Lemmatization is the process of removing the inflections of a word in order to map it to its root form. Example: ‘opened’ to ‘open’.

 

Stemming does the same work, but it uses heuristics to cut the inflections off a word, which may not always produce a valid root word. For example, stemming reduces the word ‘dogs’ to ‘dog’ but reduces the word ‘tried’ to ‘tri’. Hence, it is not the most appropriate technique.
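The heuristic nature of stemming can be sketched with a deliberately naive suffix-stripping rule. This is not a real stemmer (Porter's algorithm has many more rules), but it shows why chopping suffixes can leave a non-word like 'tri':

```python
# A deliberately naive suffix-stripping stemmer, illustrating why
# heuristic stemming can produce non-words such as 'tri' for 'tried'.
def naive_stem(word):
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 1:
            return word[: -len(suffix)]
    return word

print(naive_stem("dogs"))     # dog  (a valid root)
print(naive_stem("working"))  # work (a valid root)
print(naive_stem("tried"))    # tri  (not a real word)
```

A lemmatizer, by contrast, consults a vocabulary and would map 'tried' to the valid lemma 'try'.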

 

4. Words may have multiple meanings. This leads to what type of ambiguity in NLP?

(a) Syntactic ambiguity

(b) Anaphoric ambiguity

(c) Semantic ambiguity

(d) Lexical ambiguity

Answer: (d) Lexical ambiguity

Ambiguity in NLP is the state of being ambiguous usually with more than one interpretation of a word, phrase or sentence.

Lexical ambiguity is a type of ambiguity that occurs when a sentence contains a word that has more than one meaning.

 

5. “I went to the school, and they told me to come the next day”. What type of ambiguity is present in the given sentence?

(a) Syntactic ambiguity

(b) Anaphoric ambiguity

(c) Semantic ambiguity

(d) Lexical ambiguity

Answer: (b) Anaphoric ambiguity

Anaphoric ambiguity

Anaphoric ambiguity refers to a situation where an anaphor has more than one possible referent in the same or another sentence. In the given sentence, ‘they’ refers to the school staff, which is not actually mentioned in the text. [For more information, please refer here]

 

*************




Top interview questions in NLP

NLP quiz questions with answers explained

Online NLP quiz with solutions

question and answers in natural language processing

important quiz questions in nlp for placement

Top 5 important questions with answers in natural language processing

Friday, September 25, 2020

Natural Language Processing (NLP) Multiple Choice Questions with answers



Multiple Choice Questions in NLP 

1. Zipf's law tells us:

(a) head words take a major portion of the English vocabulary;

(b) in a given corpus, if the most frequent word's frequency is 1, then the second frequent word's frequency is around 0.5;

(c) compared to tail words, removing head words helps more to reduce the storage of documents represented by a vector space model when using a dense matrix data structure;

(d) smoothing is not necessary.

Answer: (b) in a given corpus, if the most frequent word's frequency is 1, then the second frequent word's frequency is around 0.5

Zipf’s law:

Zipf's law states that given a large sample of words used, the frequency of any word is inversely proportional to its rank in the frequency table. So a word that appears at position number n has a frequency proportional to 1/n.

The law describes word frequency in natural language: the most common word occurs about twice as often as the second most frequent word, three times as often as the third most frequent, and so on down to the least frequent word.
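The rank-frequency pattern can be checked with a few lines of Python. The corpus below is synthetic and built to be exactly Zipfian (real corpora only follow the law approximately):

```python
from collections import Counter

# Toy corpus constructed so that frequency of rank r == top_freq / r.
corpus = ["the"] * 60 + ["of"] * 30 + ["and"] * 20 + ["to"] * 15
ranked = Counter(corpus).most_common()

top_freq = ranked[0][1]
for rank, (word, freq) in enumerate(ranked, start=1):
    # Zipf: frequency of the rank-r word ~ top frequency / r
    print(rank, word, freq, top_freq / rank)
```

Printed side by side, the observed frequency of each word equals the top frequency divided by its rank (60, 30, 20, 15), which is exactly the 1/n relationship the law predicts.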

 

2. Which of the following is NOT a good example of cohesive device?

(a) Discourse markers

(b) Pronouns

(c) Prepositions

(d) Demonstratives

Answer: (c) Prepositions

Cohesive devices are sometimes called linking words, linkers, connectors, discourse markers, or transitional words.

Cohesive devices are words or phrases that show the relationship between paragraphs or sections of a text or speech, such as ‘for example’, ‘in conclusion’, ‘however’ and ‘moreover’. [For more, refer here]

 

3. In the sentence, “I want a cute teddy for my birthday”, the underlined part is an example of _____.

(a) Gerund phrase

(b) Verb phrase

(c) Prepositional phrase

(d) Adverbial phrase

Answer: None of these options are correct

The underlined text (‘a cute teddy’) is a noun phrase.

A noun phrase is a word or group of words containing a noun and functioning in a sentence as subject, object, or prepositional object. It is a phrase that has a noun as its head or performs the same grammatical function as a noun.

In a simple definition, noun phrase is a group of words that function like a noun. Noun phrases are nouns with modifiers.
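A common way to spot noun phrases automatically is to match a chunk pattern such as "optional determiner, any number of adjectives, one or more nouns" over POS tags. This is a minimal sketch: the tags are supplied by hand here, whereas a real pipeline would get them from a POS tagger.

```python
import re

# Sketch: spotting a noun phrase with the classic DT? JJ* NN+ pattern.
# The POS tags below are hand-assigned toy data.
tagged = [("a", "DT"), ("cute", "JJ"), ("teddy", "NN")]
tag_string = " ".join(tag for _, tag in tagged)  # "DT JJ NN"

if re.fullmatch(r"(DT )?(JJ )*NN( NN)*", tag_string):
    print("noun phrase:", " ".join(word for word, _ in tagged))
# -> noun phrase: a cute teddy
```

The head of the matched chunk is the noun 'teddy', with 'a' and 'cute' as its modifiers, which is exactly the "noun with modifiers" definition above.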

 

4. Which of the following is an advantage of GloVe?

(a) The data can be fed into the model in an online way and needs little preprocessing, thus requires little memory.

(b) The model is trained on the co-occurrence matrix of words, which takes a lot of memory for storage.

(c) The model can quickly give different sized vectors via matrix factorization

(d) It prevents meaningless stop words like “the”, “an”.

Answer: both (c) and (d)

(c) Since the co-occurrence matrix is pre-computed, GloVe can quickly give different sized vectors via matrix factorization, for which many efficient implementations are available.

(d) It gives lower weight to highly frequent word pairs, so that meaningless stop words like “the” and “an” do not dominate the training process.
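The down-weighting in (d) comes from GloVe's co-occurrence weighting function, which grows with the count but is capped at 1 once the count reaches a threshold. A minimal sketch, using the threshold and exponent suggested in the original GloVe paper (x_max = 100, alpha = 0.75); the counts fed to it are made-up examples:

```python
# GloVe weighting function: f(x) = (x / x_max)**alpha, capped at 1.
def glove_weight(x, x_max=100, alpha=0.75):
    return (x / x_max) ** alpha if x < x_max else 1.0

print(glove_weight(1))      # rare pair: small weight (~0.03)
print(glove_weight(100))    # 1.0 -- weight is capped here
print(glove_weight(50000))  # 1.0 -- stop-word pairs cannot dominate
```

A pair co-occurring 50,000 times gets the same weight as one co-occurring 100 times, which is why extremely frequent stop-word pairs do not dominate training.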

 

5. How can WordNet be used to measure semantic relatedness between words?

(a) Measure the shortest path between two words on WordNet

(b) Count the number of shared parent nodes

(c) Measure the difference between their depths in WordNet

(d) Measure the difference between the size of child nodes they have.

Answer: (a) Measure the shortest path between two words on WordNet

WordNet is a lexical database of semantic relations between words.

Relatedness can be estimated by measuring the shortest path (i.e., the minimal number of edges) between two words in WordNet. The path-length measure, Leacock-Chodorow, and Hirst & St-Onge are similarity measures based on the shortest-path concept. [A possible reference]
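The shortest-path idea can be sketched with breadth-first search over a tiny hand-built fragment of a hypernym hierarchy (the real measure would run over WordNet itself, e.g. via NLTK's WordNet interface):

```python
from collections import deque

# Toy hypernym graph; edges run both ways between a node and its parent.
GRAPH = {
    "entity": ["animal", "artifact"],
    "animal": ["entity", "dog", "cat"],
    "artifact": ["entity", "car"],
    "dog": ["animal"], "cat": ["animal"], "car": ["artifact"],
}

def path_length(a, b):
    """Minimal number of edges between two nodes (BFS)."""
    queue, seen = deque([(a, 0)]), {a}
    while queue:
        node, dist = queue.popleft()
        if node == b:
            return dist
        for nxt in GRAPH[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # no path

print(path_length("dog", "cat"))  # 2 (dog -> animal -> cat)
print(path_length("dog", "car"))  # 4 (dog -> animal -> entity -> artifact -> car)
```

The shorter path between 'dog' and 'cat' than between 'dog' and 'car' reflects that the first pair is more semantically related, which is exactly what path-based measures exploit.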

 

*************




