🔍 Introduction to POS Tagging MCQs
Part-of-Speech (POS) tagging is a core component of Natural Language Processing (NLP) in which each word in a sentence is labeled with its grammatical category, such as noun, verb, adjective, or adverb. It plays a vital role in applications like parsing, named entity recognition (NER), sentiment analysis, text-to-speech, and machine translation.
The following multiple-choice questions (MCQs) will help you practice rule-based, probabilistic, and deep learning-based tagging approaches, including Brill Tagging, Hidden Markov Models, CRF models, and modern transformer-based POS tagging.
📌 Before beginning, you may also explore: What Are Morphemes in NLP?
✔ Scroll down and test yourself; each question is followed by its answer and a short explanation.
Q1. What is the primary purpose of POS tagging in NLP?
A. To remove stop words
B. To assign grammatical roles to each word
C. To translate text into another language
D. To detect sentence boundaries
✔ Answer: B

Explanation:
POS tagging assigns syntactic categories such as noun, verb, adjective, and adverb to each word so that the grammatical structure of the sentence can be understood by the NLP model.
What is POS tagging?
POS (Part‑of‑Speech) Tagging is the process of assigning each word in a sentence a grammatical category (noun, verb, adjective, etc.) based on its form and context. It’s the “glue” that turns raw text into a structured, linguistically‑annotated form that downstream NLP systems can consume.
Example: The word "book" can function as either a noun ("I read a book") or a verb ("I will book a flight"); POS tagging disambiguates these uses by analyzing the surrounding linguistic context.
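You can see this disambiguation in practice with NLTK's default tagger. A minimal sketch, assuming NLTK is installed (resource names can vary slightly across NLTK versions):

```python
import nltk

# One-time resource downloads (names may differ slightly across NLTK versions)
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

for sentence in ["I read a book", "I will book a flight"]:
    tokens = nltk.word_tokenize(sentence)
    print(nltk.pos_tag(tokens))

# Typical output (Penn Treebank tags):
# [('I', 'PRP'), ('read', 'VBP'), ('a', 'DT'), ('book', 'NN')]                  <- noun
# [('I', 'PRP'), ('will', 'MD'), ('book', 'VB'), ('a', 'DT'), ('flight', 'NN')] <- verb
```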
Why is POS tagging important?
Helps a parser decide which part of a sentence is a subject vs. object. (Example tasks: Named‑entity recognition, sentiment analysis, machine translation, speech‑to‑text, text‑to‑speech.)
Allows the system to understand word ambiguities (e.g., “record” as a noun vs. verb). (Example tasks: Coreference resolution, information extraction, question answering.)
Enables feature engineering (e.g., “the current word is a determiner”). (Example tasks: POS‑based features for many classification tasks; see the sketch after this list.)
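For the feature-engineering point, here is a minimal sketch of the kind of word-level features a classical tagger or downstream classifier might extract. The feature names are illustrative, not a fixed API:

```python
def word_features(tokens, i):
    """Illustrative feature dict for the token at position i."""
    word = tokens[i]
    return {
        "word.lower": word.lower(),
        "is_capitalized": word[0].isupper(),
        "suffix3": word[-3:],  # crude morphology cue, e.g. "ing", "ly"
        "prev_word": tokens[i - 1] if i > 0 else "<START>",
        "next_word": tokens[i + 1] if i < len(tokens) - 1 else "<END>",
    }

print(word_features("The quick fox jumped".split(), 3))
```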
Q2. POS tagging is best described as which type of NLP task?
A. Topic modeling
B. Sequence labeling
C. Clustering
D. Machine translation
✔ Answer: B

Explanation:
POS tagging assigns a label to each token in a sentence in order, making it a sequence labeling task similar to Named Entity Recognition (NER) and chunking.
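To make the "one label per token, in order" idea concrete, here is a tiny illustration (tags assigned by hand for the example):

```python
tokens = ["Time", "flies", "like", "an", "arrow"]
tags   = ["NN", "VBZ", "IN", "DT", "NN"]  # one output label per input token

# Sequence labeling preserves order and length: len(tags) == len(tokens)
for token, tag in zip(tokens, tags):
    print(f"{token}/{tag}")
```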
Q3. Which of the following was among the earliest successful models used for POS tagging?
A. Hidden Markov Model (HMM)
B. Recurrent Neural Networks
C. Support Vector Machines
D. GAN Networks
✔ Answer: A

Explanation:
Hidden Markov Models (HMMs) were among the earliest successful approaches for POS tagging because they model sequential probabilities and tag transitions effectively.
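As a concrete illustration, NLTK ships a supervised HMM trainer that learns tag-transition and word-emission probabilities from a tagged corpus. A minimal sketch (note that NLTK's plain MLE-trained HMM handles unseen words poorly, so the demo sentence uses words from the training slice):

```python
import nltk
from nltk.tag import hmm

nltk.download("treebank", quiet=True)

# Train on a slice of the Penn Treebank sample bundled with NLTK
train_sents = nltk.corpus.treebank.tagged_sents()[:3000]
tagger = hmm.HiddenMarkovModelTrainer().train_supervised(train_sents)

print(tagger.tag(["Pierre", "Vinken", "will", "join", "the", "board", "."]))
```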
Q4. In the Penn Treebank tagset, which tag denotes a singular proper noun?
A. NNS
B. VBD
C. NNP
D. PRP
✔ Answer: C

Explanation:
The tag NNP refers to a proper noun in singular form (e.g., India, Google, Sarah), whereas NNPS represents plural proper nouns.
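NLTK also bundles documentation for the Penn Treebank tagset, so you can look tags up directly (assuming the "tagsets" resource; the resource name may differ in newer NLTK releases):

```python
import nltk

nltk.download("tagsets", quiet=True)  # resource name may vary by NLTK version
nltk.help.upenn_tagset("NNP")   # noun, proper, singular
nltk.help.upenn_tagset("NNPS")  # noun, proper, plural
```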
Q5. Which of the following poses the greatest challenge for POS tagging systems?
A. Punctuation
B. Ambiguous words
C. Stopwords
D. Numbers
✔ Answer: B

Explanation:
Words like "light", "book", or "play" may function as different parts of speech depending on context, creating ambiguity for tagging systems.
Q6. What do rule-based POS taggers primarily rely on?
A. Handcrafted linguistic rules
B. Word embeddings
C. Subword tokenization
D. Training on large annotated datasets
✔ Answer: A

Explanation:
Rule-based POS taggers rely on human-defined grammar rules and lexicons, rather than training data or machine learning algorithms.
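NLTK's RegexpTagger is a simple example of this rule-based style: each pattern below is a handcrafted rule, checked in order, and any word no rule matches falls through to the catch-all default:

```python
from nltk.tag import RegexpTagger

# Handcrafted suffix rules, checked in order; the last rule is a catch-all
patterns = [
    (r".*ing$", "VBG"),       # gerunds: running, booking
    (r".*ed$", "VBD"),        # simple past: jumped, played
    (r".*ly$", "RB"),         # adverbs: quickly
    (r"^(the|a|an)$", "DT"),  # determiners
    (r".*s$", "NNS"),         # plural nouns (very rough)
    (r".*", "NN"),            # default: noun
]

tagger = RegexpTagger(patterns)
print(tagger.tag("the dogs played quickly".split()))
# [('the', 'DT'), ('dogs', 'NNS'), ('played', 'VBD'), ('quickly', 'RB')]
```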
Q7. Which metric is most commonly used to evaluate POS taggers?
A. BLEU Score
B. RMSE
C. Accuracy
D. Perplexity
✔ Answer: C

Explanation:
The performance of POS taggers is typically evaluated using accuracy, which measures the proportion of correctly predicted tags.
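Computing that accuracy is just the fraction of tokens whose predicted tag matches the gold tag. A minimal sketch against a held-out slice of the treebank (NLTK taggers also expose this directly via an accuracy()/evaluate() method, depending on version):

```python
import nltk

nltk.download("treebank", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

gold_sents = nltk.corpus.treebank.tagged_sents()[3000:3100]

correct = total = 0
for sent in gold_sents:
    words = [word for word, _ in sent]
    predicted = nltk.pos_tag(words)
    for (_, gold_tag), (_, pred_tag) in zip(sent, predicted):
        correct += int(gold_tag == pred_tag)
        total += 1

print(f"Accuracy: {correct / total:.3f}")
```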
Q8. How do transformer-based models such as BERT handle rare or previously unseen words?
A. One-hot encoding
B. Subword embeddings
C. Rule matching
D. Stopword filtering
✔ Answer: B

Explanation:
Models like BERT use subword tokenization (e.g., WordPiece), which helps correctly process previously unseen or rare words.
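With Hugging Face transformers, POS tagging is treated as a token-classification task. A minimal sketch, assuming a POS-finetuned checkpoint such as "vblagoje/bert-english-uncased-finetuned-pos" (an example model name; substitute any POS-tagging checkpoint, and note that label names follow whatever tagset that model was trained on):

```python
from transformers import pipeline

# Example checkpoint; any token-classification model finetuned for POS works
tagger = pipeline(
    "token-classification",
    model="vblagoje/bert-english-uncased-finetuned-pos",
    aggregation_strategy="simple",  # merge WordPiece subwords back into whole words
)

for item in tagger("I will book a flight to Kathmandu"):
    print(item["word"], "->", item["entity_group"])
```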
Q9. What is the POS tag of “can” in a sentence such as “It can rain tomorrow”?
A. NN
B. MD (modal verb)
C. VBD
D. IN
✔ Answer: B

Explanation:
In a sentence like “It can rain tomorrow”, “can” is a modal verb expressing possibility, not a noun meaning “container”.
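Running NLTK's tagger on such a sentence shows the modal reading (typical output; exact tags can vary by tagger version):

```python
import nltk

nltk.download("averaged_perceptron_tagger", quiet=True)
print(nltk.pos_tag("It can rain tomorrow".split()))
# [('It', 'PRP'), ('can', 'MD'), ('rain', 'VB'), ('tomorrow', 'NN')]
```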
Q10. Which dataset is most widely used for POS tagging research and model training?
A. ImageNet
B. Penn Treebank
C. CIFAR-10
D. COCO dataset
✔ Answer: B

Explanation:
The Penn Treebank contains syntactically annotated sentences widely used in POS tagging research and model training.
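A sample of the Penn Treebank ships with NLTK, so you can inspect its annotation format directly:

```python
import nltk

nltk.download("treebank", quiet=True)
print(nltk.corpus.treebank.tagged_sents()[0])
# [('Pierre', 'NNP'), ('Vinken', 'NNP'), (',', ','), ('61', 'CD'),
#  ('years', 'NNS'), ('old', 'JJ'), ...]
```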