Levels and Stages of NLP – HOT MCQs with Instant Answers
Explore how Natural Language Processing progresses through multiple linguistic levels — lexical, syntactic, semantic, discourse, and pragmatic — each contributing to understanding human language computationally.
Question 1. At which level of NLP are tokenization, stemming, and lemmatization performed?
A. Pragmatic analysis
B. Lexical / morphological analysis
C. Discourse integration
D. Semantic analysis
Answer: B. Lexical / morphological analysis
Explanation: Tokenization, stemming, and lemmatization are performed during lexical analysis to prepare words for further syntactic and semantic processing.
What is Lexical Analysis?
Lexical analysis is the foundational phase of natural language processing: it transforms a raw character stream into structured, meaningful units called tokens. It serves as the bridge between unprocessed text and higher-level syntactic processing, operating as the first stage of the pipeline. It focuses on the structure and meaning of individual words.
Example:
Tokenization: “Cats are cute.” → [‘Cats’, ‘are’, ‘cute’, ‘.’]
Normalization: “Running” → “running”; “The CAT sat on a Mat!” → “the cat sat on a mat”
Lemmatization: “Better” → “good”; “running” → “run”
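These steps map directly onto standard library calls. Below is a minimal sketch using NLTK (one reasonable choice among several; spaCy would work equally well), assuming the punkt and wordnet resources have been downloaded:

import nltk
from nltk.stem import WordNetLemmatizer

# Tokenization: split the raw character stream into word and punctuation tokens.
print(nltk.word_tokenize("Cats are cute."))  # ['Cats', 'are', 'cute', '.']

# Normalization: lowercase each token.
print([t.lower() for t in nltk.word_tokenize("The CAT sat on a Mat!")])
# ['the', 'cat', 'sat', 'on', 'a', 'mat', '!']

# Lemmatization: reduce inflected forms to dictionary forms.
lemmatizer = WordNetLemmatizer()
print(lemmatizer.lemmatize("better", pos="a"))   # 'good' (adjective)
print(lemmatizer.lemmatize("running", pos="v"))  # 'run'  (verb)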
Question 2. Which NLP stage checks grammar and builds a parse tree to represent sentence structure?
A. Lexical analysis
B. Syntactic analysis (Parsing)
C. Semantic interpretation
D. Pragmatic inference
Answer: B. Syntactic analysis (Parsing)
Explanation: Syntactic analysis verifies sentence grammar and constructs parse trees to represent word relationships and structure.
What is Syntactic Analysis?
Syntactic analysis, also known as parsing, is the stage in Natural Language Processing (NLP) where the grammatical structure of a sentence is examined. At this stage, the system checks how words combine to form phrases and sentences according to grammar rules. The output of syntactic analysis is often a parse tree or a dependency graph, which shows the relationships between words (like subject–verb–object).
Example:
Sentence: “The cat sat on the mat.”
- “The cat” → noun phrase (NP)
- “sat on the mat” → verb phrase (VP)
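This structure can be reproduced with NLTK's chart parser and a toy grammar. The grammar below is a hand-written illustration covering only this sentence, not a general grammar of English:

import nltk

# A hand-written context-free grammar, just big enough for this sentence.
grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> Det N
    VP -> V PP
    PP -> P NP
    Det -> 'the'
    N  -> 'cat' | 'mat'
    V  -> 'sat'
    P  -> 'on'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the cat sat on the mat".split()):
    print(tree)
# (S (NP (Det the) (N cat)) (VP (V sat) (PP (P on) (NP (Det the) (N mat)))))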
Question 3. Which NLP stage derives the literal meaning of words and sentences in context?
A. Lexical analysis
B. Syntactic analysis
C. Semantic analysis
D. Discourse integration
Answer: C. Semantic analysis
Explanation: Semantic analysis interprets word and sentence meanings, resolving word senses and relationships to derive literal understanding.
What is Semantic Analysis?
In linguistics, semantic analysis is the process of relating syntactic structures—from individual words and phrases to complete sentences and paragraphs—to their language-independent meanings. It extends beyond simple keyword recognition to grasp the nuances of human communication, including context, emotions, and sentiments from unstructured data.
In simpler terms, it is about identifying the relationships between words, their meanings in context, and how they combine to convey a coherent idea.
Example:
Sentence: “The cat chased the mouse.”
Semantic analysis identifies:
- Action: chased
- Agent (doer): cat
- Theme (receiver of action): mouse
So the extracted meaning: A cat performed the action of chasing a mouse.
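A rough version of this role extraction can be sketched with spaCy's dependency parser, treating the grammatical subject and direct object as stand-ins for agent and theme. This is a simplifying assumption; full semantic role labeling requires a dedicated model:

import spacy

# Assumes: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
doc = nlp("The cat chased the mouse.")

for token in doc:
    if token.dep_ == "ROOT":      # main verb ~ the action
        print("Action:", token.text)
    elif token.dep_ == "nsubj":   # grammatical subject ~ the agent
        print("Agent:", token.text)
    elif token.dep_ == "dobj":    # direct object ~ the theme
        print("Theme:", token.text)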
Question 4. Which NLP stage ensures coherence across sentences, linking them into a meaningful whole?
A. Morphological processing
B. Syntactic parsing
C. Semantic mapping
D. Discourse integration
Answer: D. Discourse integration
Explanation: Discourse analysis ensures coherence and reference resolution across sentences, linking them into a meaningful whole.
What is Discourse Analysis?
Discourse analysis is the stage in Natural Language Processing (and in linguistics) that focuses on understanding how sentences connect to form coherent text or conversation, not just individual sentences. It is the study of the relationship between language and the contexts in which language is used.
It examines context, reference, topic flow, and relationships between sentences to interpret meaning at the document or conversation level.
In simple words, it looks at how multiple sentences relate to each other.
Example:
Text: “John bought a new laptop. He loves how fast it runs.”
Discourse analysis identifies that:
- “He” refers to John (coreference resolution).
- The second sentence elaborates on the first.
- Together, they form a coherent discourse about John and his laptop.
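Production-quality coreference resolution requires a trained model, but the core idea can be sketched with a deliberately naive heuristic: link each pronoun to the most recent preceding proper noun. This toy rule handles the example above and little else:

import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed
doc = nlp("John bought a new laptop. He loves how fast it runs.")

antecedent = None
for token in doc:
    if token.pos_ == "PROPN":            # remember the latest proper noun
        antecedent = token.text
    elif token.pos_ == "PRON" and token.text.lower() == "he" and antecedent:
        print(f"'{token.text}' refers to {antecedent}")  # 'He' refers to John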
What is Discourse Integration?
Discourse integration is a critical process in Natural Language Processing that connects and links sentences or phrases together to form a coherent understanding of a passage. Unlike syntactic and semantic analysis, which focus on individual sentences and their internal meanings, discourse integration operates at a higher level, examining relationships between sentences and how they collectively contribute to overall meaning and context.
Discourse analysis identifies the relationships between sentences; discourse integration goes further and combines those meanings into a single, coherent understanding.
Question 5. Which NLP stage interprets the speaker's intended meaning beyond the literal words?
A. Lexical analysis
B. Semantic analysis
C. Discourse integration
D. Pragmatic analysis
Answer: D. Pragmatic analysis
Explanation: Pragmatic analysis interprets the intent behind language, using speaker context and social cues beyond literal meaning.
What is Pragmatic Analysis?
Pragmatic analysis interprets what a sentence really means in its context, not just what the words say. It deals with implicatures, presuppositions, speech acts, and situational context.
Pragmatic analysis focuses on the practical aspects of language use—examining not just what is said, but what is meant and how language accomplishes communicative goals. While semantic analysis examines literal or conventional meanings of words and sentences in isolation, pragmatic analysis reveals the intended, non-literal meanings that depend on context, speaker intention, and shared knowledge between communicators.
It answers: "What does the speaker really mean by saying this in this context?"
Example 1: “Can you pass the salt?”
Literal meaning (semantic): Asking about someone’s ability to pass salt.
Pragmatic meaning: A polite request, not a question about ability.
Example 2: “It’s cold in here.”
Semantic meaning: The temperature is low.
Pragmatic meaning: The speaker may be asking to close the window or turn on the heater.
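The gap between literal and intended meaning can be illustrated with a toy rule-based classifier. The patterns below are illustrative assumptions, not a real pragmatic model:

# Toy speech-act rules mapping surface form to pragmatic intent.
def pragmatic_intent(utterance: str) -> str:
    u = utterance.lower().rstrip("?!. ")
    if u.startswith(("can you", "could you", "would you")):
        return "request"  # an indirect request, not a question about ability
    if u.endswith("cold in here") or u.endswith("hot in here"):
        return "hint: adjust the temperature"  # plausible intent, context permitting
    return "literal statement or question"

print(pragmatic_intent("Can you pass the salt?"))  # request
print(pragmatic_intent("It's cold in here."))      # hint: adjust the temperature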
Question 6. In a typical NLP pipeline, which step cleans, tokenizes, and normalizes raw text before modelling?
A. Data modelling
B. Data pre-processing / cleaning
C. Model evaluation
D. Results deployment
Answer: B. Data pre-processing / cleaning
Explanation: Pre-processing (cleaning, tokenizing, normalizing) prepares text for feature extraction and model input.
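A typical pre-processing pass looks something like the following sketch in plain Python (the stopword list is abbreviated for illustration):

import re

STOPWORDS = {"the", "a", "an", "is", "are", "on"}  # abbreviated for illustration

def preprocess(text):
    text = text.lower()                    # case normalization
    text = re.sub(r"[^a-z\s]", " ", text)  # drop punctuation and digits
    tokens = text.split()                  # whitespace tokenization
    return [t for t in tokens if t not in STOPWORDS]  # stopword removal

print(preprocess("The CAT sat on a Mat!"))  # ['cat', 'sat', 'mat']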
Question 7. Which of the following is NOT a classical linguistic level of NLP?
A. Lexical / morphological
B. Feature engineering
C. Pragmatic
D. Discourse
Answer: B. Feature engineering
Explanation: Feature engineering is a machine learning step, not a linguistic level. The classical NLP levels are lexical, syntactic, semantic, discourse, and pragmatic.
Question 8. Creating word embeddings (such as Word2Vec or GloVe) belongs to which phase of an NLP pipeline?
A. Tokenization
B. Lexical analysis
C. Feature engineering / representation
D. Pragmatic interpretation
Answer: C. Feature engineering / representation
Explanation: Word embeddings convert lexical units into numerical features, bridging linguistic processing with machine learning.
This may seem to contradict the answer to Question 7. Here is the justification.
When we create word embeddings (like Word2Vec, GloVe, BERT embeddings), we’re converting text into numeric vector representations — a process used for machine learning models, not linguistic parsing. This belongs to the feature engineering or feature representation phase, where the goal is to make text understandable to algorithms, not to linguistically analyze its meaning.
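A minimal sketch with gensim's Word2Vec shows the conversion from tokens to numeric vectors (assuming gensim is installed; the two-sentence corpus is purely illustrative):

from gensim.models import Word2Vec  # assumes: pip install gensim

# A tiny illustrative corpus of pre-tokenized sentences.
corpus = [
    ["the", "cat", "chased", "the", "mouse"],
    ["the", "dog", "chased", "the", "cat"],
]

# Train toy 50-dimensional embeddings.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, epochs=50)

print(model.wv["cat"].shape)                 # (50,) -- a numeric feature vector
print(model.wv.most_similar("cat", topn=2))  # nearest neighbours in vector space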
Question 9. Resolving pronouns and maintaining context across connected sentences is the job of which process?
A. Lexical ambiguity
B. Syntactic parsing
C. Discourse integration
D. Tokenization
Answer: C. Discourse integration
Explanation: Discourse integration links sentences, resolving references and maintaining context in connected text.
Question 10. Determining the correct meaning of an ambiguous word or phrase in context happens at which level?
A. Lexical / morphological
B. Syntactic
C. Semantic
D. Tokenization
Answer: C. Semantic
Explanation: Semantic analysis determines the correct meaning of ambiguous words or phrases in context, ensuring proper interpretation.
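Word-sense disambiguation can be sketched with NLTK's implementation of the classic Lesk algorithm (assuming the punkt and wordnet resources are downloaded; Lesk is a simple baseline, so the chosen sense is not always the intuitive one):

from nltk.tokenize import word_tokenize
from nltk.wsd import lesk

# Assumes nltk.download('punkt') and nltk.download('wordnet') have been run.
sentence = "I went to the bank to deposit money."
sense = lesk(word_tokenize(sentence), "bank")
print(sense, "-", sense.definition())  # a WordNet synset for 'bank' chosen from context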
