Key Concepts Illustrated in the Figure
- Visible states (observations): the observed outputs of an HMM, such as words in a sentence. In the figure above, 'cat', 'purrs', etc. are observations.
- Hidden states: the unobserved underlying states (e.g., POS tags such as 'DT', 'N' in the figure) that generate the visible observations.
- Transition probabilities: the likelihood of moving from one hidden state to another, shown in the figure as the arrows between POS tags. Example: P(N -> V), i.e., P(V | N).
- Emission probabilities: the likelihood of a visible observation being generated by a hidden state, shown in the figure as the arrows from POS tags to words. Example: P(cat | N).
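As a sketch, both kinds of probabilities can be estimated from a tagged corpus by relative-frequency counting. The tiny corpus below is invented for illustration; it is not the data behind the figure.

```python
from collections import Counter

# Hypothetical tagged corpus: lists of (word, tag) pairs per sentence.
corpus = [
    [("the", "DT"), ("cat", "N"), ("purrs", "V")],
    [("a", "DT"), ("dog", "N"), ("barks", "V")],
]

tag_counts = Counter()
transition_counts = Counter()
emission_counts = Counter()

for sentence in corpus:
    prev_tag = "<s>"  # sentence-start marker
    for word, tag in sentence:
        tag_counts[tag] += 1
        transition_counts[(prev_tag, tag)] += 1
        emission_counts[(tag, word)] += 1
        prev_tag = tag

def transition_prob(prev_tag, tag):
    # P(tag | prev_tag) = C(prev_tag, tag) / C(prev_tag, anything)
    total = sum(c for (p, _), c in transition_counts.items() if p == prev_tag)
    return transition_counts[(prev_tag, tag)] / total if total else 0.0

def emission_prob(tag, word):
    # P(word | tag) = C(tag, word) / C(tag)
    return emission_counts[(tag, word)] / tag_counts[tag] if tag_counts[tag] else 0.0

print(transition_prob("N", "V"))  # 1.0 -- both N tokens are followed by V
print(emission_prob("N", "cat"))  # 0.5 -- "cat" is 1 of 2 N tokens
```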
- POS tagging using HMM: models tags as hidden states and words as observations, and finds the most probable tag sequence for a sentence.
- Evaluation problem: computes the probability of an observation sequence given an HMM.
- Forward algorithm: solves the evaluation problem efficiently using dynamic programming.
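A minimal sketch of the forward algorithm on a toy two-tag HMM. All probabilities below are made up for illustration; they are not taken from the figure.

```python
states = ["N", "V"]
start_p = {"N": 0.6, "V": 0.4}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit_p = {"N": {"cat": 0.5, "purrs": 0.1}, "V": {"cat": 0.1, "purrs": 0.6}}

def forward(obs):
    # alpha[t][s] = P(o_1..o_t, state_t = s): probability of seeing the
    # first t observations and ending in state s.
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for t in range(1, len(obs)):
        alpha.append({
            s: sum(alpha[t - 1][r] * trans_p[r][s] for r in states) * emit_p[s][obs[t]]
            for s in states
        })
    # Summing over final states gives P(observation sequence | HMM).
    return sum(alpha[-1][s] for s in states)

p = forward(["cat", "purrs"])
```

The dynamic program sums over all tag paths in O(T * S^2) time instead of enumerating the S^T possible paths.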
- Decoding problem: finds the most probable hidden state sequence for a given observation sequence.
In HMM-based POS tagging, tags are hidden states and words are observed symbols.
Viterbi decoding finds the most probable hidden tag sequence.
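Viterbi decoding can be sketched on the same kind of toy two-tag HMM (all numbers below are invented for illustration, not taken from the figure):

```python
states = ["N", "V"]
start_p = {"N": 0.6, "V": 0.4}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit_p = {"N": {"cat": 0.5, "purrs": 0.1}, "V": {"cat": 0.1, "purrs": 0.6}}

def viterbi(obs):
    # best[t][s] = (probability of the best path ending in state s at
    # time t, backpointer to the previous state on that path)
    best = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        best.append({
            s: max((best[t - 1][r][0] * trans_p[r][s] * emit_p[s][obs[t]], r)
                   for r in states)
            for s in states
        })
    # Backtrack from the most probable final state.
    state = max(states, key=lambda s: best[-1][s][0])
    path = [state]
    for t in range(len(obs) - 1, 0, -1):
        state = best[t][state][1]
        path.append(state)
    return list(reversed(path))

tags = viterbi(["cat", "purrs"])  # -> ["N", "V"]
```

Unlike the forward algorithm, which sums over paths, Viterbi takes the max at each step and keeps backpointers so the single best tag sequence can be recovered.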
Transition probability models tag-to-tag dependency.
Emission probability is P(word | tag).
Baum–Welch (EM) learns transition and emission probabilities without labeled data.
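One E/M re-estimation step of Baum-Welch can be sketched as follows. The two-state HMM and its starting numbers are invented, and only the transition update is shown; a full implementation would also re-estimate emissions and start probabilities and iterate to convergence.

```python
states = ["N", "V"]
start_p = {"N": 0.5, "V": 0.5}
trans_p = {"N": {"N": 0.5, "V": 0.5}, "V": {"N": 0.5, "V": 0.5}}
emit_p = {"N": {"cat": 0.6, "purrs": 0.4}, "V": {"cat": 0.3, "purrs": 0.7}}
obs = ["cat", "purrs", "cat"]  # unlabeled observations only, no tags
T = len(obs)

# E-step: forward (alpha) and backward (beta) probabilities.
alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
for t in range(1, T):
    alpha.append({s: sum(alpha[t - 1][r] * trans_p[r][s] for r in states)
                     * emit_p[s][obs[t]] for s in states})
beta = [dict() for _ in range(T)]
beta[T - 1] = {s: 1.0 for s in states}
for t in range(T - 2, -1, -1):
    beta[t] = {s: sum(trans_p[s][r] * emit_p[r][obs[t + 1]] * beta[t + 1][r]
                      for r in states) for s in states}
likelihood = sum(alpha[T - 1][s] for s in states)

# gamma[t][s] = P(state_t = s | obs); xi[t][(r, s)] = P(state_t = r, state_{t+1} = s | obs)
gamma = [{s: alpha[t][s] * beta[t][s] / likelihood for s in states}
         for t in range(T)]
xi = [{(r, s): alpha[t][r] * trans_p[r][s] * emit_p[s][obs[t + 1]]
              * beta[t + 1][s] / likelihood
       for r in states for s in states} for t in range(T - 1)]

# M-step: re-estimate transitions from expected counts.
new_trans = {r: {s: sum(xi[t][(r, s)] for t in range(T - 1)) /
                    sum(gamma[t][r] for t in range(T - 1))
                 for s in states} for r in states}
```

Because the updates use only expected counts computed from the observations, no labeled tags are needed.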
In the emission probability matrix, rows correspond to tags and columns to words.
Trigram models capture dependency on two previous tags.
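A trigram tag model conditions each tag on the two preceding tags, P(t_i | t_{i-2}, t_{i-1}). A counting sketch on invented tag sequences:

```python
from collections import Counter

# Hypothetical tag sequences (illustration only).
tag_sequences = [["DT", "N", "V"], ["DT", "N", "N", "V"]]

trigram_counts = Counter()
bigram_counts = Counter()
for tags in tag_sequences:
    padded = ["<s>", "<s>"] + tags  # two start markers for trigram context
    for i in range(2, len(padded)):
        trigram_counts[(padded[i - 2], padded[i - 1], padded[i])] += 1
        bigram_counts[(padded[i - 2], padded[i - 1])] += 1

def trigram_prob(t1, t2, t3):
    # P(t3 | t1, t2) = C(t1, t2, t3) / C(t1, t2)
    return trigram_counts[(t1, t2, t3)] / bigram_counts[(t1, t2)]

print(trigram_prob("DT", "N", "V"))  # 0.5 -- 1 of 2 (DT, N) contexts continues with V
```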
Unseen words lead to zero emission probabilities without smoothing.
Smoothing assigns non-zero probabilities to unseen events.
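Add-one (Laplace) smoothing is one simple way to do this for emission probabilities. The counts and vocabulary below are invented for illustration:

```python
from collections import Counter

# Hypothetical counts of words emitted by tag N; "dog" was never seen with N.
emission_counts = Counter({"cat": 3, "mouse": 1})
vocab = ["cat", "mouse", "dog"]

def emission_mle(word):
    # Unsmoothed MLE: an unseen word gets probability zero, which zeroes
    # out every tag path that uses it.
    return emission_counts[word] / sum(emission_counts.values())

def emission_add_one(word):
    # Add-one smoothing: pretend every vocabulary word was seen once more,
    # so every word gets at least 1 / (N + |V|).
    total = sum(emission_counts.values()) + len(vocab)
    return (emission_counts[word] + 1) / total

print(emission_mle("dog"))      # 0.0
print(emission_add_one("dog"))  # 1/7, roughly 0.143
```

Add-one is the bluntest choice; in practice variants such as add-k or backoff schemes are common, but the principle is the same.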
Each token is labeled in sequence, making POS tagging a classic sequence labeling task.