Top 10 New MCQs on SVM Concepts (2025 Edition)
1. Which of the following best describes the margin in an SVM classifier?
A. Distance between two closest support vectors
B. Distance between support vectors of opposite classes
C. Distance between decision boundary and the nearest data point of any class
D. Width of the separating hyperplane
Answer: C
Explanation: The margin is the perpendicular distance from the decision boundary to the closest data point (support vector). SVM aims to maximize this margin.
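To see this concretely, here is a minimal sketch (using scikit-learn on assumed toy data) that fits a linear SVM and checks that the distance from the boundary to the nearest point equals 1/||w||:

```python
# A minimal sketch (assumed toy data) showing that the geometric margin of a
# linear SVM equals 1 / ||w||, i.e. the distance from the decision boundary
# to the nearest data point (a support vector).
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters (illustrative data, not from the article).
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6)   # large C approximates a hard margin
clf.fit(X, y)

w = clf.coef_[0]
b = clf.intercept_[0]
margin = 1.0 / np.linalg.norm(w)    # distance from boundary to closest point

# Distance of each training point to the hyperplane w.x + b = 0
distances = np.abs(X @ w + b) / np.linalg.norm(w)
print("margin (1/||w||):", margin)
print("closest point distance:", distances.min())  # matches the margin
```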
2. In soft-margin SVM, the penalty parameter C controls what?
A. The kernel function complexity
B. The balance between margin width and classification errors
C. The learning rate during optimization
D. The dimensionality of transformed space
Answer: B
Explanation: Parameter C determines how much misclassification is tolerated. A large C penalizes violations heavily, giving fewer violations and a narrower margin; a small C tolerates more violations, giving a wider margin.
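The sketch below (assumed overlapping blobs generated with scikit-learn) illustrates this trade-off: as C grows, the margin width 2/||w|| shrinks and fewer support vectors are kept.

```python
# A minimal sketch showing how C trades margin width against violations:
# small C -> wider margin, more support vectors; large C -> narrower margin.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_blobs

# Slightly overlapping clusters so margin violations are possible (assumed data).
X, y = make_blobs(n_samples=100, centers=2, cluster_std=1.8, random_state=0)

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    width = 2.0 / np.linalg.norm(clf.coef_[0])  # full margin width
    print(f"C={C:>6}: margin width={width:.3f}, "
          f"support vectors={clf.n_support_.sum()}")
```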
3. Which of the following statements about the kernel trick in SVM is true?
A. It explicitly computes higher-dimensional feature mappings
B. It avoids computing transformations by using inner products in the feature space
C. It can only be applied to linear SVMs
D. It reduces the number of support vectors required
Answer: B
Explanation: The kernel trick enables SVMs to work in high-dimensional spaces without explicitly computing the transformed features. It uses kernel functions to calculate inner products efficiently.
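A quick way to see the trick is a degree-2 polynomial kernel. The sketch below (hypothetical 2-D inputs, names chosen for illustration) verifies that K(x, z) = (x·z)² computed directly on the original inputs equals the inner product of the explicit feature map phi(x) = (x1², √2·x1·x2, x2²), which the kernel never has to construct:

```python
# A minimal sketch verifying the kernel trick for a degree-2 polynomial kernel.
import numpy as np

def poly2_kernel(x, z):
    # Kernel value computed directly from the original 2-D inputs.
    return (x @ z) ** 2

def phi(x):
    # Explicit mapping to the 3-D feature space (assumed 2-D input).
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

print(poly2_kernel(x, z))   # 1.0 -> (1*3 + 2*(-1))^2 = 1
print(phi(x) @ phi(z))      # 1.0 -> same value via the explicit mapping
```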