Saturday, October 2, 2021

Machine Learning Multiple Choice Questions with Answers 32

Top 3 Machine Learning Quiz Questions with Answers and explanations, interview questions on machine learning, quiz questions for data scientists with answers explained, machine learning exam questions, question bank in machine learning, regularized linear regression, decision tree, hierarchical clustering

Machine Learning Quiz Questions - Set 32



 

1. Which of the following is true about a regularized linear regression model?

a) An increase in the regularization parameter (lambda) will make the model underfit the data, and the validation error will go up.

b) A decrease in the regularization parameter (lambda) will make the model overfit the data, and the training error will go up.

c) An increase in the regularization parameter (lambda) will make the model underfit the data, and the training error will go down.

d) All of the above are true

Answer: (a) An increase in the regularization parameter (lambda) will make the model underfit the data, and the validation error will go up.

The regularization (tuning) parameter λ controls the trade-off between bias and variance. As λ increases, it shrinks the coefficient values and thus reduces the variance. Up to a point, this increase in λ is beneficial, since it only reduces the variance (hence avoiding overfitting) without losing any important properties in the data. Beyond a certain value, however, the model starts losing important properties, introducing bias into the model and thus underfitting.
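This effect can be checked empirically. Below is a minimal sketch, assuming scikit-learn and a synthetic data set (both are illustrative choices, not part of the question); Ridge's alpha parameter plays the role of λ. Sweeping λ upward should eventually push both errors up as the model underfits:

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic data: 30 features, only the first one is informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = X[:, 0] * 3.0 + rng.normal(scale=0.5, size=200)

X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

for lam in [0.001, 0.1, 10, 1000, 100000]:
    model = Ridge(alpha=lam).fit(X_tr, y_tr)
    tr_err = mean_squared_error(y_tr, model.predict(X_tr))
    val_err = mean_squared_error(y_val, model.predict(X_val))
    print(f"lambda={lam:>8}: train MSE={tr_err:.3f}, val MSE={val_err:.3f}")

# For very large lambda, the coefficients are shrunk toward zero and
# both training and validation error rise: the model underfits.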

 

2. Which of the following is a characteristic of a decision tree?

a) High variance

b) High bias

c) Smoothness of prediction surfaces

d) Low variance

Answer: (a) High variance

A model has high variance if it is very sensitive to (small) changes in the training data. Decision trees are generally unstable, since a small change in the data set can result in a very different set of splits, and this instability produces high variance. It stems mainly from the hierarchical nature of decision trees: a change in the split points at the early stages affects all the subsequent splits.
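The instability is easy to demonstrate. Here is a minimal sketch, assuming scikit-learn and a synthetic data set (both illustrative assumptions): two trees are fit on almost identical data, differing only by ten dropped samples, and the fraction of points on which they disagree is reported:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=5, random_state=0)

tree_a = DecisionTreeClassifier(random_state=0).fit(X, y)
# Drop the first 10 samples -- a small perturbation of the data set.
tree_b = DecisionTreeClassifier(random_state=0).fit(X[10:], y[10:])

# A nontrivial disagreement rate shows how different the learned
# split structures are, despite near-identical training data.
disagree = np.mean(tree_a.predict(X) != tree_b.predict(X))
print(f"Fraction of points where the two trees disagree: {disagree:.2%}")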

 

3. Consider single-link and complete-link hierarchical clustering. In which of these approaches is it possible for a point to be closer to points in other clusters than to the points in its own cluster?

a) It is possible in single-link clustering

b) It is possible in complete-link clustering

c) Both in single-link and complete-link clustering

d) Neither in single-link nor in complete-link clustering

Answer: (c) Both in single-link and complete-link clustering

This is possible in both single-link and complete-link clustering. In the single-link case, an example is two parallel chains of points, where many points are closer to points in the other chain/cluster than to points in their own cluster. In the complete-link case, this is more intuitive because of the clustering criterion: the distance between two clusters is measured by the distance between their farthest points.

Both single-link (also known as single-linkage) and complete-link (also known as complete-linkage) clustering are methods of agglomerative hierarchical clustering. In single-link clustering, the similarity of two clusters is the similarity of their most similar members, whereas in complete-link clustering, the similarity of two clusters is the similarity of their most dissimilar members.
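The parallel-chains example above can be reproduced directly. Below is a minimal sketch, assuming SciPy and a hand-built data set of two parallel chains (the chain geometry is an illustrative assumption), contrasting the two linkage methods on the same points:

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Two parallel chains: points 0.5 apart along each chain, chains 1.0 apart.
chain1 = np.array([[x * 0.5, 0.0] for x in range(10)])
chain2 = np.array([[x * 0.5, 1.0] for x in range(10)])
X = np.vstack([chain1, chain2])

for method in ["single", "complete"]:
    Z = linkage(pdist(X), method=method)
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(method, labels)

# Single-link tends to recover the two chains, so points at opposite
# ends of a chain (4.5 apart) share a cluster while being farther from
# each other than from the other chain (1.0 away). Complete-link, which
# limits cluster diameter, may instead cut across both chains.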

 



 

**********************

Related links:

What is the difference between single-linkage and complete-linkage clustering?

In which of single-linkage and complete-linkage clustering can a point be closer to points in another cluster than to points in its own cluster?

Why are decision trees prone to high variance?

What is regularized linear regression model?
