
Saturday, October 2, 2021

Machine Learning Multiple Choice Questions with Answers 32

Top 3 Machine Learning Quiz Questions with Answers explanation, Interview questions on machine learning, quiz questions for data scientist answers explained, machine learning exam questions, question bank in machine learning, k-nearest neighbor, decision tree, linear regression

Machine learning Quiz Questions - Set 32



 

1. Which of the following is true about a regularized linear regression model?

a) Increasing the regularization parameter (lambda) will make the model underfit the data, and the validation error will go up.

b) Decreasing the regularization parameter (lambda) will make the model overfit the data, and the training error will go up

c) Increasing the regularization parameter (lambda) will make the model underfit the data, and the training error will go down

d) All of the above are true

Answer: (a) Increasing the regularization parameter (lambda) will make the model underfit the data, and the validation error will go up.

The regularization (tuning) parameter λ used in regularization techniques controls the trade-off between bias and variance. As the value of λ rises, it shrinks the coefficient values and thereby reduces the variance. Up to a point, this increase in λ is beneficial because it only reduces the variance (hence avoiding overfitting) without losing any important properties of the data. Beyond a certain value, however, the model starts losing important properties, which introduces bias and leads to underfitting. [Refer here for more.]
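To make the trend concrete, here is a minimal sketch (not part of the original question; it assumes scikit-learn's Ridge as a stand-in for regularized linear regression and synthetic data) that prints training and validation error as lambda grows:

```python
# A minimal sketch (assumed setup, using scikit-learn's Ridge) showing how
# increasing the regularization parameter lambda (called alpha in sklearn)
# shrinks coefficients and eventually makes the model underfit.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + 0.5 * rng.normal(size=200)  # noisy linear target

X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

for lam in [0.01, 1.0, 100.0, 10000.0]:
    model = Ridge(alpha=lam).fit(X_tr, y_tr)
    train_mse = mean_squared_error(y_tr, model.predict(X_tr))
    val_mse = mean_squared_error(y_val, model.predict(X_val))
    print(f"lambda={lam:>8}: train MSE={train_mse:.3f}, val MSE={val_mse:.3f}")
# With a very large lambda the coefficients are shrunk toward zero, and both
# training and validation error go up -- the model underfits.
```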

 

2. Which of the following is a characteristic of decision tree?

a) High variance

b) High bias

c) Smoothness of prediction surfaces

d) Low variance

Answer: (a) High variance

A model has high variance if it is very sensitive to (small) changes in the training data. Decision trees are generally unstable: a small change in the data set can result in a very different set of splits, which results in high variance. This is mainly due to the hierarchical nature of decision trees, since a change in the split points at the early stages affects all subsequent splits.
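This instability can be demonstrated with a short, illustrative sketch (assuming scikit-learn and synthetic data, not anything from the quiz itself): two trees trained on samples that differ in only a few rows can end up with different splits and disagree on many predictions.

```python
# An illustrative sketch (assumed setup with scikit-learn) of decision-tree
# instability: two trees trained on samples that differ in only a few rows
# can pick different root splits and disagree on many predictions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=5, random_state=0)

rng = np.random.default_rng(1)
idx_a = rng.choice(len(X), size=250, replace=False)
idx_b = idx_a.copy()
idx_b[:5] = rng.choice(len(X), size=5, replace=False)  # perturb only 5 rows

tree_a = DecisionTreeClassifier(random_state=0).fit(X[idx_a], y[idx_a])
tree_b = DecisionTreeClassifier(random_state=0).fit(X[idx_b], y[idx_b])

disagreement = np.mean(tree_a.predict(X) != tree_b.predict(X))
print(f"Root split feature: {tree_a.tree_.feature[0]} vs {tree_b.tree_.feature[0]}")
print(f"Fraction of points where the two trees disagree: {disagreement:.1%}")
```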

 

3. Let us consider single-link and complete-link hierarchical clustering. In which of these approaches is it possible for a point to be closer to points in other clusters than to points in its own cluster?

a) It is possible in single-link clustering

b) It is possible in complete-link clustering

c) Both in single-link and complete-link clustering

d) Neither in single-link nor in complete-link clustering

Answer: (c) Both in single-link and complete-link clustering

This is possible in both single-link and complete-link clustering. In the single-link case, an example would be two parallel chains, where many points are closer to points in the other chain/cluster than to points in their own cluster. In the complete-link case, this notion is more intuitive due to the clustering constraint (the distance between two clusters is measured by the distance between their farthest points).

Both single-link (aka single-linkage clustering) and complete-link (aka complete-linkage clustering) are methods of agglomerative hierarchical clustering. In single-link clustering, the similarity of two clusters is the similarity of their most similar members whereas in complete-linkage clustering, the similarity of two clusters is the similarity of their most dissimilar members.
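The parallel-chain example above can be reproduced with a small sketch (illustrative only; it assumes scipy's hierarchical clustering routines and synthetic data):

```python
# A small sketch (assumed setup with scipy/numpy) contrasting single-link and
# complete-link agglomerative clustering on two parallel "chains" of points.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

xs = np.arange(0, 10, 0.4)                            # points 0.4 apart along x
chain_top = np.column_stack([xs, np.ones_like(xs)])   # upper chain at y = 1
chain_bot = np.column_stack([xs, np.zeros_like(xs)])  # lower chain at y = 0
points = np.vstack([chain_top, chain_bot])

for method in ("single", "complete"):
    Z = linkage(points, method=method)                 # agglomerative merges
    labels = fcluster(Z, t=2, criterion="maxclust")    # cut into 2 clusters
    print(method, "cluster sizes:", np.bincount(labels)[1:])
# Single link keeps each chain together even though a point is only 1.0 away
# from the nearest point in the other chain but up to ~9.6 away from the far
# end of its own chain; complete link tends to cut across the chains instead,
# mixing points from both.
```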

 



 

**********************

Related links:

What is the difference between single-linkage and complete-linkage clustering?

In which of single-linkage and complete-linkage clustering can a point be closer to points in another cluster than to points in its own cluster?

Why are decision trees prone to high variance?

What is a regularized linear regression model?

Saturday, September 25, 2021

Machine Learning Multiple Choice Questions with Answers 31

Top 3 Machine Learning Quiz Questions with Answers explanation, Interview questions on machine learning, quiz questions for data scientist answers explained, machine learning exam questions, question bank in machine learning, k-nearest neighbor, decision tree, linear regression

Machine learning Quiz Questions - Set 31

 

1. Which of the following machine learning algorithms has both training and test phases?

a) k-Nearest Neighbor

b) Linear regression

c) Case-based reasoning

d) None of the above

Click here to view answer and explanation


 

2. Given a kNN classifier, which one of the following statements is true?

a) The more examples are used to classify an example, the higher the accuracy we obtain

b) The more attributes we use to describe the examples, the more difficult it is to obtain high accuracy

c) The most costly part of this method is to learn the model

d) We can use kNN for classification only

Click here to view answer and explanation


 

3. Decision trees can work with

a) Only numeric values

b) Only nominal values

c) Both numeric and nominal values

d) Neither numeric nor nominal values

Click here to view answer and explanation


 

**********************

Related links:

When is it difficult to achieve high accuracy in kNN?

Multiple choice quiz questions in machine learning

How to use categorical data in decision trees?

List the machine learning algorithms that use both training and testing phases

Sunday, August 8, 2021

Machine Learning Multiple Choice Questions with Answers 30

Top 3 Machine Learning Quiz Questions with Answers explanation, Interview questions on machine learning, quiz questions for data scientist answers explained, machine learning exam questions, question bank in machine learning, lazy learner, k-nearest neighbor, eager learner, instance-based learning

 

Machine learning Quiz Questions - Set 30

  

1. Classification is a type of

a) Reinforcement learning

b) Supervised learning

c) Unsupervised learning

d) None of the above

Click here to view answer and explanation


 

2. k-nearest neighbor (k-NN) is a type of ____ method.

a) Reinforcement learning

b) Unsupervised learning

c) Instance-based learning

d) Lazy learning

Click here to view answer and explanation


 

3. Which of the following is a disadvantage of lazy learning?

a) Difficult to maintain

b) Not fit for simultaneous application to multiple problems

c) Not suitable for complex and incomplete problem domains

d) Difficult to handle excessively noisy data

Click here to view answer and explanation


 

 

**********************

Related links:

What is eager learner?

Multiple choice quiz questions in machine learning

What is supervised learning in machine learning

What is instance-based learning?

Which of the classification algorithms is a lazy learner?

Why is classification a type of supervised learning?

What is the difference between instance-based learning and lazy learning?
