Sunday, August 9, 2020

Machine Learning TRUE or FALSE Questions with Solutions - 15

 Machine learning exam questions, ML solved quiz questions, Machine Learning TRUE or FALSE questions

Machine Learning TRUE / FALSE Questions - SET 15

 1. Selecting the decision tree split (at each node as you move down the tree) that maximizes information gain will guarantee an optimal decision tree.

(a) TRUE                                                   (b) FALSE

 

2. If we know that the conditional independence assumptions made by Naive Bayes are not true for our problem, and we have lots of training data, we might prefer Logistic Regression over Naive Bayes for a particular learning task.

(a) TRUE                                                   (b) FALSE

 

3. A project team claims that their method is good based on the low training error they report. This claim is justified.

(a) TRUE                                                   (b) FALSE

 

4. A project team split their data into training and test sets. Using their training data and cross-validation, they chose the best parameter setting. They built a model using these parameters and their training data, and then reported their error on the test data.

(a) CORRECT                                           (b) NOT CORRECT

 

5. Reducing the number of leaves in a decision tree will increase the bias and decrease the variance.

(a) TRUE                                                   (b) FALSE

 

6. Increasing the number of training examples in logistic regression will eventually decrease the bias and increase the variance.

(a) TRUE                                                   (b) FALSE

 

7. A classifier that attains 100% accuracy on the training set and 70% accuracy on the test set is better than a classifier that attains 70% accuracy on the training set and 75% accuracy on the test set.

(a) TRUE                                                   (b) FALSE

 

8. If you train a linear regression estimator with only half the data, the bias will be smaller.

(a) TRUE                                                   (b) FALSE

 

9. In a machine learning algorithm, if the number of parameters grows with the amount of training data, then the model is non-parametric.

(a) TRUE                                                   (b) FALSE

 

10. Suppose your model demonstrates high bias across different training sets. Increasing the model complexity would reduce the bias.

(a) TRUE                                                   (b) FALSE

 

Answers:

1) FALSE (maximizing information gain at each node is a greedy strategy: it is locally optimal at every split but does not guarantee a globally optimal tree; see the XOR sketch after the answers)

2) TRUE (with plenty of training data, the discriminative Logistic Regression is not hurt by the violated independence assumption, whereas Naive Bayes remains biased by it)

3) FALSE (training error is an optimistic estimate of test error; low training error says little about the generalization performance of the model. To show that a method is good, they should report the error on an independent test set; see the train-vs-test sketch after the answers)

4) CORRECT (the test set was never used during parameter selection, so the reported test error is a fair estimate of generalization; see the workflow sketch after the answers)

5) TRUE (fewer leaves means a simpler, more constrained tree: higher bias, lower variance; see the pruning sketch after the answers)

6) FALSE (the bias does not change, and the variance will decrease)

7) FALSE (the second classifier has better test accuracy, which better reflects the true accuracy, whereas the first classifier is overfitting; see the train-vs-test sketch after the answers)

8) FALSE (bias depends on the model class used, not on the amount of training data)

9) TRUE (k-nearest neighbors is the classic example: the stored training points act as the parameters; see the 1-NN sketch after the answers)

10) TRUE (consistently high bias means the model is underfitting; a more complex model can capture the underlying pattern better)
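
The sketches below illustrate some of the answers. Each is a minimal, hedged example, not a definitive implementation. First, for answer 1: on the XOR function, every single-feature split has zero information gain, so the greedy criterion gets no signal at the root even though a depth-2 tree classifies the data perfectly. The data and the entropy helper are written just for this post.

```python
# Sketch for answer 1: greedy information gain sees nothing useful on XOR.
import numpy as np

def entropy(y):
    p = np.bincount(y) / len(y)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # y = XOR of the two features

for j in (0, 1):
    left, right = y[X[:, j] == 0], y[X[:, j] == 1]
    gain = entropy(y) - (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
    print(f"split on feature {j}: information gain = {gain}")  # 0.0 both times
```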
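For answers 3 and 7, a minimal scikit-learn sketch (the synthetic dataset and the unpruned tree are illustrative assumptions, not a prescribed setup) showing how near-perfect training accuracy can coexist with much weaker test accuracy:

```python
# Sketch for answers 3 and 7: training accuracy is an optimistic estimate.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unpruned tree can memorize the training set almost perfectly...
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("train accuracy:", tree.score(X_train, y_train))  # typically ~1.0

# ...while the test accuracy, the number that actually matters, is lower.
print("test accuracy:", tree.score(X_test, y_test))
```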
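For answer 4, a sketch of the workflow the team followed, assuming scikit-learn; the model (SVC) and the parameter grid are illustrative stand-ins. The point is the shape of the procedure: cross-validation for tuning touches only the training split, and the test set is used exactly once for the final report.

```python
# Sketch for answer 4: tune on the training split, report once on test.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: choose parameters by cross-validation on the training split only.
search = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=5)
search.fit(X_train, y_train)

# Step 2: report a single number on the untouched test set.
print("chosen C:", search.best_params_["C"])
print("test accuracy:", search.score(X_test, y_test))
```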
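For answer 5, a rough sketch of the leaf-count trade-off, again on an illustrative synthetic dataset; the fold-to-fold standard deviation is used here only as a crude proxy for variance:

```python
# Sketch for answer 5: capping max_leaf_nodes constrains the tree,
# trading variance (spread across folds) for bias (lower mean accuracy).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

for leaves in (4, 16, 64, None):  # None means the tree grows unrestricted
    tree = DecisionTreeClassifier(max_leaf_nodes=leaves, random_state=0)
    scores = cross_val_score(tree, X, y, cv=5)
    print(f"max_leaf_nodes={leaves}: "
          f"mean accuracy={scores.mean():.3f}, spread={scores.std():.3f}")
```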
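For answer 9, a self-contained toy 1-nearest-neighbor classifier (written for this post, not a library API): the fitted "model" is just the stored training set, so the number of stored parameters grows with the data, which is the defining trait of a non-parametric method.

```python
# Sketch for answer 9: 1-NN stores the whole training set as its "parameters".
import numpy as np

class OneNN:
    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y)  # store everything
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        # Squared distance from every query point to every stored point.
        d = ((X[:, None, :] - self.X[None, :, :]) ** 2).sum(axis=-1)
        # The label of the closest stored point wins.
        return self.y[d.argmin(axis=1)]

clf = OneNN().fit([[0, 0], [1, 1]], [0, 1])
print(clf.predict([[0.2, 0.1], [0.9, 0.8]]))  # -> [0 1]
```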

*********************

