Advanced Database Management System - Tutorials and Notes: Machine Learning TRUE or FALSE Questions with Solution 16


Tuesday, 8 September 2020


Machine Learning TRUE / FALSE Questions - SET 16

 

1. Using the kernel trick, one can get non-linear decision boundaries using algorithms designed originally for linear models.

(a) TRUE                                                   (b) FALSE

Answer: TRUE

The kernel trick lets algorithms that are formulated in terms of inner products (such as the SVM) operate implicitly in a higher-dimensional feature space, without ever computing the mapping explicitly.

The idea is that a decision boundary which is non-linear in the original, lower-dimensional input space can become a linear decision boundary in a suitable higher-dimensional space.

In simple words, the kernel trick turns a non-linear decision boundary into a linear one (in a higher-dimensional space).

This is particularly helpful in SVMs. A linear SVM works well when the data points are linearly separable; when the true boundary is non-linear, a linear SVM cannot separate the classes. In that case, the kernel trick maps the data into a space where a linear separator exists.
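As a concrete illustration, here is a minimal NumPy sketch (the feature map and the hyperplane weights are illustrative hand-picked choices, not the output of a trained model): the XOR labelling is not linearly separable in 2-D, but after the explicit map φ(x1, x2) = (x1, x2, x1·x2) a plain hyperplane separates it.

```python
import numpy as np

# XOR-style data: not linearly separable in the original 2-D space.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])

# Explicit feature map phi(x1, x2) = (x1, x2, x1*x2). The kernel trick
# computes inner products in such a lifted space without materializing it;
# here we materialize it only to make the separation visible.
def phi(X):
    return np.column_stack([X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])

Z = phi(X)

# In the lifted 3-D space, the hyperplane w.z + b = 0 with the
# hand-picked weights below separates the two classes linearly.
w = np.array([1.0, 1.0, -2.0])
b = -0.5
scores = Z @ w + b
print(np.sign(scores))   # agrees with y on every point
```

No linear classifier in the original 2-D space can achieve this, which is exactly the gap the kernel trick closes.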

 

2. Zero correlation between any two random variables implies that the two random variables are independent.

                   (a) TRUE                                                   (b) FALSE

Answer: FALSE

If ρ(X, Y) = 0, we say that X and Y are "uncorrelated". Independence implies zero correlation, but the converse does not hold: a correlation of 0 does not imply independence.

If X and Y are uncorrelated, they can still be dependent. Classic example: let X be uniform on {−1, 0, 1} and Y = X². Then Cov(X, Y) = E[X³] − E[X]E[X²] = 0, yet Y is completely determined by X.

Example: Refer here http://mathforum.org/library/drmath/view/64808.html
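The example can be checked numerically. The sketch below (pure NumPy; the distribution is an illustrative choice) computes the covariance of X and Y = X² over the three equally likely outcomes:

```python
import numpy as np

# X uniform on {-1, 0, 1}; Y = X^2 is a deterministic function of X,
# yet Cov(X, Y) = E[X^3] - E[X]E[X^2] = 0.
x = np.array([-1.0, 0.0, 1.0])   # equally likely outcomes of X
y = x ** 2                        # Y = X^2, fully determined by X

cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov)   # 0.0: uncorrelated

# Dependence is obvious nonetheless: P(Y = 1 | X = 1) = 1,
# while the marginal P(Y = 1) = 2/3.
```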

 

3. In linear SVMs, the optimal weight vector w is a linear combination of training data points.

                   (a) TRUE                                                   (b) FALSE

Answer: TRUE

The optimal weight vector w is a linear combination of the training data points: w = Σᵢ αᵢ yᵢ xᵢ, where xᵢ is a training input, yᵢ ∈ {−1, +1} is its class label, and αᵢ ≥ 0 is the corresponding Lagrange multiplier from the dual problem. Only the support vectors have αᵢ > 0, so w is in fact a combination of the support vectors alone.
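A tiny hand-solved example illustrates the representer form (the dual values αᵢ = 0.5 are worked out by hand for this toy dataset, not produced by a solver):

```python
import numpy as np

# Toy 1-D SVM with two support vectors: x = -1 (y = -1) and x = +1 (y = +1).
# For this symmetric pair the dual solution is alpha_1 = alpha_2 = 0.5,
# and w = sum_i alpha_i * y_i * x_i recovers the primal weight vector.
X = np.array([[-1.0], [1.0]])
y = np.array([-1.0, 1.0])
alpha = np.array([0.5, 0.5])   # dual variables (hand-solved)

w = (alpha * y) @ X            # linear combination of training points
print(w)                       # [1.]: both points sit on the margin, w*x = +/-1
```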

 

 

4. The maximum likelihood estimate for the variance of a univariate Gaussian is unbiased.

                   (a) TRUE                                                   (b) FALSE

Answer: FALSE

The maximum likelihood estimate (MLE) of the variance of a univariate Gaussian, σ̂² = (1/n) Σᵢ (xᵢ − x̄)², is biased. An unbiased estimator can be constructed from it by rescaling (Bessel's correction: divide by n − 1 instead of n).

The MLE introduces a downward bias, E[σ̂²] = ((n − 1)/n) σ², so it underestimates the parameter by σ²/n. The size of the bias is proportional to the population variance, and it decreases as the sample size gets larger.

MLE

Maximum Likelihood Estimation (MLE) is a method for estimating the parameters of a statistical model. It is widely used in machine learning because it is intuitive and easy to set up given the data. The basic idea underlying MLE is to express the likelihood of the data as a function of the model parameters, then find the parameter values that maximize that likelihood.
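A quick simulation makes the bias visible (the sample size, true variance, and seed below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0        # true variance of the Gaussian
n = 5               # small sample size -> bias clearly visible
trials = 200_000    # many repeated samples to estimate the expectation

samples = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
mle = samples.var(axis=1, ddof=0)        # divide by n     (MLE, biased)
unbiased = samples.var(axis=1, ddof=1)   # divide by n - 1 (Bessel-corrected)

print(mle.mean())        # ~ (n-1)/n * sigma2 = 3.2 (underestimate)
print(unbiased.mean())   # ~ sigma2 = 4.0
```

The MLE averages to about (n − 1)/n · σ² = 3.2, matching the bias formula, while the corrected estimator averages to the true variance.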

 

5. With a non-linearly-separable dataset that contains some extra “noise” data points, using an SVM with slack variables to create a soft margin classifier, and a small value for the penalty parameter, C, that controls how much to penalize misclassified points, will often reduce overfitting the training data.

(a) TRUE                                                   (b) FALSE

Answer: TRUE

A small C means the penalty for misclassifying a few points is small, so the optimizer is more likely to maximize the margin between most of the points while misclassifying a few of them, including the noise points. The resulting boundary is simpler and generalizes better; a very large C, by contrast, would force the boundary to bend around the noise and overfit.
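A small numerical sketch of the soft-margin objective ½‖w‖² + C Σᵢ ξᵢ illustrates the effect (the dataset and the two candidate classifiers are hand-picked for illustration, not fitted by a solver):

```python
import numpy as np

def objective(w, b, X, y, C):
    # Soft-margin SVM objective: 0.5*||w||^2 + C * (sum of hinge losses).
    margins = y * (X @ w + b)
    slack = np.maximum(0.0, 1.0 - margins)
    return 0.5 * (w @ w) + C * slack.sum()

# Two clean clusters plus one "noise" negative point near the positives.
X = np.array([[2.0, 0.0], [2.0, 1.0],   # positive class
              [0.0, 0.0], [0.0, 1.0],   # negative class
              [2.0, 3.0]])              # noise point, labelled negative
y = np.array([1, 1, -1, -1, -1])

# Candidate A: wide margin, but misclassifies the noise point.
wA, bA = np.array([1.0, 0.0]), -1.0
# Candidate B: tilted, narrower margin, classifies every point correctly.
wB, bB = np.array([1.0, -1.0]), -0.5

for C in (0.1, 10.0):
    print(C, objective(wA, bA, X, y, C), objective(wB, bB, X, y, C))
# Small C:  A has the lower objective -> the wide margin wins despite
#           misclassifying the noise point (less overfitting).
# Large C:  B has the lower objective -> the boundary bends to fit the noise.
```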

 

 
