Multiple choice questions in machine learning. Interview and quiz questions for data scientists, with answers explained. What is gradient descent? Gradient descent is an optimization algorithm used to find the parameter values of a function that minimize a cost function.
Machine Learning MCQ - What is learning rate in gradient descent
1. Which of the following statements is true about the learning rate alpha in gradient descent?
a) If alpha is very small, gradient descent will be fast to converge. If alpha is too large, gradient descent will overshoot
b) If alpha is very small, gradient descent can be slow to converge. If alpha is too large, gradient descent will overshoot
c) If alpha is very small, gradient descent can be slow to converge. If alpha is too large, gradient descent can be slow too
d) If alpha is very small, gradient descent will be fast to converge. If alpha is too large, gradient descent will be slow
Answer: (b) If alpha is very small, gradient descent can be slow to converge. If alpha is too large, gradient descent will overshoot
What is learning rate?
Learning rate (alpha) is a hyper-parameter that controls how quickly an algorithm updates its parameter estimates, that is, how fast it learns the values of the parameters. It scales the magnitude of the parameter updates performed during gradient descent. The learning rate is a scalar value that tells the machine how large a step to take toward the minimum at each iteration.
Effect of small and large learning rates
If the learning rate (alpha) is too small, learning can take a very long time to converge. On the other hand, if the learning rate is too large, the updates may overshoot the minimum, and learning may fail to converge at all.
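The effect described above can be demonstrated on a toy problem. The sketch below (a hypothetical example, not part of the question) minimizes f(w) = w^2, whose gradient is 2w, with three different learning rates: a very small one converges slowly, a moderate one converges quickly, and a very large one overshoots and diverges.

```python
def run_gradient_descent(alpha, steps=50, w=5.0):
    """Minimize f(w) = w**2 (gradient 2*w) starting from w = 5.0."""
    for _ in range(steps):
        w -= alpha * 2 * w  # parameter update scaled by the learning rate
    return w

# alpha too small: still far from the minimum (w = 0) after 50 steps
print(run_gradient_descent(alpha=0.01))
# moderate alpha: converges to essentially 0
print(run_gradient_descent(alpha=0.45))
# alpha too large: each step overshoots and w grows without bound
print(run_gradient_descent(alpha=1.1))
```

With alpha = 1.1, each update multiplies w by (1 - 2*alpha) = -1.2, so the iterate alternates in sign while its magnitude grows, which is exactly the overshooting behavior described in answer (b).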
What is gradient descent?
In machine learning, gradient descent is an optimization algorithm used to learn the model parameters. It works iteratively to find a local minimum of a cost function.
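As a concrete sketch of learning model parameters this way, the example below (hypothetical data and settings, assumed for illustration) fits the slope m of a one-parameter linear model y = m*x by iteratively stepping down the gradient of the mean squared error cost.

```python
# Toy data generated from y = 3 * x, so the learned slope should approach 3.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]

def fit_slope(alpha=0.01, steps=200):
    """Learn slope m of y = m*x by gradient descent on the MSE cost."""
    m = 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradient of (1/n) * sum((m*x - y)**2) with respect to m
        grad = (2.0 / n) * sum((m * x - y) * x for x, y in zip(xs, ys))
        m -= alpha * grad  # step against the gradient, scaled by alpha
    return m

print(fit_slope())  # close to the true slope 3.0
```

Each iteration computes the cost gradient over the data and moves the parameter a small step in the opposite direction; after enough iterations the slope settles near the minimizer of the cost.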