Which of the following statements is false about choosing the learning rate in gradient descent?

Correct Answer: A small learning rate causes training to progress very fast.
If the learning rate is too small, training progresses very slowly because each weight update is tiny, which leads to slow convergence. Conversely, a learning rate that is too large causes the loss function to fluctuate around the minimum and can even cause divergence. So the claim that a small learning rate makes training progress very fast is the false statement.
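To see this behavior concretely, here is a minimal Python sketch (not part of the original answer) of plain gradient descent on the toy quadratic loss f(w) = w^2, whose gradient is 2w and whose minimum is at w = 0. The learning-rate values are illustrative assumptions chosen to show the three regimes.

```python
def gradient_descent(lr, steps=20, w0=1.0):
    """Run gradient descent on f(w) = w^2 and return the iterate history."""
    w = w0
    history = [w]
    for _ in range(steps):
        w -= lr * 2 * w  # update rule: w <- w - lr * f'(w), with f'(w) = 2w
        history.append(w)
    return history

# Too-small learning rate: updates are tiny, so convergence is slow.
print(gradient_descent(lr=0.01)[-1])  # ~0.67 after 20 steps, still far from 0

# Moderate learning rate: converges quickly toward the minimum.
print(gradient_descent(lr=0.1)[-1])   # ~0.01, close to 0

# Too-large learning rate: iterates oscillate with growing magnitude (divergence).
print(gradient_descent(lr=1.1)[:5])   # 1.0, -1.2, 1.44, -1.73, 2.07
```

With lr = 0.01 the iterate shrinks by only 2% per step, matching the "slow convergence" case, while lr = 1.1 makes each step overshoot the minimum by an increasing amount, matching the fluctuation/divergence case described above.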
মোঃ আরিফুল ইসলাম
Feb 20, 2025