In stochastic gradient descent, the frequent, high-variance parameter updates cause the loss function to fluctuate heavily.

Correct Answer: True

In stochastic gradient descent, the frequent parameter updates have high variance, which causes the loss function (objective function) to fluctuate with varying intensity. This high variance helps the optimizer escape shallow basins and discover potentially better local minima, but at the same time it complicates convergence to the exact minimum (unstable convergence). A toy comparison is sketched below.
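The fluctuation is easy to see on a small example. The following is a minimal NumPy sketch (the one-dimensional least-squares problem, learning rate, and update counts are illustrative choices, not part of the original question): full-batch gradient descent produces a smooth loss curve, while single-sample SGD produces a jagged one because each gradient is a noisy estimate of the true gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: y = 3x + noise (illustrative setup)
X = rng.normal(size=200)
y = 3.0 * X + rng.normal(scale=0.5, size=200)

def loss(w):
    # Mean squared error over the full dataset
    return np.mean((y - w * X) ** 2)

lr = 0.05

# Full-batch gradient descent: the exact gradient is used,
# so the recorded loss decreases smoothly
w_gd, gd_losses = 0.0, []
for _ in range(50):
    grad = -2.0 * np.mean((y - w_gd * X) * X)
    w_gd -= lr * grad
    gd_losses.append(loss(w_gd))

# Stochastic gradient descent: one sample per update, so each
# gradient is a high-variance estimate and the recorded loss
# fluctuates heavily instead of decreasing smoothly
w_sgd, sgd_losses = 0.0, []
for _ in range(50):
    i = rng.integers(len(X))
    grad = -2.0 * (y[i] - w_sgd * X[i]) * X[i]
    w_sgd -= lr * grad
    sgd_losses.append(loss(w_sgd))

# The spread of SGD's loss values across updates is noticeably larger
print("GD  loss std over updates:", np.std(gd_losses))
print("SGD loss std over updates:", np.std(sgd_losses))
```

Running the sketch, the standard deviation of the SGD loss trajectory is markedly larger than that of full-batch gradient descent, which is exactly the fluctuation the statement describes.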

Related Questions

Which of the following statements is not true about stochastic gradient descent for regularised loss minimisation?
Which of the following statements is true about stochastic gradient descent?
Which of the following statements is not true about the stochastic gradient descent?
Which of the following statements is not true about stochastic gradient descent?