In stochastic gradient descent, the high-variance, frequent parameter updates cause the loss function to fluctuate heavily.
Correct Answer: True
In stochastic gradient descent, the frequent parameter updates have high variance, causing the loss (objective) function to fluctuate with varying intensity. These high-variance updates can help the optimizer discover better local minima, but at the same time they complicate convergence to the exact minimum, making it unstable.
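The fluctuation is easy to observe directly. The following is a minimal NumPy sketch (illustrative, not from the original): it fits a single-parameter linear model with pure SGD (one randomly chosen sample per update) and records the full-dataset loss after each step, which bounces up and down even as the parameter converges.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear data: y = 3x + small noise (values are arbitrary choices)
X = rng.uniform(-1, 1, 200)
y = 3 * X + rng.normal(0, 0.1, 200)

w = 0.0          # single model parameter
lr = 0.1         # learning rate
losses = []      # full-dataset loss after each update

# Pure SGD: each update uses the gradient of ONE sample's squared error,
# so the update direction is a noisy (high-variance) estimate of the true gradient.
for step in range(200):
    i = rng.integers(len(X))
    pred = w * X[i]
    grad = 2 * (pred - y[i]) * X[i]   # d/dw of (pred - y)^2 for sample i
    w -= lr * grad
    losses.append(np.mean((w * X - y) ** 2))

print(f"final w = {w:.2f}")
```

Plotting `losses` would show the characteristic jagged curve: the overall trend is downward, but many individual steps actually increase the loss, which is exactly the fluctuation the question describes.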
Md. Ariful Islam
Feb 20, 2025