Which of the following statements is not true about stochastic gradient descent?
Correct Answer: It is computationally slower
Stochastic gradient descent (SGD) is not computationally slower; it is faster per step, since it processes only one sample at a time. The other three statements are genuine disadvantages of SGD: the frequent single-sample updates produce noisy gradient steps, that noise can make convergence to the minimum slow, and the sheer number of updates makes a full pass over the data computationally expensive overall.
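The contrast above can be sketched in code. The snippet below is a minimal illustration (with assumed data: a small synthetic least-squares problem), showing that a single-sample SGD step touches only one row of the data, while a batch gradient touches all of it, yet repeated noisy SGD steps still approach the true weights.

```python
import numpy as np

# Assumed synthetic setup: linear regression y = X w + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

def batch_grad(w):
    # Full-batch gradient of mean squared error: uses ALL 100 samples per step.
    return 2 * X.T @ (X @ w - y) / len(y)

def sgd_epoch(w, lr=0.05):
    # One pass of SGD: each update uses ONE sample (cheap per step, but noisy).
    for i in rng.permutation(len(y)):
        grad_i = 2 * X[i] * (X[i] @ w - y[i])  # gradient from a single sample
        w = w - lr * grad_i
    return w

w = np.zeros(3)
for _ in range(20):
    w = sgd_epoch(w)

# Despite the noisy steps, w ends up close to true_w,
# and the full-batch gradient there is near zero.
print(np.round(w, 2))
print(np.linalg.norm(batch_grad(w)))
```

Each SGD update costs O(d) for one sample versus O(n·d) for a full-batch step, which is why SGD is faster per update; the trade-off is the noise visible in its trajectory.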
মোঃ আরিফুল ইসলাম
Feb 20, 2025