Which of the following statements is not true about stochastic gradient descent?

Correct Answer: It is computationally slower

Stochastic gradient descent (SGD) is not computationally slower; each update is fast because only one sample is processed at a time. The other three statements describe genuine disadvantages of SGD: the frequent single-sample updates produce noisy gradient steps, and that noise can make convergence to the minimum slow, with the loss oscillating around the minimum rather than settling cleanly.
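The per-sample update described above can be sketched as follows. This is a minimal illustration for linear regression (the variable names, learning rate, and synthetic data are our own choices, not from the question): each step uses the gradient of the loss on a single example, so one update is cheap but noisy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data (illustrative assumption)
X = rng.normal(size=(200, 2))
true_w = np.array([2.0, -3.0])
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = np.zeros(2)      # parameters to learn
lr = 0.05            # learning rate (hand-picked for this toy problem)

for epoch in range(20):
    for i in rng.permutation(len(X)):        # shuffle samples each epoch
        grad = 2 * (X[i] @ w - y[i]) * X[i]  # gradient on ONE sample only
        w -= lr * grad                       # cheap but noisy update

print(w)  # ends near true_w, with some residual SGD noise
```

Because every step looks at one sample, the cost per update is tiny compared with full-batch gradient descent, but the trajectory of `w` jitters around the optimum instead of descending smoothly, which is exactly the noisy-convergence drawback the explanation mentions.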
