Which of the following statements is true about stochastic gradient descent?

Correct Answer: It processes one training example per iteration

Stochastic gradient descent processes one training example per iteration. That is, it updates the weight vector based on a single data point at a time. The other three options describe batch gradient descent, which computes the gradient over the entire training set before each weight update, as sketched below.
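
To make the contrast concrete, here is a minimal sketch in Python (using NumPy and a toy linear-regression problem, both of which are assumptions for illustration and not part of the question): the stochastic step uses one example per update, while the batch step averages the gradient over all examples.

```python
import numpy as np

def sgd_step(w, x_i, y_i, lr=0.01):
    """One stochastic gradient descent update from a single example
    (x_i, y_i) under squared-error loss for a linear model."""
    error = x_i @ w - y_i      # prediction error on this one example
    grad = error * x_i         # gradient of 0.5 * error**2 w.r.t. w
    return w - lr * grad

def batch_gd_step(w, X, y, lr=0.01):
    """One batch gradient descent update: the gradient is averaged
    over the entire training set before the weights change."""
    errors = X @ w - y
    grad = X.T @ errors / len(y)
    return w - lr * grad

# Toy data (hypothetical): 100 examples, 3 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

w = np.zeros(3)
for epoch in range(10):
    for i in rng.permutation(len(y)):   # visit examples in random order
        w = sgd_step(w, X[i], y[i])     # weight update per single example
print(w)  # should approach [1.0, -2.0, 0.5]
```

In the stochastic loop each of the 100 examples triggers its own weight update, whereas calling batch_gd_step once per epoch would produce only one update per pass over the data.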

Related Questions

Which of the following statements is not true about the stochastic gradient descent?
Which of the following statements is not true about stochastic gradient descent?
Which of the following statements is not true about stochastic gradient descent for regularised loss minimisation?
Which of the following is not a variant of stochastic gradient descent?