Simple gradient descent is a better batch optimization method than conjugate gradients and quasi-Newton methods. Correct Answer: False
Conjugate gradient and quasi-Newton methods are more robust and faster batch optimization methods than simple gradient descent. In these algorithms the error function decreases at every iteration (unless the weight vector has already reached a local or global minimum), whereas simple gradient descent offers no such guarantee: with a poorly chosen step size the error can increase from one iteration to the next.
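The contrast can be sketched on a small quadratic error function. The example below (an illustrative sketch, not from the source; the matrix, step size, and iteration counts are chosen only for demonstration) runs linear conjugate gradients with an exact line search, where the error provably drops at every step, next to fixed-step gradient descent, where an over-large step makes the error grow.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def f(A, b, x):
    # Quadratic error function f(x) = 0.5 x^T A x - b^T x, gradient A x - b.
    return 0.5 * dot(x, matvec(A, x)) - dot(b, x)

def conjugate_gradient(A, b, x0, iters):
    # Linear CG with exact line search: f decreases at every iteration
    # until the minimum is reached (in at most n steps for an n x n SPD A).
    x = x0[:]
    r = [bi - gi for bi, gi in zip(b, matvec(A, x))]  # residual = -gradient
    p = r[:]
    history = [f(A, b, x)]
    for _ in range(iters):
        Ap = matvec(A, p)
        alpha = dot(r, r) / dot(p, Ap)            # exact line search step
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r_new = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        beta = dot(r_new, r_new) / dot(r, r)      # Fletcher-Reeves update
        p = [ri + beta * pi for ri, pi in zip(r_new, p)]
        r = r_new
        history.append(f(A, b, x))
    return x, history

def gradient_descent(A, b, x0, step, iters):
    # Fixed-step gradient descent: nothing prevents f from increasing
    # when the step exceeds the stable range for this problem.
    x = x0[:]
    history = [f(A, b, x)]
    for _ in range(iters):
        g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient A x - b
        x = [xi - step * gi for xi, gi in zip(x, g)]
        history.append(f(A, b, x))
    return x, history

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]
x_cg, hist_cg = conjugate_gradient(A, b, [0.0, 0.0], 2)
x_gd, hist_gd = gradient_descent(A, b, [0.0, 0.0], 0.6, 2)

# CG lowers the error at every iteration; GD with this step raises it.
assert all(later < earlier for earlier, later in zip(hist_cg, hist_cg[1:]))
assert hist_gd[1] > hist_gd[0]
```

For this 2-variable problem CG reaches the exact minimizer in two iterations, while the step 0.6 exceeds the stable range for gradient descent (roughly 2 divided by the largest eigenvalue of A, about 0.43 here), so its error grows instead of shrinking.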
মোঃ আরিফুল ইসলাম
Feb 20, 2025