True or False: A too-large weight initialization in a neural network can lead to vanishing gradients, whereas a too-small weight initialization can lead to exploding gradients.

Correct Answer: False

The statement reverses the two effects. When the gradients of the cost with respect to the parameters become too large, the result is the exploding gradient problem; when they become too small, the result is the vanishing gradient problem. Too-large initial weights amplify the gradient at every layer during backpropagation, leading to exploding gradients, while too-small initial weights shrink it at every layer, leading to vanishing gradients.
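As a rough illustration, here is a minimal sketch (an assumption for this answer: a plain-NumPy simulation of a 50-layer linear network, not any particular framework) that multiplies a unit gradient vector by randomly initialized weight matrices, the way backpropagation chains layer weights together. An initialization scale above 1 makes the gradient norm blow up, while a scale below 1 drives it toward zero:

```python
import numpy as np

# Sketch: backpropagated gradients through a deep linear network are a
# product of weight matrices, so the initialization scale compounds
# multiplicatively with depth. (Hypothetical setup for illustration.)
np.random.seed(0)
depth, width = 50, 100

for scale, label in [(1.5, "too-large init"), (0.5, "too-small init")]:
    grad = np.ones(width)  # start from a unit gradient vector
    for _ in range(depth):
        # Entries have variance scale**2 / width, so each layer
        # multiplies the gradient norm by roughly `scale`.
        W = np.random.randn(width, width) * scale / np.sqrt(width)
        grad = W.T @ grad  # chain rule: multiply by the layer's weights
    print(f"{label}: gradient norm after {depth} layers = "
          f"{np.linalg.norm(grad):.3e}")
```

On a typical run, the large-scale initialization yields a norm on the order of 10^8 (roughly 1.5^50) while the small-scale one collapses to around 10^-15 (roughly 0.5^50), matching the exploding and vanishing behavior described above.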
