During batch training of a neural network, you notice that the loss oscillates.
How should you adjust your model to ensure that it converges?
A. Increase the size of the training batch
B. Decrease the size of the training batch
C. Increase the learning rate hyperparameter
D. Decrease the learning rate hyperparameter
Answer: D
Explanation:
An oscillating training loss usually means the learning rate is too high: each gradient-descent step overshoots the minimum, so the loss bounces back and forth instead of settling. Decreasing the learning rate hyperparameter produces smaller steps, which lets the loss descend smoothly and converge. Increasing the learning rate would make the overshoot worse, and changing the batch size mainly affects gradient noise rather than the step size that causes the overshoot.
Reference: https://developers.google.com/machine-learning/crash-course/introduction-to-neural-networks/playground-exercises
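To make the overshoot effect concrete, here is a minimal, illustrative sketch (not part of the original question): plain (sub)gradient descent on a one-parameter mean-absolute-error loss f(w) = |w - target|. The `train` helper, the target value, and the two learning rates are made up for the demo; with the large rate the updates jump past the minimum every step and the loss keeps bouncing, while the smaller rate converges.

```python
# Illustrative sketch: effect of the learning rate on convergence for
# f(w) = |w - target|, whose (sub)gradient is sign(w - target).

def train(lr, w=0.0, target=3.0, steps=12):
    """Run plain gradient descent and return the loss after each step."""
    losses = []
    for _ in range(steps):
        grad = 0.0 if w == target else (1.0 if w > target else -1.0)
        w -= lr * grad                  # gradient-descent update
        losses.append(abs(w - target))  # loss after the step
    return losses

# Learning rate too high: each step overshoots the minimum, so the loss
# oscillates between two values and never converges.
print("lr=2.5:", train(lr=2.5))   # [0.5, 2.0, 0.5, 2.0, ...]

# Smaller learning rate: steps are small enough for the loss to reach
# the minimum and stay there.
print("lr=0.5:", train(lr=0.5))   # [2.5, 2.0, 1.5, 1.0, 0.5, 0.0, ...]
```

The same intuition carries over to neural-network training: lowering the learning rate shrinks each parameter update, damping the oscillation and allowing the loss to converge.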