All of the following are common optimization techniques in deep learning to determine weights that represent the strength of the connection between artificial neurons EXCEPT?

A. Gradient descent, which starts with arbitrary initial weights and then adjusts them step by step to reduce the error.
B. Momentum, which improves the convergence speed and stability of neural network training.
C. Autoregression, which analyzes and makes predictions about time-series data.
D. Backpropagation, which computes gradients starting from the last layer and working backwards.

Answer: C

Explanation:

Autoregression is not an optimization technique for determining the weights of artificial neurons. The common techniques are gradient descent, which iteratively updates weights to minimize the loss; momentum, which accelerates and stabilizes those updates; and backpropagation, which computes the gradients layer by layer from the output backwards. Autoregression, by contrast, is a statistical modeling approach associated with time-series analysis and forecasting, not with neural network weight optimization.
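The three correct techniques fit together in one update loop: backpropagation supplies the gradient, and gradient descent with momentum uses it to adjust the weight. A minimal sketch, using a single weight and a hand-computed gradient of the toy loss f(w) = (w − 3)² in place of real backpropagation; the learning rate, momentum coefficient, and step count are illustrative choices, not values from the question:

```python
def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2; in a real network,
    # backpropagation would compute this layer by layer (option D).
    return 2 * (w - 3)

w = 0.0          # weight starts at an arbitrary value (option A)
velocity = 0.0   # momentum accumulator (option B)
lr, beta = 0.1, 0.9

for _ in range(200):
    # Momentum blends the previous update direction with the new gradient,
    # then gradient descent applies the combined step to the weight.
    velocity = beta * velocity - lr * grad(w)
    w += velocity

print(w)  # converges toward the minimum at w = 3
```

Note there is no natural place for autoregression in this loop, which is why option C is the exception.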

Reference: AIGP BODY OF KNOWLEDGE, which discusses common optimization techniques used in deep learning.
