Which of the following descriptions of distributed training is wrong?
A. In data parallelism, each computing node trains the complete model based on local data
B. Distributed training can support the training of massive data and complex models
C. Data parallelism and model parallelism are common distributed training methods
D. In model parallelism, each compute node only interacts with its own trained sub-model, and the final model is obtained through model aggregation
Answer: D
Explanation: D is wrong because in model parallelism the model itself is partitioned across nodes, and the nodes must exchange intermediate activations and gradients during training; the final model is the union of the co-trained sub-models, not the result of aggregating independently trained sub-models. (Aggregating full-model replicas trained on different data shards describes data parallelism, as in option A.)
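To see why option A is a correct description of data parallelism, the key property can be sketched with a toy example (all names and the linear model below are illustrative assumptions, not from the question): every worker holds the complete model, computes a gradient on its own data shard, and averaging the per-shard gradients reproduces the gradient over the full dataset.

```python
# Toy sketch of data parallelism: a linear model y = w * x with mean squared
# error. Each "worker" holds the full model (just the scalar w here) and
# computes a gradient on its local shard; the averaged gradient equals the
# gradient computed over all the data when shards are equal-sized.

def grad(w, shard):
    # Gradient of (1/n) * sum (w*x - y)^2 with respect to w
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
w = 0.5

# Split the data across two workers; each still sees the complete model.
shards = [data[:2], data[2:]]
local_grads = [grad(w, s) for s in shards]
avg_grad = sum(local_grads) / len(shards)

# The averaged per-worker gradient matches the full-data gradient.
assert abs(avg_grad - grad(w, data)) < 1e-12
```

In model parallelism, by contrast, no single node could evaluate `grad` alone, because each node would hold only part of the model and would need activations from the others mid-computation.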
Latest H12-921_V1.0-ENU Dumps Valid Version with 239 Q&As