You have been asked to develop an input pipeline for an ML training model that processes images from disparate sources at low latency. You discover that your input data does not fit in memory.
How should you create a dataset following Google-recommended best practices?
A. Create a tf.data.Dataset.prefetch transformation.
B. Convert the images to tf.Tensor objects, and then run Dataset.from_tensor_slices().
C. Convert the images to tf.Tensor objects, and then run tf.data.Dataset.from_tensors().
D. Convert the images into TFRecords, store the images in Cloud Storage, and then use the tf.data API to read the images for training.
Answer: D
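Option D follows the Google-recommended pattern: when the data does not fit in memory, write it as TFRecord shards in Cloud Storage and stream it with the tf.data API, so examples are read, decoded, and batched on the fly rather than materialized as in-memory tensors (which is what options B and C require); prefetch in option A is only a transformation applied to an existing dataset, not a way to create one. A minimal sketch of such a pipeline is shown below; the bucket path, feature keys, and image size are illustrative assumptions, not part of the original question.

```python
import tensorflow as tf

# TFRecord shards previously written to Cloud Storage (hypothetical path).
filenames = tf.io.gfile.glob("gs://my-bucket/images/train-*.tfrecord")

feature_spec = {
    "image": tf.io.FixedLenFeature([], tf.string),  # JPEG-encoded bytes
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse_example(serialized):
    """Parse one serialized tf.train.Example into an (image, label) pair."""
    parsed = tf.io.parse_single_example(serialized, feature_spec)
    image = tf.io.decode_jpeg(parsed["image"], channels=3)
    image = tf.image.resize(image, [224, 224]) / 255.0  # assumed input size
    return image, parsed["label"]

dataset = (
    tf.data.TFRecordDataset(filenames, num_parallel_reads=tf.data.AUTOTUNE)
    .map(parse_example, num_parallel_calls=tf.data.AUTOTUNE)
    .shuffle(1_000)
    .batch(32)
    .prefetch(tf.data.AUTOTUNE)  # overlap input preprocessing with training
)

# model.fit(dataset, epochs=...)  # the dataset streams from Cloud Storage
```

Because the pipeline reads shards in parallel and prefetches batches, the training loop never needs the full image collection in memory, which is why this approach scales to datasets from disparate sources while keeping latency low.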