Exam4Training

You have been asked to develop an input pipeline for an ML training model that processes images from disparate sources at low latency. You discover that your input data does not fit in memory.

How should you create a dataset following Google-recommended best practices?
A. Create a tf.data.Dataset.prefetch transformation.
B. Convert the images to tf.Tensor objects, and then run tf.data.Dataset.from_tensor_slices().
C. Convert the images to tf.Tensor objects, and then run tf.data.Dataset.from_tensors().
D. Convert the images into TFRecords, store the images in Cloud Storage, and then use the tf.data API to read the images for training.

Answer: D
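Explanation: Options B and C require materializing the full dataset as in-memory tensors, which the scenario rules out, and prefetching alone (A) does not solve the storage problem. Google's recommended pattern for large image datasets is to serialize them as TFRecords in Cloud Storage and stream them with the tf.data API. A minimal sketch of that reading pattern is below; the file name and feature keys (`image`, `label`) are illustrative assumptions, and the example writes a local TFRecord file so it is self-contained (in production you would point `TFRecordDataset` at `gs://...` URIs instead).

```python
import tensorflow as tf

# Illustrative writer: real pipelines would store encoded JPEG/PNG bytes
# and upload the resulting .tfrecord files to Cloud Storage.
def write_example_tfrecord(path, num_records=4):
    with tf.io.TFRecordWriter(path) as writer:
        for i in range(num_records):
            image_bytes = bytes([i]) * 8  # placeholder for encoded image data
            example = tf.train.Example(features=tf.train.Features(feature={
                "image": tf.train.Feature(
                    bytes_list=tf.train.BytesList(value=[image_bytes])),
                "label": tf.train.Feature(
                    int64_list=tf.train.Int64List(value=[i])),
            }))
            writer.write(example.SerializeToString())

def parse_fn(serialized):
    # Feature spec must mirror what the writer produced.
    features = {
        "image": tf.io.FixedLenFeature([], tf.string),
        "label": tf.io.FixedLenFeature([], tf.int64),
    }
    return tf.io.parse_single_example(serialized, features)

write_example_tfrecord("train-00000.tfrecord")

# Streaming read: records are parsed in parallel, batched, and prefetched,
# so the full dataset never has to fit in memory.
dataset = (
    tf.data.TFRecordDataset(["train-00000.tfrecord"])  # or gs:// paths
    .map(parse_fn, num_parallel_calls=tf.data.AUTOTUNE)
    .batch(2)
    .prefetch(tf.data.AUTOTUNE)
)

for batch in dataset.take(1):
    print(batch["label"].numpy())
```

Note that `prefetch` (option A) still appears here, but as the final stage of the streaming pipeline rather than as the storage strategy itself.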
