For this question, refer to the TerramEarth case study.
TerramEarth has equipped unconnected trucks with servers and sensors to collect telemetry data. Next year they want to use the data to train machine learning models. They want to store this data in the cloud while reducing costs.
What should they do?
A . Have the vehicle’s computer compress the data in hourly snapshots, and store it in a Google Cloud Storage (GCS) Nearline bucket.
B . Push the telemetry data in real-time to a streaming Dataflow job that compresses the data, and store it in Google BigQuery.
C . Push the telemetry data in real-time to a streaming Dataflow job that compresses the data, and store it in Cloud Bigtable.
D . Have the vehicle’s computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket.
Answer: D
Explanation:
Coldline Storage is the best choice for data that you plan to access at most once a year, due to its slightly lower availability, 90-day minimum storage duration, costs for data access, and higher per-operation costs. Because TerramEarth does not plan to touch the telemetry data until model training begins next year, Coldline (intended for data accessed at most once a year) is cheaper than Nearline (intended for data accessed roughly once a month), which rules out option A. Typical Coldline use cases include:
Cold Data Storage – Infrequently accessed data, such as data stored for legal or regulatory reasons, can be stored at low cost as Coldline Storage, and be available when you need it.
Disaster recovery – In the event of a disaster, recovery time is key. Cloud Storage provides low-latency access to data stored as Coldline Storage.
References: https://cloud.google.com/storage/docs/storage-classes
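As an illustration of option D, here is a minimal sketch of compressing an hourly snapshot and uploading it to a Coldline bucket. It assumes a Python environment with the google-cloud-storage client library installed and authenticated; the bucket, file names, and location are hypothetical placeholders, not part of the case study.

```python
# Minimal sketch: compress an hourly telemetry snapshot and upload it to a
# GCS Coldline bucket. Bucket and file names below are hypothetical.
import gzip
import shutil

from google.cloud import storage  # pip install google-cloud-storage

SNAPSHOT = "telemetry_2021-01-01T00.csv"       # hypothetical hourly snapshot
COMPRESSED = SNAPSHOT + ".gz"
BUCKET_NAME = "terramearth-telemetry-archive"  # hypothetical bucket name

# Compress the snapshot locally before upload to reduce storage costs.
with open(SNAPSHOT, "rb") as src, gzip.open(COMPRESSED, "wb") as dst:
    shutil.copyfileobj(src, dst)

client = storage.Client()

# Create the bucket with the COLDLINE storage class (run once).
bucket = storage.Bucket(client, name=BUCKET_NAME)
bucket.storage_class = "COLDLINE"
bucket = client.create_bucket(bucket, location="US")

# Upload the compressed snapshot; objects inherit the bucket's storage class.
blob = bucket.blob(COMPRESSED)
blob.upload_from_filename(COMPRESSED)
```

Objects uploaded to the bucket default to its Coldline storage class, so no per-object class setting is needed; they can later be read for model training, subject to Coldline retrieval costs.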