Which of these is not a supported method of putting data into a partitioned table?
A. If you have existing data in a separate file for each day, then create a partitioned table and upload each file into the appropriate partition.
B. Run a query to get the records for...
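The remaining options are truncated above, but it may help to see what option A's approach looks like in practice. This is a minimal sketch using the google-cloud-bigquery Python client, assuming an ingestion-time partitioned table and hypothetical project, dataset, and file names; the `$YYYYMMDD` suffix is BigQuery's partition decorator for targeting a single day's partition.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical names; the "$20240115" suffix is BigQuery's partition decorator,
# which targets that day's partition of an ingestion-time partitioned table.
table_id = "my_project.my_dataset.daily_sales$20240115"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
)

with open("sales_2024-01-15.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file, table_id, job_config=job_config
    )

load_job.result()  # block until the load job completes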
Which of the following IAM roles does your Compute Engine account require to be able to run pipeline jobs?
A. dataflow.worker
B. dataflow.compute
C. dataflow.developer
D. dataflow.viewer
Answer: A
Explanation: The dataflow.worker role provides the permissions necessary for a Compute Engine service account to execute work units for a Dataflow...
How would you query specific partitions in a BigQuery table?
A. Use the DAY column in the WHERE clause
B. Use the EXTRACT(DAY) clause
C. Use the _PARTITIONTIME pseudo-column in the WHERE clause
D. Use DATE BETWEEN in the WHERE clause
Answer: C
Explanation: Partitioned tables include a pseudo column...
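A minimal sketch of such a query with the google-cloud-bigquery Python client, assuming a hypothetical ingestion-time partitioned table `my_project.my_dataset.events`; the `_PARTITIONTIME` filter limits the scan to the named partitions.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical ingestion-time partitioned table; _PARTITIONTIME is the pseudo
# column, so this WHERE clause restricts the scan to three daily partitions.
query = """
    SELECT *
    FROM `my_project.my_dataset.events`
    WHERE _PARTITIONTIME BETWEEN TIMESTAMP('2024-01-01')
                             AND TIMESTAMP('2024-01-03')
"""

for row in client.query(query).result():
    print(row)
```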
Which software libraries are supported by Cloud Machine Learning Engine?
A. Theano and TensorFlow
B. Theano and Torch
C. TensorFlow
D. TensorFlow and Torch
Answer: C
Explanation: Cloud ML Engine mainly does two things:
- Enables you to train machine learning models at scale by running TensorFlow training applications in...
To give a user read permission for only the first three columns of a table, which access control method would you use?
A. Primitive role
B. Predefined role
C. Authorized view
D. It's not possible to give access to only the first three columns of a table.
Answer: C
Explanation: ...
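A minimal sketch of the authorized-view pattern with the google-cloud-bigquery Python client, assuming hypothetical project, dataset, and column names: the view exposes only the first three columns, and the view's dataset is then authorized on the source dataset so the user can be granted access to the view without access to the underlying table.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical names. The view lives in its own dataset and selects only the
# first three columns of the source table.
view = bigquery.Table("my_project.shared_views.orders_first_three")
view.view_query = """
    SELECT order_id, customer_name, order_date
    FROM `my_project.private_data.orders`
"""
view = client.create_table(view)

# Authorize the view on the source dataset so it can read the table on behalf
# of users who only have access to the view.
source = client.get_dataset("my_project.private_data")
entries = list(source.access_entries)
entries.append(
    bigquery.AccessEntry(None, "view", view.reference.to_api_repr())
)
source.access_entries = entries
client.update_dataset(source, ["access_entries"])
```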
Which operation is best suited for the above data processing requirement?
You are planning to use Google's Dataflow SDK to analyze customer data such as displayed below. Your project requirement is to extract only the customer name from the data source and then write to an output PCollection.
Tom, 555 X street
Tim, 553 Y street
Sam, 111 Z street
Which operation is...
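The answer options are truncated above, but an element-wise transform such as ParDo is the kind of operation this calls for. Here is a minimal, hedged sketch using the Apache Beam Python SDK (the successor to the Dataflow SDK), with the sample records inlined:

```python
import apache_beam as beam


class ExtractName(beam.DoFn):
    """Emit only the customer name from a 'name,address' record."""

    def process(self, element):
        yield element.split(",")[0].strip()


with beam.Pipeline() as pipeline:
    names = (
        pipeline
        | "CreateRecords" >> beam.Create([
            "Tom,555 X street",
            "Tim,553 Y street",
            "Sam, 111 Z street",
        ])
        | "ExtractName" >> beam.ParDo(ExtractName())  # element-wise transform
        | "Print" >> beam.Map(print)
    )
```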
What should you do?
You have a Google Cloud Dataflow streaming pipeline running with a Google Cloud Pub/Sub subscription as the source. You need to make an update to the code that will make the new Cloud Dataflow pipeline incompatible with the current version. You do not want to lose any data when making this...
Which approach meets the requirements?
You need to compose visualizations for operations teams with the following requirements:
Which approach meets the requirements?
A. Load the data into Google Sheets, use formulas to calculate a metric, and use filters/sorting to show only suboptimal links in a table.
B. Load the data into Google BigQuery tables, write...
How should you deduplicate the data most efficiently?
Your company uses a proprietary system to send inventory data every 6 hours to a data ingestion service in the cloud. Transmitted data includes a payload of several fields and the timestamp of the transmission. If there are any concerns about a transmission, the system re-transmits the data. How...
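The answer options are not shown above, but as an illustration of the general idea, here is a hedged sketch in the Apache Beam Python SDK that deduplicates re-transmitted records by keying on the payload fields and ignoring the transmission timestamp; all field names are hypothetical, and this is not necessarily the exam's intended approach.

```python
import apache_beam as beam

# Hypothetical records: the first two are the same payload, re-transmitted
# with a different timestamp.
records = [
    {"sku": "A1", "qty": 10, "warehouse": "east", "sent_at": "2024-01-15T00:00:00Z"},
    {"sku": "A1", "qty": 10, "warehouse": "east", "sent_at": "2024-01-15T06:00:00Z"},
    {"sku": "B2", "qty": 3, "warehouse": "west", "sent_at": "2024-01-15T00:00:00Z"},
]


def payload_key(record):
    # Key on the payload fields only, ignoring the transmission timestamp.
    return (record["sku"], record["qty"], record["warehouse"])


with beam.Pipeline() as pipeline:
    deduplicated = (
        pipeline
        | "Read" >> beam.Create(records)
        | "KeyByPayload" >> beam.Map(lambda r: (payload_key(r), r))
        | "GroupDuplicates" >> beam.GroupByKey()
        | "KeepOne" >> beam.Map(lambda kv: next(iter(kv[1])))
        | "Print" >> beam.Map(print)
    )
```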
Which of the following is NOT one of the three main types of triggers that Dataflow supports?
A. Trigger based on element size in bytes
B. Trigger that is a combination of other triggers
C. Trigger based on element count
D. Trigger based on time
Answer: A
Explanation: There are...
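For reference, a minimal Apache Beam Python sketch that exercises the supported trigger families: a time-based AfterWatermark trigger, a data-driven AfterCount trigger, and their combination as a composite trigger. The bounded Create source is just a stand-in for a streaming input.

```python
import apache_beam as beam
from apache_beam import window
from apache_beam.transforms import trigger

# Time-based (AfterWatermark), data-driven (AfterCount), and composite trigger.
with beam.Pipeline() as pipeline:
    per_window_sums = (
        pipeline
        | "Source" >> beam.Create(list(range(10)))  # stand-in for a stream
        | "Window" >> beam.WindowInto(
            window.FixedWindows(60),                 # one-minute windows
            trigger=trigger.AfterWatermark(
                early=trigger.AfterCount(100)),      # composite: time + count
            accumulation_mode=trigger.AccumulationMode.DISCARDING)
        | "Sum" >> beam.CombineGlobally(sum).without_defaults()
        | "Print" >> beam.Map(print)
    )
```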