You are designing a solution that will copy Parquet files stored in an Azure Blob storage account to an Azure Data Lake Storage Gen2 account.
The data will be loaded daily to the data lake and will use a folder structure of {Year}/{Month}/{Day}/.
You need to design a daily Azure Data Factory data load to minimize the data transfer between the two accounts.
Which two configurations should you include in the design? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Delete the files in the destination before loading new data.
B. Filter by the last modified date of the source files.
C. Delete the source files after they are copied.
D. Specify a file naming pattern for the destination.
Answer: B,D
Explanation:
The task is to copy data from one account to another with two requirements: (1) minimize data transfer, and (2) land the data in the destination's {Year}/{Month}/{Day}/ folder structure. Filtering on the last modified date copies only the files that have changed since the latest load, which minimizes data transfer. Specifying a file naming pattern for the destination places the copied files in the correct location in the Data Lake.
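A minimal sketch of what such a copy activity might look like in Azure Data Factory JSON. The `modifiedDatetimeStart`/`modifiedDatetimeEnd` filter on the Blob source is the documented way to restrict the copy to recently changed files; the activity and dataset names here are hypothetical, and the exact date window would depend on the trigger schedule:

```json
{
  "name": "DailyParquetCopy",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "ParquetSource",
      "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "recursive": true,
        "modifiedDatetimeStart": "@addDays(utcNow(), -1)",
        "modifiedDatetimeEnd": "@utcNow()"
      }
    },
    "sink": {
      "type": "ParquetSink",
      "storeSettings": {
        "type": "AzureBlobFSWriteSettings"
      }
    }
  }
}
```

The destination folder pattern can then be handled by parameterizing the sink dataset's folder path with an expression such as `@formatDateTime(utcNow(), 'yyyy/MM/dd')`, so each daily run writes into the matching {Year}/{Month}/{Day}/ folder.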