You need to ensure the data loading activities in the AnalyticsPOC workspace are executed in the appropriate sequence. The solution must meet the technical requirements.
What should you do?
A. Create a pipeline that has dependencies between activities and schedule the pipeline.
B. Create and schedule a Spark job definition.
C. Create a dataflow that has multiple steps and schedule the dataflow.
D. Create and schedule a Spark notebook.
Answer: A
Explanation:
The technical requirements state that the raw and cleansed data must be fully updated before the dimensional model is populated, which calls for a mechanism that enforces ordered execution. A Microsoft Fabric pipeline with dependencies defined between activities guarantees that each activity runs only after its predecessors complete. Once built, the pipeline can be scheduled to run at the required interval (hourly or daily, depending on the data source). Spark job definitions, dataflows, and notebooks (options B, C, and D) can each be scheduled individually, but none of them natively orchestrates the ordered execution of multiple separate loading activities.
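As an illustration, a pipeline definition expresses this ordering through each activity's dependsOn list. The sketch below assumes the Azure Data Factory-style JSON authoring model that Fabric Data pipelines inherit; the activity names (LoadRawData, CleanseData, BuildDimensionalModel) are hypothetical, and typeProperties and connection details are omitted for brevity.

{
  "name": "AnalyticsPOC_LoadPipeline",
  "properties": {
    "activities": [
      {
        "name": "LoadRawData",
        "type": "Copy",
        "dependsOn": []
      },
      {
        "name": "CleanseData",
        "type": "Copy",
        "dependsOn": [
          { "activity": "LoadRawData", "dependencyConditions": [ "Succeeded" ] }
        ]
      },
      {
        "name": "BuildDimensionalModel",
        "type": "Copy",
        "dependsOn": [
          { "activity": "CleanseData", "dependencyConditions": [ "Succeeded" ] }
        ]
      }
    ]
  }
}

Because each dependencyConditions entry is "Succeeded", BuildDimensionalModel runs only after the upstream load and cleansing steps finish successfully, and the schedule is then attached to the pipeline as a whole rather than to the individual activities.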