You deploy a real-time inference service for a trained model.
The deployed model supports a business-critical application, and it is important to be able to monitor the data submitted to the web service and the predictions the data generates.
You need to implement a monitoring solution for the deployed model using minimal administrative effort.
What should you do?
A. View the explanations for the registered model in Azure Machine Learning studio.
B. Enable Azure Application Insights for the service endpoint and view logged data in the Azure portal.
C. View the log files generated by the experiment used to train the model.
D. Create an MLflow tracking URI that references the endpoint, and view the data logged by MLflow.
Answer: B
Explanation:
Configure logging with Azure Machine Learning studio
Enabling Azure Application Insights for the service endpoint captures the input data submitted to the web service and the predictions it returns, and the logged data can be viewed in the Azure portal, so it meets the monitoring requirement with minimal administrative effort. You can enable Application Insights directly from Azure Machine Learning studio: when you deploy the model as a web service, select the option to enable Application Insights diagnostics and data collection.
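For an endpoint that is already deployed, Application Insights can also be switched on programmatically. The following is a minimal sketch using the Azure Machine Learning Python SDK v1 (azureml-core); the local workspace config file and the service name "my-service" are assumptions for illustration, not part of the question.

from azureml.core import Workspace
from azureml.core.webservice import Webservice

# Load the workspace from a local config.json (assumed to exist).
ws = Workspace.from_config()

# Attach to the already-deployed web service; "my-service" is a placeholder name.
service = Webservice(ws, name="my-service")

# Enable Application Insights on the existing endpoint. Request data and
# predictions are then logged and can be viewed in the Azure portal.
service.update(enable_app_insights=True)
print(service.state)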