Which loss function should you use?

Your team needs to build a model that predicts whether images contain a driver's license, passport, or credit card. The data engineering team already built the pipeline and generated a dataset composed of 10,000 images with driver's licenses, 1,000 images with passports, and 1,000 images with credit cards. You now have to train a model with the following label map: ['drivers_license', 'passport', 'credit_card'].

Which loss function should you use?
A. Categorical hinge
B. Binary cross-entropy
C. Categorical cross-entropy
D. Sparse categorical cross-entropy

Answer: C

Explanation:

Categorical cross-entropy is a loss function suited to multi-class classification problems, where the target variable can take more than two values. It measures the difference between the true probability distribution over the target classes and the distribution predicted by the model, and is defined as: L = -sum(y_i * log(p_i))

where y_i is the true probability of class i and p_i is the predicted probability of class i. Categorical cross-entropy penalizes the model for incorrect predictions, pushing it to assign high probabilities to the correct classes and low probabilities to the incorrect ones.

This use case is a multi-class classification problem: the target variable has three possible values, ['drivers_license', 'passport', 'credit_card']. The label map assigns each class name an index, so 'drivers_license' corresponds to index 0, 'passport' to index 1, and 'credit_card' to index 2. The model should output a probability distribution over the three classes for each image, and categorical cross-entropy compares that output with the true labels, which it expects as one-hot vectors. Binary cross-entropy (option B) applies only to two-class problems, and categorical hinge (option A) is a margin-based loss used with SVM-style classifiers rather than probability outputs. Sparse categorical cross-entropy (option D) computes the same quantity but expects the labels as integer class indices instead of one-hot vectors. With one-hot encoded labels, categorical cross-entropy is the best loss function for this use case.
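As a concrete illustration, here is a minimal sketch using NumPy and TensorFlow/Keras (an assumption; the question does not name a framework). It computes the loss by hand for a single example and shows a small, hypothetical image classifier compiled with this loss; the architecture and input shape are illustrative, not prescribed by the question.

```python
import numpy as np
import tensorflow as tf

# Hand computation of categorical cross-entropy for one image.
# True label: 'passport' (index 1), one-hot encoded over the
# label map ['drivers_license', 'passport', 'credit_card'].
y_true = np.array([0.0, 1.0, 0.0])
y_pred = np.array([0.2, 0.7, 0.1])       # model's predicted distribution
loss = -np.sum(y_true * np.log(y_pred))  # L = -sum(y_i * log(p_i))
print(f"loss = {loss:.4f}")              # -log(0.7) ≈ 0.3567

# A hypothetical classifier compiled with this loss. The softmax
# output layer has one unit per class, so the model emits a
# probability distribution over the three classes.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",  # option C
              metrics=["accuracy"])

# If the dataset instead stored integer indices (0, 1, 2) rather than
# one-hot vectors, loss="sparse_categorical_crossentropy" (option D)
# would compute the same quantity.
```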
