InterviewStack.io

Loss Functions, Behaviors & Selection Questions

Loss function design, evaluation, and selection in machine learning. Includes common loss functions (MSE, cross-entropy, hinge, focal loss), how loss properties affect optimization and gradient flow, issues like class imbalance and label noise, calibration, and practical guidance for choosing the most appropriate loss for a given task and model.

Easy · Technical
Explain the practical difference between using sigmoid + binary cross-entropy per class (multi-label setup) versus softmax + categorical cross-entropy for mutually-exclusive classes. Include examples such as multi-tagging versus single-label classification, describe calibration and thresholding differences for sigmoid outputs, and mention implications for loss selection in production.
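A minimal, dependency-free sketch of the contrast this question is after (plain Python rather than a deep-learning framework; the logits are made up for illustration): per-class sigmoids score each label independently, so several can pass a threshold at once, while softmax forces the classes to compete for a probability mass that sums to 1.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

logits = [2.0, 2.0, -1.0]

# Multi-label view: each class is scored independently, so two tags
# can both exceed a 0.5 threshold at the same time.
sig = [sigmoid(x) for x in logits]

# Single-label view: probabilities compete and sum to 1, so the two
# tied classes split the mass and neither reaches 0.5.
soft = softmax(logits)
```

Note the thresholding consequence: sigmoid outputs usually need per-class thresholds tuned on validation data, whereas softmax predictions are typically taken via argmax.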
Easy · Technical
In PyTorch, implement binary cross-entropy (BCE) from logits as a function: def bce_from_logits(logits, targets): where targets are 0/1 floats. Do not call torch.nn.functional.binary_cross_entropy_with_logits directly; implement the numerically stable expression using log-sum-exp style decomposition. Mention edge cases such as extreme logits and perfect predictions.
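One way the stable expression can be sketched, shown here for a single scalar pair in plain Python so the algebra is visible (the tensorized PyTorch version follows the same identity elementwise):

```python
import math

def bce_from_logits(logit, target):
    """Numerically stable binary cross-entropy for one (logit, 0/1 target) pair.

    The naive form -t*log(sigmoid(x)) - (1-t)*log(1-sigmoid(x)) overflows
    exp() for large negative x and loses precision for large positive x.
    A log-sum-exp style rewrite gives the equivalent, always-finite form:

        max(x, 0) - x*t + log(1 + exp(-|x|))

    Edge cases: for extreme |x| the log1p term underflows to 0 and the loss
    stays finite; a "perfect" confident prediction (large x with t=1, or
    large negative x with t=0) yields a loss approaching 0 without NaNs.
    """
    x, t = logit, target
    return max(x, 0.0) - x * t + math.log1p(math.exp(-abs(x)))
```

For moderate logits this matches the naive formula; for |x| in the thousands the naive formula produces inf/NaN while this form stays finite.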
Easy · Technical
Describe common strategies to address class imbalance at the loss level: class weighting, sample weighting, focal loss, resampling, and two-stage training (e.g., pretrain then finetune on balanced data). For each strategy list pros and cons and give recommendations for a production image classification pipeline with rare classes and limited annotation budget.
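Of the strategies above, focal loss is the one most easily shown in a few lines. A scalar sketch in plain Python (parameter defaults follow the common RetinaNet convention; this is illustrative, not a production implementation):

```python
import math

def focal_loss(p, target, alpha=0.25, gamma=2.0):
    """Focal loss for one predicted positive-class probability p.

    FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t)

    gamma down-weights easy, well-classified examples so rare-class /
    hard examples dominate the gradient; alpha rebalances the classes.
    Setting gamma=0 and alpha_t=1 recovers plain cross-entropy.
    """
    p_t = p if target == 1 else 1.0 - p
    alpha_t = alpha if target == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

The effect to notice: a confident correct prediction (p=0.95 on a positive) contributes almost nothing, while a badly missed positive (p=0.1) still carries a large loss, which is exactly the behavior that helps with heavy class imbalance.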
Medium · System Design
Given a large image classification dataset suspected to contain noisy labels, outline a practical pipeline to detect and mitigate label noise at scale. Include steps for: training schedule to surface noisy examples, loss-based noisy sample identification (small-loss trick), ensembling predictions, robust loss selection, human-in-the-loop relabeling, and operational considerations for iterative retraining in production.
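The small-loss identification step in that pipeline can be sketched as follows (a deliberately minimal version: in practice the keep fraction is usually annealed over training toward the estimated clean-label rate, and losses come from an early-stopped or co-trained model):

```python
def small_loss_selection(per_sample_losses, keep_fraction=0.7):
    """Small-loss trick: deep nets tend to fit clean labels before noisy
    ones, so early in training the highest-loss samples are the most
    likely to be mislabeled.

    Returns (kept_indices, suspect_indices): the lowest-loss
    keep_fraction of samples, and the remainder flagged for
    human-in-the-loop review or down-weighting.
    """
    order = sorted(range(len(per_sample_losses)),
                   key=lambda i: per_sample_losses[i])
    n_keep = int(len(order) * keep_fraction)
    return order[:n_keep], order[n_keep:]
```

The suspect set then feeds the relabeling queue; retraining on kept + relabeled data closes the iteration loop described above.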
Medium · Technical
For an object detection model such as Faster R-CNN, explain the choice of classification and localization losses. Compare softmax cross-entropy versus focal loss for the classification head, and L1/smooth-L1 versus IoU/GIoU/DIoU losses for bounding box regression. Discuss trade-offs for small versus large datasets and dense detectors versus two-stage detectors.
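The IoU-family losses mentioned above can be illustrated with a plain-Python GIoU for axis-aligned boxes (boxes as (x1, y1, x2, y2); a sketch, not a batched production kernel):

```python
def giou(box_a, box_b):
    """Generalized IoU for two axis-aligned boxes (x1, y1, x2, y2).

    GIoU = IoU - |C \ (A u B)| / |C|, where C is the smallest box
    enclosing both. Unlike plain IoU (which is flat at 0 for any
    disjoint pair), GIoU keeps decreasing toward -1 as boxes move
    apart, so the loss 1 - GIoU still provides a useful gradient
    for non-overlapping predictions.
    """
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    iou = inter / union
    # Smallest enclosing box C
    c_w = max(ax2, bx2) - min(ax1, bx1)
    c_h = max(ay2, by2) - min(ay1, by1)
    c_area = c_w * c_h
    return iou - (c_area - union) / c_area
```

Identical boxes give GIoU = 1 (zero loss); disjoint boxes give a negative GIoU, which is precisely where smooth-L1 on raw coordinates and IoU-style losses diverge in behavior.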
