
Linear and Logistic Regression Implementation Questions

Covers the fundamentals and implementation details of linear regression for continuous prediction and logistic regression for binary or multiclass classification. Candidates should understand model formulation, hypothesis functions, and the intuition behind fitting a line or hyperplane for regression and applying a sigmoid or softmax function for classification. Topics include loss functions such as mean squared error for regression and cross-entropy loss for classification, optimization methods including gradient descent and its variants, regularization techniques, feature engineering and scaling, evaluation metrics such as mean absolute error, accuracy, and area under the curve, and hyperparameter selection and validation strategies. Expect discussion of practical implementation using numerical libraries and machine learning toolkits, the trade-offs and limitations of each approach, numerical stability, and common pitfalls such as underfitting and overfitting.

Medium · Technical
You have a binary classification problem in which only 0.5% of examples belong to the positive class. Discuss strategies for training and evaluating logistic regression under severe class imbalance: resampling (oversampling, undersampling, SMOTE), class weighting in the loss, threshold adjustment, the metrics to optimize, and specialized loss functions such as focal loss. Explain the operational trade-offs of each approach.
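A minimal sketch of two of these levers, class weighting and threshold adjustment, using scikit-learn; the synthetic dataset, the "balanced" weight setting, and the best-F1 threshold choice are illustrative assumptions, not part of the question:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, precision_recall_curve
from sklearn.model_selection import train_test_split

# Synthetic data with roughly a 0.5% positive rate (illustrative assumption).
X, y = make_classification(n_samples=200_000, n_features=20,
                           weights=[0.995, 0.005], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.25, random_state=0)

# Class weighting: reweight the loss instead of resampling the data.
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
clf.fit(X_train, y_train)

# Evaluate with a ranking metric that stays informative under imbalance.
proba = clf.predict_proba(X_test)[:, 1]
print("average precision (PR-AUC):", average_precision_score(y_test, proba))

# Threshold adjustment: pick an operating point from the precision-recall
# curve instead of using the default 0.5 cutoff.
precision, recall, thresholds = precision_recall_curve(y_test, proba)
f1 = 2 * precision * recall / np.clip(precision + recall, 1e-12, None)
best = np.argmax(f1[:-1])  # thresholds has one fewer entry than precision/recall
print("best-F1 threshold:", thresholds[best])
```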
Easy · Technical
Explain conceptually the differences between L1 and L2 regularization for linear models: show the penalty terms, explain sparsity versus shrinkage effects and the convexity and differentiability properties of each, and describe typical use cases where L1 is preferred over L2 and vice versa. Also state how to exclude the intercept from regularization.
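A small illustration of the sparsity-versus-shrinkage contrast: fit L1- and L2-penalized logistic regression on the same synthetic data and count zeroed coefficients. The data, solver, and C value are assumptions for the example; note that scikit-learn's non-liblinear solvers leave the fitted intercept out of the penalty, and in a hand-rolled implementation you simply omit the bias term from the penalty sum.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=30, n_informative=5,
                           random_state=0)

# L1 penalty: non-differentiable at zero, tends to drive coefficients exactly to 0.
l1 = LogisticRegression(penalty="l1", solver="saga", C=0.1, max_iter=10_000)
l1.fit(X, y)

# L2 penalty: smooth and strictly convex, shrinks coefficients toward 0
# without typically zeroing them out.
l2 = LogisticRegression(penalty="l2", solver="saga", C=0.1, max_iter=10_000)
l2.fit(X, y)

print("L1 zero coefficients:", int(np.sum(l1.coef_ == 0)), "of", l1.coef_.size)
print("L2 zero coefficients:", int(np.sum(l2.coef_ == 0)), "of", l2.coef_.size)
```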
Hard · Technical
Implement a numerically stable softmax cross-entropy loss with per-class weighting in NumPy. Given logits of shape (n_samples, n_classes) and integer labels, compute both the scalar loss (weighted average) and the gradient with respect to logits, suitable for backpropagation. Explain how class weights affect gradient magnitudes and optimization.
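A sketch of one possible NumPy implementation, using the max-shift (log-sum-exp) trick for stability; the function and argument names are my own, and normalizing the scalar loss by the summed weights is one common convention for "weighted average" that should be checked against the exact spec:

```python
import numpy as np

def weighted_softmax_cross_entropy(logits, labels, class_weights):
    """logits: (n, k) floats; labels: (n,) ints in [0, k); class_weights: (k,) floats.

    Returns (loss, dlogits): the weighted-average negative log-likelihood and
    its gradient with respect to the logits.
    """
    n = logits.shape[0]

    # Numerically stable log-softmax: shift each row by its max before exponentiating.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    probs = np.exp(log_probs)

    # Per-sample weight taken from the true class: a larger weight scales up
    # that sample's contribution to both the loss and the gradient.
    w = class_weights[labels]                        # (n,)
    nll = -log_probs[np.arange(n), labels]           # (n,)
    loss = np.sum(w * nll) / np.sum(w)               # weighted average (assumed convention)

    # Gradient of the weighted-average loss w.r.t. logits:
    # per sample, w_i * (softmax_i - one_hot_i) / sum(w).
    one_hot = np.zeros_like(probs)
    one_hot[np.arange(n), labels] = 1.0
    dlogits = (w[:, None] * (probs - one_hot)) / np.sum(w)
    return loss, dlogits
```

In this form the effect of class weights on optimization is explicit: heavily weighted classes contribute proportionally larger gradient magnitudes, pulling parameter updates toward fitting those classes.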
Hard · Technical
Design an A/B testing framework to compare two logistic regression models in production. Specify the unit of randomization, primary and guardrail metrics, sample size and statistical power calculations, handling of multiple comparisons or sequential testing, segmentation analysis, and how to prevent data leakage and contamination during the experiment.
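The sample-size piece of such a design can be sketched with a standard two-proportion power calculation, here via statsmodels; the baseline conversion rate, expected lift, significance level, and power are all illustrative assumptions:

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Assumed numbers for illustration: 2.0% baseline conversion, aiming to
# detect an absolute lift of 0.2 percentage points from the new model.
p_control, p_treatment = 0.020, 0.022
effect = proportion_effectsize(p_treatment, p_control)  # Cohen's h

n_per_arm = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,            # two-sided significance level
    power=0.80,            # 1 - beta
    ratio=1.0,             # equal-sized control and treatment arms
    alternative="two-sided",
)
print(f"required users per arm: {n_per_arm:,.0f}")
```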
Medium · Technical
Implement a numerically stable softmax function in Python using the log-sum-exp trick. The implementation must accept a 2D NumPy array of logits with shape (n_samples, n_classes), be robust to large logits such as [1000, 1001, 999], and return normalized probabilities per row.
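A minimal sketch of the standard approach, subtracting each row's maximum so the exponentials stay in range (the function name is my own):

```python
import numpy as np

def stable_softmax(logits):
    """Row-wise softmax for a (n_samples, n_classes) array of logits."""
    # Subtracting the per-row max leaves the result mathematically unchanged
    # but keeps np.exp from overflowing on inputs like [1000, 1001, 999].
    shifted = logits - np.max(logits, axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=1, keepdims=True)

print(stable_softmax(np.array([[1000.0, 1001.0, 999.0]])))
# -> approximately [[0.2447, 0.6652, 0.0900]]
```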
