InterviewStack.io

Linear and Logistic Regression Implementation Questions

Covers the fundamentals and implementation details of linear regression for continuous prediction and logistic regression for binary or multiclass classification. Candidates should understand model formulation, hypothesis functions, and the intuition behind fitting a line or hyperplane for regression and using a sigmoid or softmax function for classification. Topics include loss functions such as mean squared error for regression and cross-entropy loss for classification, optimization methods including gradient descent and its variants, regularization techniques, feature engineering and scaling, evaluation metrics such as mean absolute error, accuracy, and area under the curve (AUC), and hyperparameter selection and validation strategies. Expect discussion of practical implementation using numerical libraries and machine learning toolkits, trade-offs and limitations of each approach, numerical stability, and common pitfalls such as underfitting and overfitting.

Easy · Technical
Implement a function in Python (numpy) that computes mean squared error (MSE) and its gradient with respect to model weights w and bias b for a linear regression model. Inputs: X (n_samples, n_features), y (n_samples,), w (n_features,), b (scalar). Return mse (scalar) and grad_w (n_features,), grad_b (scalar). Provide a fully vectorized implementation and ensure gradients are averaged over samples.
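A possible vectorized answer (one sketch, not the only correct one) computes the residual once and reuses it for both the loss and the gradients:

```python
import numpy as np

def mse_and_grad(X, y, w, b):
    """Vectorized MSE and its gradients for linear regression.

    X: (n_samples, n_features), y: (n_samples,),
    w: (n_features,), b: scalar.
    Returns (mse, grad_w, grad_b), with gradients averaged over samples.
    """
    n = X.shape[0]
    residual = X @ w + b - y              # (n_samples,)
    mse = np.mean(residual ** 2)
    grad_w = (2.0 / n) * (X.T @ residual)  # (n_features,)
    grad_b = (2.0 / n) * residual.sum()    # scalar
    return mse, grad_w, grad_b
```

Computing `residual` once keeps the implementation O(n·d) with no Python-level loops; the 2/n factor comes from differentiating the averaged squared error.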
Medium · Technical
Implement coordinate descent for L1-regularized linear regression (Lasso), assuming standardized feature columns. Provide a Python function lasso_coordinate_descent(X, y, alpha, max_iter, tol) that returns the coefficient vector. Explain why coordinate descent converges for the L1 penalty and how soft-thresholding is used in the updates.
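One possible sketch, assuming the objective (1/(2n))·||y − Xw||² + α·||w||₁ and columns standardized so each has zero mean and unit variance (which makes the per-coordinate curvature term equal to 1 and reduces each update to a pure soft-threshold):

```python
import numpy as np

def soft_threshold(rho, alpha):
    # Closed-form minimizer of the 1-D lasso subproblem.
    return np.sign(rho) * max(abs(rho) - alpha, 0.0)

def lasso_coordinate_descent(X, y, alpha, max_iter=1000, tol=1e-6):
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(max_iter):
        w_old = w.copy()
        for j in range(p):
            # Partial residual with feature j's contribution removed.
            r_j = y - X @ w + w[j] * X[:, j]
            rho = X[:, j] @ r_j / n
            # For standardized columns, (1/n) * sum(X[:, j]**2) == 1,
            # so the update is soft_threshold(rho, alpha) with no division.
            w[j] = soft_threshold(rho, alpha)
        if np.max(np.abs(w - w_old)) < tol:
            break
    return w
```

Convergence holds because the objective is convex and the nondifferentiable L1 term is separable across coordinates, so each exact one-dimensional minimization never increases the objective (Tseng's condition for coordinate descent on separable nonsmooth terms).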
Hard · Technical
Derive the maximum likelihood estimator (MLE) for linear regression under Gaussian noise and show its equivalence to ordinary least squares. State assumptions required for unbiasedness and derive how L2 regularization corresponds to a Gaussian prior on weights (MAP estimate).
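One way the derivation can be sketched (assuming i.i.d. Gaussian noise with variance σ² and, for the MAP step, a zero-mean isotropic Gaussian prior with variance τ²):

```latex
% Gaussian-noise model: y_i = x_i^\top w + \varepsilon_i,
% \varepsilon_i \sim \mathcal{N}(0, \sigma^2) i.i.d.
\log L(w) = -\frac{n}{2}\log(2\pi\sigma^2)
            - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - x_i^\top w\bigr)^2
% Maximizing over w drops the constant term and the positive factor 1/(2\sigma^2):
\hat{w}_{\mathrm{MLE}} = \arg\min_w \lVert y - Xw \rVert_2^2
                       = (X^\top X)^{-1} X^\top y \quad \text{(OLS, assuming } X^\top X \text{ invertible)}
% With a prior w \sim \mathcal{N}(0, \tau^2 I), the log-posterior adds
% -\lVert w \rVert_2^2 / (2\tau^2), so the MAP estimate is ridge regression:
\hat{w}_{\mathrm{MAP}} = \arg\min_w \lVert y - Xw \rVert_2^2 + \lambda \lVert w \rVert_2^2,
\qquad \lambda = \sigma^2 / \tau^2
```

Unbiasedness of the OLS estimator additionally requires that the model is correctly specified and the noise has zero mean conditional on X; Gaussianity is only needed for the MLE interpretation, not for unbiasedness.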
Hard · Technical
As a senior AI Engineer, how would you mentor a junior engineer whose logistic regression training diverges (loss explodes)? Provide a step-by-step coaching and debugging plan including tests to write, things to check in the code base (learning rate, feature scaling, label issues), and how you would teach them diagnostics for numerical stability and data quality.
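One concrete test worth teaching as part of such a plan is a finite-difference gradient check, which separates "the math is wrong" from "the learning rate or data is wrong". The sketch below uses hypothetical helper names and assumes labels in {-1, +1} with the numerically stable log(1 + exp(·)) formulation:

```python
import numpy as np

def logistic_loss(w, X, y):
    # Mean binary cross-entropy for labels y in {-1, +1}:
    # log(1 + exp(-y * (X @ w))), computed stably via logaddexp.
    z = X @ w
    return np.mean(np.logaddexp(0.0, -y * z))

def analytic_grad(w, X, y):
    # d/dw of the loss above: -(1/n) * sum_i y_i * x_i * sigmoid(-y_i z_i).
    z = X @ w
    return -(X.T @ (y / (1.0 + np.exp(y * z)))) / len(y)

def numeric_grad(w, X, y, eps=1e-6):
    # Central finite differences, one coordinate at a time.
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (logistic_loss(w + e, X, y) - logistic_loss(w - e, X, y)) / (2 * eps)
    return g
```

If the two gradients agree to ~1e-5 but training still diverges, the coaching conversation moves to learning rate, feature scaling, and label encoding rather than the gradient code.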
Easy · Technical
Implement a numerically stable sigmoid function in Python using numpy that avoids overflow for large positive or negative inputs. Include a short test with values like [-1000, -10, 0, 10, 1000] and explain why your implementation is stable.
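A possible answer (one common approach, not the only one) splits the input by sign so that exp is only ever called on non-positive arguments, which can underflow to 0 but never overflow:

```python
import numpy as np

def stable_sigmoid(x):
    """Sigmoid that never evaluates exp on a large positive argument."""
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    pos = x >= 0
    # For x >= 0: exp(-x) <= 1, so no overflow.
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    # For x < 0: rewrite as exp(x) / (1 + exp(x)); exp(x) < 1, so no overflow.
    ex = np.exp(x[~pos])
    out[~pos] = ex / (1.0 + ex)
    return out
```

The naive `1 / (1 + np.exp(-x))` overflows for x around -1000 (exp of ~1000 exceeds float64 range), triggering runtime warnings and possible NaNs downstream; the two-branch form only ever underflows, which rounds harmlessly to 0 or 1.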
