InterviewStack.io

Data Processing and Transformation Questions

Focuses on algorithmic and engineering approaches to transforming and cleaning data at scale. Topics include deduplication strategies, parsing and normalizing unstructured or semi-structured data, handling missing or inconsistent values, incremental and chunked processing for large datasets, batch versus streaming trade-offs, state management, efficient memory and compute usage, idempotency and error handling, and techniques for scaling and parallelizing transformation pipelines. Interviewers may assess problem-solving, choice of algorithms and data structures, and pragmatic design for reliability and performance.

Medium · Technical
Compare micro-batching and true streaming approaches for processing time-series sensor data with per-minute SLAs for alerts. Discuss latency, throughput, state complexity, operational overhead, and when each approach makes sense.
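When working through this question, a minimal sketch of the micro-batching side can anchor the discussion: events are grouped into fixed time windows and aggregates are emitted once per window, trading per-event latency for simpler state. All names (`micro_batch_alerts`, the threshold-based alert rule) are illustrative, not from the question itself, and a production system would also need watermarking for late data.

```python
from collections import defaultdict

def micro_batch_alerts(events, window_s=60, threshold=100.0):
    """Group (epoch_seconds, sensor_id, value) events into fixed windows
    and flag any sensor whose windowed mean exceeds the threshold.

    Assumes events arrive roughly in time order; a true streaming engine
    would instead update per-sensor state on every event and could alert
    mid-window, at the cost of more complex state management.
    """
    windows = defaultdict(list)  # window_start -> list of (sensor_id, value)
    for ts, sensor_id, value in events:
        window_start = ts - (ts % window_s)
        windows[window_start].append((sensor_id, value))

    alerts = []
    for window_start in sorted(windows):
        per_sensor = defaultdict(list)
        for sensor_id, value in windows[window_start]:
            per_sensor[sensor_id].append(value)
        for sensor_id, values in per_sensor.items():
            mean = sum(values) / len(values)
            if mean > threshold:
                alerts.append((window_start, sensor_id, mean))
    return alerts
```

With a one-minute window, worst-case alert latency is roughly the window length plus processing time, which is the key number to check against a per-minute SLA.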
Easy · Technical
Explain the difference between ETL and ELT in the context of data processing and transformation for analytics. Describe typical use cases, the trade-offs in where schema enforcement and compute happen, the performance and cost implications, and how you would choose between ETL and ELT when ingesting transactional and event data at varying volumes.
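The core distinction can be shown in a few lines: ETL transforms in the pipeline process before loading, while ELT loads raw data and transforms inside the warehouse engine. This is a hedged sketch only; the table names and the cents conversion are invented for illustration, and SQLite merely stands in for a warehouse.

```python
import sqlite3

def etl_load(conn, raw_rows):
    """ETL: clean and transform in the pipeline process, load only the result.
    Schema enforcement happens before data reaches the store."""
    cleaned = [(r["id"], r["amount"] * 100)
               for r in raw_rows if r["amount"] is not None]
    conn.executemany("INSERT INTO cents(id, amount_cents) VALUES (?, ?)", cleaned)

def elt_load(conn, raw_rows):
    """ELT: load raw rows untouched, then transform with the warehouse's
    own compute (SQL). Raw data stays available for re-transformation."""
    conn.executemany("INSERT INTO raw(id, amount) VALUES (?, ?)",
                     [(r["id"], r["amount"]) for r in raw_rows])
    conn.execute("""
        INSERT INTO cents(id, amount_cents)
        SELECT id, CAST(amount * 100 AS INTEGER)
        FROM raw WHERE amount IS NOT NULL
    """)
```

The ELT path keeps the raw table around, which is the usual argument for ELT at high event volumes: transformations can be revised and re-run without re-ingesting.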
Medium · Technical
You need to process a single 500 GB JSON file containing nested records on a machine with 1 GB memory and write a normalized CSV. Describe an algorithm and implementation approach (in Python) that processes the file in chunks, parses nested JSON safely, preserves order, and avoids OOM. Include how to handle malformed JSON lines.
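A minimal sketch of one workable answer, assuming the file is in JSON Lines format (one object per line); the function and field names here are illustrative. Memory stays bounded because Python file iteration is lazy, order is preserved by sequential reads, and malformed lines are quarantined instead of aborting the run.

```python
import csv
import json

def flatten(record, parent_key="", sep="."):
    """Flatten nested dicts into dot-separated column names."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

def jsonl_to_csv(in_path, out_path, fieldnames, err_path="malformed.jsonl"):
    """Stream a JSON Lines file to CSV one record at a time.

    Only one line is held in memory at once, so peak memory is bounded by
    the largest single record, not the 500 GB file size.
    """
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst, \
         open(err_path, "w", encoding="utf-8") as errs:
        writer = csv.DictWriter(dst, fieldnames=fieldnames,
                                extrasaction="ignore")
        writer.writeheader()
        for line in src:  # lazy iteration: one line at a time
            line = line.strip()
            if not line:
                continue
            try:
                record = json.loads(line)
            except json.JSONDecodeError:
                errs.write(line + "\n")  # quarantine for later inspection
                continue
            writer.writerow(flatten(record))
```

If the 500 GB file were instead a single giant JSON array, a streaming parser (e.g. an incremental/event-based one) would be needed in place of line-by-line `json.loads`.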
Medium · Technical
Write a SQL query that flags outlier transactions per user where amount > mean + 3*stddev over that user's past 365 days. Given table transactions(transaction_id, user_id, amount numeric, occurred_at timestamp), assume sufficient historical data and prefer window functions. Explain performance considerations for large tables.
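One way to prototype the window-function approach is against SQLite (≥ 3.28 for RANGE frames with expression bounds). A caveat in this sketch: stock SQLite has no `stddev` or `sqrt`, so variance is computed in SQL as `E[x^2] - E[x]^2` and the square root is taken in Python; timestamps are assumed to be stored as Unix epoch seconds so the 365-day frame can be expressed numerically. A warehouse with native `STDDEV_SAMP` would do this entirely in SQL.

```python
import math
import sqlite3

# Schema follows the question: transactions(transaction_id, user_id,
# amount, occurred_at). 31536000 s = 365 days; the frame ends at
# "1 PRECEDING" so the current transaction is excluded from its own stats.
QUERY = """
WITH stats AS (
    SELECT
        transaction_id,
        user_id,
        amount,
        AVG(amount)          OVER w AS mean_amt,
        AVG(amount * amount) OVER w AS mean_sq
    FROM transactions
    WINDOW w AS (
        PARTITION BY user_id
        ORDER BY occurred_at
        RANGE BETWEEN 31536000 PRECEDING AND 1 PRECEDING
    )
)
SELECT transaction_id, user_id, amount, mean_amt,
       mean_sq - mean_amt * mean_amt AS variance
FROM stats
WHERE mean_amt IS NOT NULL
"""

def flag_outliers(conn):
    """Return transaction_ids whose amount exceeds mean + 3*stddev of the
    same user's prior-365-day window."""
    flagged = []
    for txn_id, user_id, amount, mean_amt, variance in conn.execute(QUERY):
        stddev = math.sqrt(max(variance, 0.0))  # clamp tiny negative FP error
        if amount > mean_amt + 3 * stddev:
            flagged.append(txn_id)
    return flagged
```

On large tables the window sort dominates: an index or clustering on `(user_id, occurred_at)` lets the engine avoid a full sort per partition, and pre-aggregated daily per-user stats can replace the rolling window entirely.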
Easy · Technical
Describe the essential logging and metrics you would instrument in a production data transformation pipeline to support debugging and SLOs. Include observability for input volume, processing latency, error rates, data quality metrics like null-rate and distinct-count, and examples of alerts you would set.
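The metrics named in the question can be sketched as a small in-process collector; in production these counters would be exported to a metrics backend (Prometheus, StatsD, etc.), and the metric names here are illustrative rather than any standard.

```python
import time
from collections import Counter

class PipelineMetrics:
    """Minimal in-process stand-in for pipeline observability counters:
    input volume, error rate, per-column null-rate, and elapsed time."""

    def __init__(self):
        self.counters = Counter()
        self.null_counts = Counter()   # per-column null tallies
        self.started = time.monotonic()

    def observe(self, record, required_fields):
        """Count one input record and tally nulls in required fields."""
        self.counters["records_in"] += 1
        for field in required_fields:
            if record.get(field) is None:
                self.null_counts[field] += 1

    def record_error(self):
        self.counters["records_failed"] += 1

    def snapshot(self):
        """Derived metrics suitable for dashboards and alert thresholds."""
        total = self.counters["records_in"] or 1
        return {
            "records_in": self.counters["records_in"],
            "records_failed": self.counters["records_failed"],
            "error_rate": self.counters["records_failed"] / total,
            "null_rate": {f: n / total for f, n in self.null_counts.items()},
            "elapsed_s": time.monotonic() - self.started,
        }
```

Alerts would then fire on thresholds over these series, e.g. error_rate above a few percent, a null_rate jump on a key column, or records_in dropping to zero for a scheduled run.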
