InterviewStack.io

Python Fundamentals and Problem Solving Questions

Comprehensive knowledge of the Python programming language, idiomatic usage, and the ability to implement correct, readable, and testable solutions to coding problems.

Core language elements include syntax and semantics; primitive and composite data types such as integers, floats, strings, lists, dictionaries, sets, and tuples; sequence and mapping operations; control flow constructs; functions and closures; and object-oriented programming basics including classes, instances, inheritance, and special methods. Additional practical topics include error and exception handling, file input and output, comprehensions and generator expressions, generator functions and iteration protocols, context managers, lambda functions, unpacking, and common standard-library utilities.

Candidates should understand algorithmic time and space complexity for common operations, typical performance characteristics of lists and dictionaries, and common pitfalls such as mutable default arguments and shared mutable state. Interview-focused expectations include writing clean, correct code without editor assistance, sensible variable naming, implementing basic algorithms and data-structure manipulations under time constraints, reasoning about trade-offs and complexity, and demonstrating testability and code quality.

Hard · Technical
60 practiced
Explain the mechanics of Python's memory management: reference counting and the generational garbage collector. For a long-running ETL job creating many short-lived cyclic structures, how would you detect leaks and mitigate them? Include examples using the `gc` module and common fixes.
Medium · Technical
47 practiced
Explain the Global Interpreter Lock (GIL) in CPython. For data-engineering workloads, contrast using `threading`, `multiprocessing`, and `asyncio` for IO-bound and CPU-bound tasks. Provide practical recommendations for which to choose in ETL jobs and why.
Medium · Technical
47 practiced
Using generator composition, implement a simple lazy pipeline that reads a stream of records, filters by a predicate, maps them via a transform function, and finally computes an aggregate. Your implementation should demonstrate backpressure-friendly behavior and not materialize intermediate results.
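One possible shape of an answer (the stage names and record schema below are assumptions for illustration): each stage is a generator that pulls one record at a time from the previous stage, so the aggregate drives the whole pipeline and no intermediate list is ever materialized.

```python
def read_records(source):
    """Stage 1: pull records lazily from any iterable source."""
    for record in source:
        yield record


def filter_records(records, predicate):
    """Stage 2: pass through only records matching the predicate."""
    for record in records:
        if predicate(record):
            yield record


def map_records(records, transform):
    """Stage 3: apply a transform to each surviving record."""
    for record in records:
        yield transform(record)


def aggregate(records, key):
    """Terminal stage: consume the pipeline one record at a time."""
    total = 0
    for record in records:
        total += record[key]
    return total


# Compose the stages; nothing executes until `aggregate` starts pulling,
# so memory stays O(1) regardless of how many records the source yields.
source = ({"id": i, "amount": i * 10} for i in range(10))
pipeline = map_records(
    filter_records(read_records(source), lambda r: r["id"] % 2 == 0),
    lambda r: {**r, "amount": r["amount"] * 2},
)
total = aggregate(pipeline, "amount")
print(total)
```

Because each `yield` hands control back to the consumer, a slow aggregate naturally throttles the reader, which is the backpressure-friendly behavior the question asks about.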
Hard · Technical
95 practiced
When parallelizing ETL transformations with `multiprocessing.Pool`, what pitfalls should you watch for regarding pickling, memory duplication due to copy-on-write, and process startup overhead? Propose strategies to reduce memory usage and improve throughput while keeping code maintainable.
Easy · Technical
57 practiced
Write a Python generator function `read_csv_stream(path: str)` that yields each row as a dictionary mapping headers to values, using the built-in `csv` module. The function must stream rows rather than load the whole file, handle missing values gracefully, and be robust for very large files.
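One possible implementation sketch: `csv.DictReader` already streams line by line, so the generator only needs to normalize the `None` that `DictReader` substitutes for missing trailing fields. The demo file contents below are made up for illustration.

```python
import csv
from typing import Dict, Iterator


def read_csv_stream(path: str) -> Iterator[Dict[str, str]]:
    """Yield one row at a time as a header->value dict, never loading the file."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # DictReader fills short rows with None; normalize to "" so
            # downstream code can treat every value as a string.
            yield {k: (v if v is not None else "") for k, v in row.items()}


# Demo: write a tiny file and stream it back (illustrative only).
import os
import tempfile

fd, tmp = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w", newline="") as f:
    f.write("name,age\nalice,30\nbob\n")  # second data row is missing `age`
rows = list(read_csv_stream(tmp))
os.remove(tmp)
print(rows)
```

Because the `with open(...)` block wraps a generator, the file stays open only while rows are being consumed and closes automatically when the generator is exhausted or garbage-collected.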
