Data Science & Analytics Topics
Statistical analysis, data analytics, big data technologies, and data visualization. Covers statistical methods, exploratory analysis, and data storytelling.
Data Storytelling and Insight Communication
Skills for converting quantitative and qualitative analysis into a clear, persuasive narrative that guides stakeholders from findings to action. This includes leading with the headline insight, defining the business question, selecting the most relevant metrics and visual evidence, and structuring a concise story that explains what happened, why it happened, and what the recommended next steps are. Candidates should demonstrate tailoring language and technical depth for diverse audiences, from engineers to product managers to executives; summarizing trade-offs and uncertainty in plain language; distinguishing correlation from causation; proposing follow-up experiments or investigations; and producing concise executive summaries and status reports at an appropriate cadence. Interviewers evaluate the ability to persuade and align cross-functional partners, answer questions about data validity and methodology, synthesize qualitative signals with quantitative results, and adapt presentation format and level of detail to the decision maker.
Developer Platform Analytics
Metrics and analytics specific to developer platforms and technical products, focusing on adoption and developer success. Topics include measuring developer adoption and growth; engagement metrics such as API calls and feature-usage frequency; depth of usage across endpoints; expansion and monetization signals; cohort analysis by onboarding date; developer time to productivity; error rates and reliability indicators; developer-satisfaction proxies; instrumentation strategies tailored to developer workflows; and how platform metrics differ from conventional software-as-a-service (SaaS) metrics. Candidates should explain how they define success for developer users and how they instrument and analyze data to drive developer onboarding and retention.
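For illustration, cohort analysis of developer time to productivity can be sketched in plain Python; all developer IDs, dates, and event definitions below are hypothetical, not drawn from any real platform:

```python
from datetime import date

# Hypothetical event log: (developer_id, signup_date, date of first successful
# API call). A missing date means the developer never reached a successful call.
developers = [
    ("dev_a", date(2024, 1, 3), date(2024, 1, 4)),
    ("dev_b", date(2024, 1, 10), date(2024, 1, 17)),
    ("dev_c", date(2024, 2, 1), date(2024, 2, 2)),
    ("dev_d", date(2024, 2, 5), None),
]

def time_to_productivity_by_cohort(rows):
    """Median days from signup to first successful API call, per signup month.

    Developers who never succeed are excluded from the median but still
    counted, so an activation rate can be reported alongside latency.
    """
    cohorts = {}
    for dev_id, signup, first_call in rows:
        key = (signup.year, signup.month)
        cohorts.setdefault(key, {"days": [], "total": 0})
        cohorts[key]["total"] += 1
        if first_call is not None:
            cohorts[key]["days"].append((first_call - signup).days)
    report = {}
    for key, c in sorted(cohorts.items()):
        days = sorted(c["days"])
        median = days[len(days) // 2] if days else None
        report[key] = {
            "activation_rate": len(days) / c["total"],
            "median_days_to_first_call": median,
        }
    return report

report = time_to_productivity_by_cohort(developers)
```

Grouping by onboarding month keeps cohorts comparable over time; the same structure extends to depth-of-usage or reliability metrics per cohort.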
Data Investigation and Root Cause Analysis
Techniques and a structured process for diagnosing metric changes and anomalies using quantitative evidence complemented by qualitative signals. Candidates should demonstrate how to validate that an observed change is a real signal rather than noise or a reporting or instrumentation problem by checking data quality, event counts, sampling, and pipeline integrity. They should describe slicing and decomposition strategies such as cohort segmentation, geography and platform segmentation, feature-level analysis, time-series decomposition to separate trend and seasonality, funnel and velocity analysis, retention analysis, and variance analysis. They should explain how to form, prioritize, and test hypotheses; design diagnostic queries and tests in SQL; and correlate metric changes with product releases, experiments, marketing activity, or external events. This includes combining quantitative findings with qualitative research such as user interviews, session replay, logs, and support tickets to strengthen causal inference. Finally, the topic covers communicating concise findings and actionable recommendations to stakeholders, creating reproducible queries and monitoring dashboards or alerts, and mentoring junior analysts in a systematic investigation approach.
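The slicing-and-decomposition idea can be sketched as a simple segment-contribution analysis that attributes an overall metric drop to individual segments; the segment names and numbers below are invented for illustration:

```python
# Hypothetical week-over-week conversion data per platform segment:
# segment -> (conversions, sessions). All figures are made up.
last_week = {"ios": (900, 10000), "android": (800, 10000), "web": (500, 5000)}
this_week = {"ios": (880, 10000), "android": (400, 10000), "web": (490, 5000)}

def decompose_change(before, after):
    """Attribute the change in an overall rate metric to each segment.

    Each segment's contribution is its change in conversions relative to the
    total 'before' denominator, so contributions sum exactly to the overall
    delta (assuming denominators are unchanged, as in this toy example).
    """
    total_before = sum(n for _, n in before.values())
    overall_before = sum(c for c, _ in before.values()) / total_before
    overall_after = sum(c for c, _ in after.values()) / total_before
    contributions = {}
    for seg in before:
        delta_conversions = after[seg][0] - before[seg][0]
        contributions[seg] = delta_conversions / total_before
    return overall_after - overall_before, contributions

delta, contrib = decompose_change(last_week, this_week)
# The segment with the largest negative contribution is where to dig next
# (release notes, experiment assignments, instrumentation changes, logs).
worst_segment = min(contrib, key=contrib.get)
```

Because contributions sum to the overall delta, this check also confirms no segment is missing from the decomposition, a common instrumentation pitfall.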
Funnel and Cohort Analysis
Focuses on product analytics for acquisition, activation, retention, and monetization. Topics include defining and instrumenting funnels and conversion events, building cohorts by acquisition date or behavior, computing retention and churn curves, measuring time to first value, segmenting by user or device attributes, diagnosing drop-off points with event-level analysis, running hypothesis-driven experiments and split tests, calculating lifetime value and payback periods, and presenting clear actions based on analytic findings. Candidates should be able to explain required instrumentation and data quality checks, and how to convert analytic insights into prioritized product or growth experiments.
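A minimal funnel computation from a raw event log might look like the following sketch; the step names, users, and events are illustrative only:

```python
# Hypothetical raw event log: (user_id, event_name).
events = [
    ("u1", "signup"), ("u1", "activate"), ("u1", "purchase"),
    ("u2", "signup"), ("u2", "activate"),
    ("u3", "signup"),
    ("u4", "signup"), ("u4", "activate"),
]
funnel_steps = ["signup", "activate", "purchase"]

def funnel_conversion(event_log, steps):
    """Count users reaching each step and step-to-step conversion rates.

    A user counts for a step only if they completed every earlier step,
    which is how drop-off points are localized to a specific transition.
    """
    by_user = {}
    for user, event in event_log:
        by_user.setdefault(user, set()).add(event)
    counts, rates = [], []
    for i, step in enumerate(steps):
        reached = sum(
            1 for evs in by_user.values() if all(s in evs for s in steps[: i + 1])
        )
        counts.append(reached)
        rates.append(reached / counts[i - 1] if i > 0 and counts[i - 1] else 1.0)
    return counts, rates

counts, rates = funnel_conversion(events, funnel_steps)
```

The step with the lowest step-to-step rate marks the drop-off point to investigate; segmenting the same computation by user or device attributes narrows the diagnosis further.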
Data-Driven Recommendations and Impact
Covers the end-to-end practice of using quantitative and qualitative evidence to identify opportunities, form actionable recommendations, and measure business impact. Topics include problem framing; identifying and instrumenting relevant metrics and key performance indicators; measurement design and diagnostics; experiment design such as A/B tests and pilots; and basic causal-inference considerations, including distinguishing correlation from causation and handling limited or noisy data. Candidates should be able to translate analysis into clear recommendations by quantifying expected impacts and costs, stating key assumptions, presenting trade-offs between alternatives, defining success criteria and timelines, and proposing decision rules and go/no-go criteria. This also covers risk identification and mitigation plans; prioritization frameworks that weigh impact, effort, and strategic alignment; building dashboards and visualizations to surface signals across HR, sales, operations, and product; communicating concise executive-level recommendations with data-backed rationale; and designing follow-up monitoring to measure adoption and downstream outcomes and iterate on the solution.
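Quantifying expected impact with explicit assumptions and a go/no-go rule can be sketched as follows; every number and field name here is a hypothetical assumption, stated up front so stakeholders can challenge it:

```python
# All assumptions are invented for illustration; in practice each would come
# from instrumentation, benchmarks, or a pilot.
assumptions = {
    "monthly_sessions": 200_000,
    "baseline_conversion": 0.030,
    "expected_lift_low": 0.02,   # relative lift, pessimistic case
    "expected_lift_high": 0.08,  # relative lift, optimistic case
    "revenue_per_conversion": 45.0,
    "build_cost": 30_000.0,
}

def impact_range(a, horizon_months=12):
    """Expected incremental revenue range over the horizon, net of cost.

    Returning a low/high pair keeps uncertainty explicit rather than
    presenting a single point estimate as if it were certain.
    """
    base_conversions = a["monthly_sessions"] * a["baseline_conversion"]
    results = {}
    for label in ("low", "high"):
        extra = base_conversions * a[f"expected_lift_{label}"]
        revenue = extra * a["revenue_per_conversion"] * horizon_months
        results[label] = revenue - a["build_cost"]
    return results

net = impact_range(assumptions)
# A simple decision rule: proceed only if even the pessimistic case breaks even.
go = net["low"] > 0
```

Tying the go/no-go rule to the pessimistic bound is one way to encode risk tolerance; a pilot or A/B test would then replace the assumed lifts with measured ones.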
Measurement Design and Analysis
Practical measurement design and analytic techniques for producing reliable metric signals and proving impact. Includes instrumentation and tracking plans, experiment selection and validation, attribution modeling and its limitations, sample-size and statistical considerations, identifying confounding variables, and reasoning about correlation versus causation. Also covers trade-offs in data collection and data quality checks, cohort and segmentation design, baselining and threshold setting, designing dashboards and monitoring cadence, and connecting engineering and telemetry data to business outcomes. Candidates should be able to write clear measurement plans and success criteria, describe experiment and validation approaches, and explain how to operationalize results through reporting and iteration.
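Sample-size reasoning for a two-proportion experiment can be sketched with the standard normal approximation; the baseline rate and minimum detectable lift below are illustrative choices, not recommendations:

```python
import math

def sample_size_per_arm(p_base, mde_rel, alpha=0.05, power=0.8):
    """Approximate users needed per arm for a two-proportion test.

    Uses the standard normal-approximation formula: p_base is the baseline
    conversion rate and mde_rel the minimum detectable relative lift.
    """
    p_treat = p_base * (1 + mde_rel)
    # Hard-coded z-scores for common alpha/power values to stay stdlib-only;
    # a stats library would compute these from the normal quantile function.
    z_alpha = {0.05: 1.959964, 0.01: 2.575829}[alpha]
    z_beta = {0.8: 0.841621, 0.9: 1.281552}[power]
    variance = p_base * (1 - p_base) + p_treat * (1 - p_treat)
    n = (z_alpha + z_beta) ** 2 * variance / (p_treat - p_base) ** 2
    return math.ceil(n)

# Detecting a 10% relative lift on a 5% baseline needs roughly 31k users/arm.
n = sample_size_per_arm(p_base=0.05, mde_rel=0.10)
```

The quadratic dependence on the detectable lift is the key trade-off: halving the minimum detectable effect roughly quadruples the required sample, which drives experiment duration and dashboard monitoring cadence.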