❄️ Snowflake · Section 1 of 7

3-Day Snowflake Interview Prep

WHY SNOWFLAKE MATTERS FOR YOUR INTERVIEW

  • Amadeus JD mentions "BigQuery" and cloud data warehouses — Snowflake is the leading dedicated cloud data warehouse
  • Interviewers often ask: "Have you worked with Snowflake? How does it compare to Databricks?"
  • Even if Amadeus uses Databricks, knowing Snowflake shows breadth of data platform knowledge
  • Many companies use BOTH — Databricks for ETL/ML + Snowflake for analytics/BI

3-DAY SCHEDULE

🗺️Memory Map
DAY 1 · 5-6 hours · ARCHITECTURE & CORE CONCEPTS
Snowflake Architecture (3 layers: Storage, Compute, Cloud Services)
Micro-Partitions & Data Clustering
Virtual Warehouses (sizing, scaling, multi-cluster)
Caching (3 levels: Result, Local Disk, Remote Disk)
Time Travel & Fail-safe
Data Types (VARIANT, ARRAY, OBJECT, structured types)
Semi-Structured Data (FLATTEN, LATERAL, JSON/Parquet)
Snowflake vs Databricks (CRITICAL comparison question)
Stages (Internal, External, Named)
DAY 2 · 5-6 hours · DATA LOADING, PIPELINES & PERFORMANCE
COPY INTO (bulk loading — options, error handling)
Snowpipe (auto-ingest, REST API, Snowpipe Streaming)
Streams (CDC within Snowflake — Standard, Append-only)
Tasks (scheduled SQL, task trees, DAGs)
Dynamic Tables (auto-refreshing — replaces streams+tasks)
Snowpark (Python/Java/Scala on Snowflake — DataFrame API)
Performance Tuning (clustering keys, search optimization, query profiling)
Materialized Views
Query Optimization (pruning, pushdown, spilling)
Scenario: Design an ELT pipeline in Snowflake
DAY 3 · 5-6 hours · SECURITY, SHARING, COST & NEW FEATURES
RBAC (roles hierarchy, system roles, custom roles)
Data Masking (dynamic, static masking policies)
Row Access Policies (row-level security)
Network Policies & Private Link
Secure Data Sharing (shares, reader accounts, data clean rooms)
Snowflake Marketplace (data exchange)
Cost Management (warehouse sizing, auto-suspend, resource monitors)
NEW 2025-2026: Cortex AI, Iceberg Tables, Polaris Catalog
NEW 2025-2026: Gen 2 Warehouses, Snowpark Container Services
NEW 2025-2026: Unistore (Hybrid Tables), Native dbt
Snowflake vs Databricks — Detailed Comparison
Mock Interview Questions (10 most likely)

PRIORITY MATRIX

MUST KNOW (Will definitely be asked — 60%)

  1. Snowflake 3-layer architecture (storage, compute, cloud services)
  2. Micro-partitions & clustering keys
  3. Virtual warehouse sizing & multi-cluster warehouses
  4. Time Travel & cloning (zero-copy clone)
  5. Snowpipe & COPY INTO — data loading patterns
  6. Semi-structured data (VARIANT, FLATTEN)
  7. Snowflake vs Databricks — the #1 comparison question
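Items 4 and 6 above are easy to demonstrate live in an interview. A minimal sketch (table and column names here are hypothetical, chosen for the travel/booking framing):

```sql
-- Zero-copy clone: metadata-only copy, no data is physically duplicated
CREATE TABLE bookings_dev CLONE bookings;

-- Time Travel: query the table as it looked one hour ago
SELECT COUNT(*) FROM bookings AT (OFFSET => -3600);

-- Semi-structured data: explode a JSON array stored in a VARIANT column
SELECT b.booking_id,
       seg.value:origin::STRING AS origin
FROM bookings b,
     LATERAL FLATTEN(input => b.raw_json:segments) seg;
```

Being able to write the `AT (OFFSET => ...)` clause and a `LATERAL FLATTEN` from memory is a strong signal for must-know items.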

SHOULD KNOW (High probability — 25%)

  1. Streams & Tasks (CDC + scheduling)
  2. Dynamic Tables (new way to do ELT)
  3. Caching (3 levels — result, local disk, remote disk)
  4. RBAC & role hierarchy (ACCOUNTADMIN, SYSADMIN, etc.)
  5. Data masking & row access policies
  6. Secure Data Sharing
  7. Cost management & resource monitors
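The Streams & Tasks pattern (item 1) is a common whiteboard question. A minimal sketch, assuming a raw/curated table pair (all object names hypothetical):

```sql
-- Stream captures row-level changes (CDC) on the source table
CREATE OR REPLACE STREAM bookings_stream ON TABLE bookings_raw;

-- Task runs on a schedule, but only when the stream has new data
CREATE OR REPLACE TASK merge_bookings
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('BOOKINGS_STREAM')
AS
  MERGE INTO bookings_curated c
  USING bookings_stream s
    ON c.booking_id = s.booking_id
  WHEN MATCHED THEN UPDATE SET c.status = s.status
  WHEN NOT MATCHED THEN INSERT (booking_id, status)
                        VALUES (s.booking_id, s.status);

-- Tasks are created suspended; resume to start the schedule
ALTER TASK merge_bookings RESUME;
```

Mentioning that consuming the stream in a DML statement advances its offset (so changes are processed exactly once) usually earns extra credit.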

NICE TO KNOW (Differentiators — 15%)

  1. Snowpark (Python DataFrame API on Snowflake)
  2. Cortex AI (LLM functions in SQL)
  3. Iceberg Tables & Polaris Catalog
  4. Snowpark Container Services
  5. Gen 2 Warehouses (2.1x faster)
  6. Unistore / Hybrid Tables (OLTP on Snowflake)
  7. Native dbt integration
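Cortex AI (item 2) is the easiest differentiator to show concretely: LLM functions are called directly from SQL. A hedged one-liner (model name and `reviews` table are illustrative; check your account's available models):

```sql
-- Cortex LLM function invoked inline from SQL
SELECT SNOWFLAKE.CORTEX.COMPLETE(
         'mistral-large',
         'Summarize this complaint in one sentence: ' || review_text
       ) AS summary
FROM reviews;
```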

FILES STRUCTURE

┌──────┬────────────────────────────────┬───────────────────────┐
│ Day  │ Main File (Deep Questions)     │ Quick Recall File     │
├──────┼────────────────────────────────┼───────────────────────┤
│ Plan │ SF_00_INTERVIEW_PLAN.md        │ (none)                │
│ 1    │ SF_01_Architecture_Core.md     │ SF_01_Quick_Recall.md │
│ 2    │ SF_02_Pipelines_Performance.md │ SF_02_Quick_Recall.md │
│ 3    │ SF_03_Security_Sharing_New.md  │ SF_03_Quick_Recall.md │
└──────┴────────────────────────────────┴───────────────────────┘

Total files: 7 (1 plan + 3 main + 3 quick recall)

Each Main File will have:

  • 15-20 questions at all 3 levels (direct, mid-level, scenario-based)
  • Simple explanations with real-world analogies
  • Line-by-line commented code/SQL
  • Interview tips for each topic

Each Quick Recall File will have:

  • 🧠 Memory Maps (mnemonics, acronyms)
  • ⚡ Direct questions (one-liner flash cards)
  • 🔑 Mid-level questions (how/why/compare)
  • ⚠️ Common traps
  • Summary card for last-minute revision

LEARNING APPROACH

Same as Databricks prep:

  • WHAT IS IT? Simple 2-3 line explanation in plain English
  • WHY DO WE NEED IT? Real problem it solves (with a travel/booking example)
  • HOW DOES IT WORK? Technical details + SQL with comments on every line
  • WHEN TO USE / NOT USE? Practical decision guide
  • 🧠 INTERVIEW TIP → How to answer this confidently
  • MEMORY MAP → Mnemonic to never forget

SNOWFLAKE vs DATABRICKS — Quick Reference

📐 Comparison Table
┌──────────────────┬─────────────────────┬─────────────────────┐
│ Aspect           │ Snowflake           │ Databricks          │
├──────────────────┼─────────────────────┼─────────────────────┤
│ Core strength    │ Data Warehousing    │ Data Engineering/ML │
│ Language         │ SQL-first           │ Python/Scala-first  │
│ Storage format   │ Proprietary (micro) │ Delta Lake (open)   │
│ Compute          │ Virtual Warehouses  │ Spark Clusters      │
│ Semi-structured  │ VARIANT (native)    │ JSON in Delta cols  │
│ Data sharing     │ Secure Sharing      │ Delta Sharing       │
│ ML/AI            │ Cortex AI, Snowpark │ MLflow, MLlib       │
│ CDC              │ Streams             │ CDF (Change Feed)   │
│ ELT framework    │ Dynamic Tables      │ Lakeflow (DLT)      │
│ File format      │ Proprietary         │ Open (Parquet/Delta)│
│ Governance       │ Horizon Catalog     │ Unity Catalog       │
│ Pricing          │ Per-second credits  │ Per-second DBUs     │
│ Best for         │ SQL analytics/BI    │ Complex ETL/ML      │
└──────────────────┴─────────────────────┴─────────────────────┘

HOW TO USE

  1. Read SF_01/02/03 main files for deep understanding (questions + explanations + code)
  2. Read SF_01/02/03_Quick_Recall for memory maps and flash-card style review
  3. For last-minute: Read only Quick Recall Summary Cards (10 min per day)
  4. Always connect Snowflake to your Databricks knowledge: "I've used Databricks for X, and I know Snowflake handles this with Y"
  5. Frame with Amadeus: "In a travel data warehouse with billions of booking records..."

SOURCES (Research used for this plan)