
Integration with dbt and Airflow in Snowflake - Step-by-Step Execution

Process Flow - Integration with dbt and Airflow
Start Airflow DAG
Trigger dbt run command
dbt compiles and runs models
dbt loads data into Snowflake
Airflow checks dbt run status
If success -> continue DAG
If failure -> alert or retry
End DAG
Airflow starts a workflow that triggers dbt to build data models in Snowflake, then monitors success or failure to continue or alert.
Execution Sample
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

dag = DAG('dbt_snowflake', schedule_interval='@daily', start_date=datetime(2024, 1, 1))
run_dbt = BashOperator(task_id='run_dbt', bash_command='dbt run --profiles-dir ./profiles', dag=dag)
This Airflow DAG runs dbt commands daily to build models in Snowflake.
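The `--profiles-dir ./profiles` flag tells dbt where to find its connection profile. A minimal Snowflake profile might look like the sketch below; the profile name, credentials, and object names are placeholders for illustration, not values from this walkthrough:

```yaml
# profiles/profiles.yml -- hypothetical example; all values are placeholders
dbt_snowflake:
  target: prod
  outputs:
    prod:
      type: snowflake
      account: my_account        # Snowflake account identifier
      user: my_user
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"  # avoid hardcoding secrets
      role: TRANSFORMER
      database: ANALYTICS
      warehouse: TRANSFORM_WH
      schema: PUBLIC
      threads: 4
```

The top-level key must match the `profile:` entry in the project's dbt_project.yml, otherwise `dbt run` cannot resolve the connection.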
Process Table
| Step | Action | Command/Check | Result | Next Step |
|------|--------|---------------|--------|-----------|
| 1 | Start DAG | Airflow scheduler triggers DAG | DAG started | Trigger dbt run |
| 2 | Trigger dbt run | Execute 'dbt run' command | dbt starts compiling models | Wait for dbt completion |
| 3 | dbt compiles models | dbt parses SQL and YAML files | Models compiled successfully | Run models in Snowflake |
| 4 | Run models | dbt runs SQL in Snowflake | Data loaded into Snowflake tables | Check dbt run status |
| 5 | Check status | dbt run exit code | Success (0) | Continue DAG tasks |
| 6 | DAG continues | Next Airflow tasks execute | Workflow proceeds | End DAG |
| 7 | DAG ends | All tasks complete | DAG finished successfully | Stop |
💡 DAG ends after dbt run completes successfully and all tasks finish.
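Steps 2 through 6 hinge on the exit code of `dbt run`: 0 means continue the DAG, anything else means alert or retry. That check-and-retry logic can be sketched in plain Python with stand-in commands, so it runs without Airflow or dbt installed (the `run_dbt_with_retry` helper is hypothetical, not an Airflow or dbt API):

```python
import subprocess
import sys

def run_dbt_with_retry(cmd, max_retries=2):
    """Mimic steps 2-6: trigger the command, check its exit code,
    and retry on failure -- hypothetical helper for illustration."""
    for attempt in range(1, max_retries + 2):
        result = subprocess.run(cmd)
        if result.returncode == 0:   # step 5: exit code 0 -> continue DAG
            return "continue"
        print(f"attempt {attempt} failed (exit {result.returncode})")
    return "alert"                   # retries exhausted -> alert instead of continuing

# Stand-in commands so the sketch runs without dbt installed:
ok = [sys.executable, "-c", "pass"]                   # exits 0, like a clean dbt run
bad = [sys.executable, "-c", "raise SystemExit(1)"]   # non-zero exit, like a failed run
print(run_dbt_with_retry(ok))    # -> continue
print(run_dbt_with_retry(bad))   # -> alert
```

In real Airflow the same behavior falls out of the BashOperator, which fails the task on a non-zero exit code, combined with task-level retries and failure callbacks.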
Status Tracker
| Variable | Start | After Step 2 | After Step 4 | After Step 5 | Final |
|----------|-------|--------------|--------------|--------------|-------|
| DAG Status | Not started | Running | Running | Success | Finished |
| dbt Run Status | Not started | Running | Running | Success | Success |
| Snowflake Data | Old data | Old data | New data loaded | New data loaded | New data loaded |
Key Moments - 3 Insights
Why does Airflow wait after triggering the dbt run command?
Airflow waits to ensure dbt finishes compiling and running its models before moving on, as shown in steps 2 through 5 of the process table.
What happens if dbt run fails during execution?
If dbt run fails, Airflow detects a non-zero exit code at step 5 and can trigger alerts or retries instead of continuing the DAG.
How does dbt load data into Snowflake?
dbt runs SQL models that create or update tables in Snowflake, shown in step 4 where data is loaded into Snowflake tables.
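Step 4 works because dbt wraps each model's SELECT in DDL: with the default table materialization, Snowflake receives roughly `CREATE OR REPLACE TABLE <model> AS <select>`. The same idea can be sketched with Python's built-in sqlite3 standing in for Snowflake (the table, column names, and model are made up for illustration):

```python
import sqlite3

# In-memory database standing in for a Snowflake schema.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
con.executemany("INSERT INTO raw_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

# A model's SELECT, as it might appear in models/orders_summary.sql:
model_sql = "SELECT COUNT(*) AS order_count, SUM(amount) AS total FROM raw_orders"

# dbt's table materialization wraps the SELECT in CREATE TABLE ... AS (step 4):
con.execute(f"CREATE TABLE orders_summary AS {model_sql}")
print(con.execute("SELECT * FROM orders_summary").fetchone())  # -> (2, 35.5)
```

Incremental materializations follow the same pattern but issue MERGE/INSERT statements against an existing table instead of rebuilding it.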
Visual Quiz - 3 Questions
Test your understanding
Looking at the status tracker, what is the DAG Status after Step 4?
A. Running
B. Success
C. Not started
D. Failed
💡 Hint
Check the DAG Status row of the status tracker after Step 4.
At which step does Airflow check the dbt run exit code?
A. Step 3
B. Step 2
C. Step 5
D. Step 6
💡 Hint
Look at the 'Check status' action in the process table.
If dbt run fails, what would change in the process table?
A. Step 4 would not run
B. Step 5 result would be 'Failure' and the DAG would alert or retry
C. The DAG would skip dbt run
D. The DAG would finish successfully anyway
💡 Hint
Refer to the key moments on failure handling and step 5 of the process table.
Concept Snapshot
Integration with dbt and Airflow:
- Airflow DAG triggers dbt commands
- dbt compiles and runs models in Snowflake
- Airflow monitors dbt run status
- On success, DAG continues; on failure, alerts or retries
- Enables automated, scheduled data transformations
Full Transcript
This visual execution shows how Airflow and dbt work together to build data models in Snowflake. Airflow starts a DAG that runs the dbt command. dbt compiles SQL models and runs them in Snowflake, loading data into tables. Airflow waits for dbt to finish and checks the exit status. If successful, the DAG continues; if not, it can alert or retry. Variables like DAG Status and dbt Run Status change from not started to running to success as the workflow progresses. This integration automates data transformations on a schedule.