Challenge - 5 Problems
Azure Operators Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
💻 Command Output
Intermediate · 2:00
Output of Azure Data Factory Pipeline Run Operator
What is the output of the following Airflow task using AzureDataFactoryRunPipelineOperator when the pipeline run is successful?
Apache Airflow
from airflow.providers.microsoft.azure.operators.data_factory import AzureDataFactoryRunPipelineOperator
run_pipeline = AzureDataFactoryRunPipelineOperator(
task_id='run_pipeline',
pipeline_name='example_pipeline',
azure_data_factory_conn_id='azure_data_factory_default',
resource_group_name='my_resource_group',
factory_name='my_factory'
)
result = run_pipeline.execute({})
print(result)
💡 Hint
Think about what the operator returns after triggering a pipeline run.
📝 Explanation
The AzureDataFactoryRunPipelineOperator triggers a pipeline and returns a dictionary with details including the run ID and status. This helps track the pipeline execution.
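The explanation above says the operator reports run details such as a run ID and status. As a hedged sketch, here is how that returned dictionary could be consumed downstream; the exact keys (`run_id`, `status`) are assumptions based on the explanation, not a guaranteed return shape for every provider version.

```python
# Illustrative run details; key names are assumed from the explanation above.
result = {"run_id": "abc-123", "status": "Succeeded"}

def summarize_run(details: dict) -> str:
    # Pull out the two fields the explanation says the operator reports.
    return f"Run {details['run_id']} finished with status {details['status']}"

print(summarize_run(result))
```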
❓ Troubleshoot
Intermediate · 2:00
Troubleshooting Azure Blob Storage Hook Connection Error
You have an Airflow task using AzureBlobStorageHook but it fails with a connection error. Which of the following is the most likely cause?
Apache Airflow
hook = AzureBlobStorageHook(azure_blob_conn_id='azure_blob_default')
container_client = hook.get_container_client('mycontainer')
container_client.list_blobs()
💡 Hint
Check the connection configuration in Airflow UI.
📝 Explanation
A connection error usually means the connection ID is wrong, or the credentials stored under that ID in Airflow's connection settings are missing or incorrect.
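One common way to supply the connection the hook expects is an `AIRFLOW_CONN_<CONN_ID>` environment variable holding a connection URI. This is a minimal sketch using the connection ID from the question; the account name and key are placeholders, not real credentials.

```python
import os

# Airflow resolves connection IDs from AIRFLOW_CONN_* environment variables.
# "myaccount" and "placeholder-key" below are illustrative placeholders.
os.environ["AIRFLOW_CONN_AZURE_BLOB_DEFAULT"] = "wasb://myaccount:placeholder-key@"
print(os.environ["AIRFLOW_CONN_AZURE_BLOB_DEFAULT"])
```

If the variable (or the equivalent entry in the Airflow UI) is absent or malformed, `list_blobs()` fails before it ever reaches Azure.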
❓ Configuration
Advanced · 2:00
Configuring Azure Data Lake Storage Gen2 Hook with OAuth
Which configuration snippet correctly sets up AzureDataLakeStorageHook to authenticate using OAuth with a service principal?
💡 Hint
OAuth requires tenant ID, client ID, and client secret.
📝 Explanation
OAuth authentication for Azure Data Lake Storage Gen2 requires tenant ID, client ID, client secret, and specifying auth_type as 'OAuth'.
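As a sketch of the fields the explanation lists, here is a hypothetical connection definition for service-principal (OAuth) authentication. The field names (`tenant`, `auth_type`) follow the explanation above, and the IDs are illustrative placeholders; verify the exact schema against your provider version.

```python
import json

# Hypothetical Airflow connection fields for OAuth / service-principal auth.
adls_conn = {
    "conn_type": "azure_data_lake",
    "login": "my-client-id",          # service principal (application) ID
    "password": "my-client-secret",   # service principal secret
    "extra": json.dumps({"tenant": "my-tenant-id", "auth_type": "OAuth"}),
}
print(adls_conn["extra"])
```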
🔀 Workflow
Advanced · 2:00
Order of Tasks for Uploading File to Azure Blob Storage and Triggering Data Factory Pipeline
Given these Airflow tasks, which option shows the correct order to upload a file to Azure Blob Storage and then trigger an Azure Data Factory pipeline that processes the file?
💡 Hint
The file must be uploaded before the pipeline runs.
📝 Explanation
The workflow should start, upload the file, then trigger the pipeline, and finally end.
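The ordering above can be sketched with a toy stand-in for Airflow's `>>` chaining syntax. The `Task` class and task names here are hypothetical; in a real DAG these would be Airflow operators.

```python
class Task:
    """Toy stand-in for an Airflow operator, supporting >> chaining."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.downstream = []
    def __rshift__(self, other):
        # Record the dependency and return the right-hand task so chains work.
        self.downstream.append(other)
        return other

start = Task("start")
upload = Task("upload_file_to_blob")   # hypothetical task names
trigger = Task("run_adf_pipeline")
end = Task("end")

# The correct order: the file lands in Blob Storage before the pipeline runs.
start >> upload >> trigger >> end
print([t.task_id for t in upload.downstream])
```

Running this prints `['run_adf_pipeline']`, confirming the upload task sits immediately upstream of the pipeline trigger.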
✅ Best Practice
Expert · 2:00
Best Practice for Handling Secrets in Azure Operators within Airflow
Which option describes the best practice for managing Azure credentials securely when using Azure operators in Airflow?
💡 Hint
Think about security and secret management best practices.
📝 Explanation
Storing secrets in Airflow Connections and integrating with Azure Key Vault or environment variables keeps credentials secure and avoids exposure in code.
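The Key Vault integration the explanation mentions can be wired up through Airflow's secrets-backend configuration. This is a sketch under stated assumptions: the backend class path is from the Microsoft Azure provider, while the vault URL and prefix are placeholders.

```python
import json
import os

# Point Airflow at Azure Key Vault as a secrets backend via config env vars.
# The vault URL and connections prefix below are illustrative placeholders.
os.environ["AIRFLOW__SECRETS__BACKEND"] = (
    "airflow.providers.microsoft.azure.secrets.key_vault.AzureKeyVaultBackend"
)
os.environ["AIRFLOW__SECRETS__BACKEND_KWARGS"] = json.dumps({
    "connections_prefix": "airflow-connections",
    "vault_url": "https://my-vault.vault.azure.net/",
})
print(os.environ["AIRFLOW__SECRETS__BACKEND"])
```

With a secrets backend in place, connection IDs like `azure_data_factory_default` resolve from the vault, so no credentials appear in DAG code.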