Apache Airflow · DevOps · ~20 mins

Azure operators in Apache Airflow - Practice Problems & Coding Challenges

Challenge - 5 Problems
💻 Command Output (intermediate)
Output of Azure Data Factory Pipeline Run Operator
What is the output of the following Airflow task using AzureDataFactoryRunPipelineOperator when the pipeline run is successful?
Apache Airflow
run_pipeline = AzureDataFactoryRunPipelineOperator(
    task_id='run_pipeline',
    pipeline_name='example_pipeline',
    azure_data_factory_conn_id='azure_data_factory_default',
    resource_group_name='my_resource_group',
    factory_name='my_factory'
)

result = run_pipeline.execute({})
print(result)
A. Returns a dictionary containing the pipeline run ID and status 'Succeeded'
B. Returns None because the execute method does not return anything
C. Raises AirflowException due to missing pipeline parameters
D. Returns a string with the pipeline name only
💡 Hint
Think about what the operator returns after triggering a pipeline run.
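For context on what happens after the trigger: when the operator's `wait_for_termination` option is enabled, it polls Data Factory until the run reaches a terminal state. A minimal sketch of that polling loop, using a hypothetical `get_run_status` callable standing in for the Data Factory client (the real operator delegates this to `AzureDataFactoryHook`):

```python
import time

def poll_until_terminal(get_run_status, run_id, interval=0.01, timeout=1.0):
    """Poll a pipeline run until it reaches a terminal state."""
    terminal = {"Succeeded", "Failed", "Cancelled"}
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_run_status(run_id)
        if status in terminal:
            return status
        time.sleep(interval)
    raise TimeoutError(f"Run {run_id} did not finish in time")

# Demo with a fake client that reports success on the third poll:
statuses = iter(["Queued", "InProgress", "Succeeded"])
result = poll_until_terminal(lambda run_id: next(statuses), "run-123")
print(result)  # Succeeded
```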
Troubleshoot (intermediate)
Troubleshooting Azure Blob Storage Hook Connection Error
You have an Airflow task using WasbHook (the Azure Blob Storage hook in the Microsoft Azure provider), but it fails with a connection error. Which of the following is the most likely cause?
Apache Airflow
hook = WasbHook(wasb_conn_id='wasb_default')
blobs = hook.get_blobs_list(container_name='mycontainer')
A. The hook does not support listing blobs in a container
B. The container name 'mycontainer' does not exist in Azure Blob Storage
C. The Azure connection ID 'wasb_default' is not configured or has wrong credentials
D. The Airflow worker does not have internet access
💡 Hint
Check the connection configuration in Airflow UI.
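Besides the Airflow UI, a connection can also be supplied to Airflow as a URI string (for example through an environment variable). A minimal sketch of building such a URI for a Blob Storage connection; the account name and key below are placeholders, not real credentials:

```python
from urllib.parse import quote

def make_conn_uri(conn_type, login, password, extra=None):
    """Build an Airflow-style connection URI; the password is
    percent-encoded so special characters in the key survive parsing."""
    uri = f"{conn_type}://{quote(login, safe='')}:{quote(password, safe='')}@"
    if extra:
        pairs = "&".join(f"{k}={quote(v, safe='')}" for k, v in extra.items())
        uri += f"?{pairs}"
    return uri

# Placeholder credentials for illustration only:
uri = make_conn_uri("wasb", "myaccount", "se/cret")
print(uri)  # wasb://myaccount:se%2Fcret@
```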
Configuration (advanced)
Configuring Azure Data Lake Storage Gen2 Hook with OAuth
Which configuration snippet correctly sets up AzureDataLakeStorageHook to authenticate using OAuth with a service principal?
A
hook = AzureDataLakeStorageHook(
    account_name='storage-account',
    sas_token='sas-token'
)
B
hook = AzureDataLakeStorageHook(
    account_name='storage-account',
    auth_type='SharedKey',
    client_id='client-id'
)
C
hook = AzureDataLakeStorageHook(
    connection_string='DefaultEndpointsProtocol=https;AccountName=storage-account;AccountKey=key;EndpointSuffix=core.windows.net'
)
D
hook = AzureDataLakeStorageHook(
    tenant_id='tenant-id',
    client_id='client-id',
    client_secret='client-secret',
    account_name='storage-account',
    auth_type='OAuth'
)
💡 Hint
OAuth requires tenant ID, client ID, and client secret.
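As background on why exactly those three values are required: a service-principal (OAuth client-credentials) flow authenticates against the Azure AD token endpoint for the tenant, presenting the client ID and client secret. A sketch of how that token request is shaped (no request is actually sent here, and all IDs are placeholders):

```python
from urllib.parse import urlencode

def build_token_request(tenant_id, client_id, client_secret):
    """Shape of the AAD v2 client-credentials token request for Azure Storage."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://storage.azure.com/.default",
    })
    return url, body

# Placeholder IDs for illustration only:
url, body = build_token_request("my-tenant", "my-client", "my-secret")
print(url)
```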
🔀 Workflow (advanced)
Order of Tasks for Uploading File to Azure Blob Storage and Triggering Data Factory Pipeline
Given these Airflow tasks, which option shows the correct order to upload a file to Azure Blob Storage and then trigger an Azure Data Factory pipeline that processes the file?
A. 3, 1, 2, 4
B. 3, 2, 1, 4
C. 1, 3, 2, 4
D. 2, 1, 3, 4
💡 Hint
The file must be uploaded before the pipeline runs.
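Airflow derives execution order from the dependencies you declare (with the `>>` operator), not from the order tasks appear in the file. That scheduling rule can be sketched as a plain topological sort over a dependency map; the task names below are illustrative, not the ones from the question:

```python
from graphlib import TopologicalSorter

# Mapping of task -> set of upstream tasks it must wait for:
# the upload must complete before the pipeline run, which must
# complete before any post-processing check.
deps = {
    "run_pipeline": {"upload_file"},
    "check_output": {"run_pipeline"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['upload_file', 'run_pipeline', 'check_output']
```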
Best Practice (expert)
Best Practice for Handling Secrets in Azure Operators within Airflow
Which option describes the best practice for managing Azure credentials securely when using Azure operators in Airflow?
A. Hardcode client secrets directly in the DAG Python files for easy access
B. Store credentials in Airflow Connections and use environment variables or Azure Key Vault integration to avoid hardcoding secrets
C. Use plain text files on the Airflow server to store Azure credentials and read them in the DAG
D. Embed credentials in the operator parameters directly in the DAG code
💡 Hint
Think about security and secret management best practices.
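One concrete way to follow this practice is to inject the connection through the deployment environment instead of the DAG file: Airflow resolves a connection ID by also checking an environment variable named `AIRFLOW_CONN_` followed by the connection ID in upper case. A sketch of that name mapping (the connection ID is illustrative):

```python
import os

def conn_env_var(conn_id):
    """Environment variable Airflow checks for a given connection ID."""
    return f"AIRFLOW_CONN_{conn_id.upper()}"

name = conn_env_var("azure_data_factory_default")
print(name)  # AIRFLOW_CONN_AZURE_DATA_FACTORY_DEFAULT

# The secret itself then lives in the environment (or in a secrets
# backend such as Azure Key Vault), never in the DAG code:
secret_uri = os.environ.get(name)  # None unless set by the deployment
```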