Complete the code to import an Azure Blob Storage operator in Airflow.
from airflow.providers.microsoft.azure.operators.wasb_delete_blob import [1]
The correct import is WasbDeleteBlobOperator. In Airflow's Azure provider, Blob Storage appears under the legacy "wasb" (Windows Azure Storage Blob) name, and this operator deletes blobs from a container.
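With the blank filled in, a minimal task sketch might look like the following. The container name, blob name, and task ID are illustrative placeholders; `wasb_default` is the provider's default connection ID.

```python
from airflow.providers.microsoft.azure.operators.wasb_delete_blob import WasbDeleteBlobOperator

# Hypothetical container and blob names; in a real DAG this task would be
# declared inside a DAG context.
delete_blob = WasbDeleteBlobOperator(
    task_id="delete_old_blob",
    container_name="my-container",
    blob_name="stale/report.csv",
    wasb_conn_id="wasb_default",
)
```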
Complete the code to create an Azure Data Lake Storage upload task with the correct task ID.
upload_task = LocalFilesystemToADLSOperator(task_id='[1]', ...)
The task ID should describe the task's action. Since this operator uploads a local file to ADLS, upload_to_adls is the best choice.
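A completed sketch of the task, assuming hypothetical local and remote paths; `azure_data_lake_default` is the provider's default connection ID.

```python
from airflow.providers.microsoft.azure.transfers.local_to_adls import LocalFilesystemToADLSOperator

# Paths are placeholders; the task ID names the action being performed.
upload_task = LocalFilesystemToADLSOperator(
    task_id="upload_to_adls",
    local_path="/tmp/report.csv",
    remote_path="landing/report.csv",
    azure_data_lake_conn_id="azure_data_lake_default",
)
```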
Fix the error in the Azure Data Factory operator instantiation by completing the missing parameter.
adf_run = AzureDataFactoryRunPipelineOperator(pipeline_name='my_pipeline', [1]='my_resource_group', ...)
The correct parameter name is resource_group_name, as documented for Airflow's AzureDataFactoryRunPipelineOperator: the Azure resource group that contains the data factory must be identified explicitly.
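A filled-in sketch of the instantiation follows. The pipeline, resource group, and factory names are placeholders; `azure_data_factory_default` is the provider's default connection ID.

```python
from airflow.providers.microsoft.azure.operators.data_factory import AzureDataFactoryRunPipelineOperator

adf_run = AzureDataFactoryRunPipelineOperator(
    task_id="run_adf_pipeline",
    pipeline_name="my_pipeline",
    resource_group_name="my_resource_group",  # the parameter the question asks for
    factory_name="my_data_factory",           # placeholder factory name
    azure_data_factory_conn_id="azure_data_factory_default",
)
```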
Fill both blanks to create a dictionary comprehension that filters Azure Blob Storage blobs with size greater than 1000.
filtered_blobs = {blob.name: blob.size for blob in blobs if blob.size [1] [2]}
The condition filters blobs with size greater than 1000, so the operator is > and the value is 1000.
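The completed comprehension can be checked with plain Python. The `SimpleNamespace` objects below are stand-ins for the blob properties a real Azure Blob Storage listing would return; the names and sizes are invented for illustration.

```python
from types import SimpleNamespace

# Mock blob listing; real blobs would come from a container client's list call.
blobs = [
    SimpleNamespace(name="small.txt", size=500),
    SimpleNamespace(name="medium.csv", size=1500),
    SimpleNamespace(name="large.parquet", size=9000),
]

# Blanks filled in: the operator is > and the threshold is 1000.
filtered_blobs = {blob.name: blob.size for blob in blobs if blob.size > 1000}
print(filtered_blobs)  # {'medium.csv': 1500, 'large.parquet': 9000}
```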
Fill all three blanks to create a dictionary comprehension that maps container names to blob counts where blob count is greater than 10.
container_blob_counts = {container.name: [1] for container, [2] in containers.items() if [3] > 10}
All blanks relate to the variable blob_count, which holds the count of blobs per container. The comprehension filters containers with blob counts greater than 10.
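With all three blanks filled in as blob_count, the comprehension runs as plain Python. The container names and counts below are invented; a hashable `namedtuple` stands in for real container objects.

```python
from collections import namedtuple

Container = namedtuple("Container", ["name"])

# Mock mapping of container objects to their blob counts.
containers = {
    Container("logs"): 42,
    Container("images"): 7,
    Container("backups"): 11,
}

# All three blanks are blob_count; only containers with more than 10 blobs survive.
container_blob_counts = {
    container.name: blob_count
    for container, blob_count in containers.items()
    if blob_count > 10
}
print(container_blob_counts)  # {'logs': 42, 'backups': 11}
```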