Apache Airflow · DevOps · ~20 mins

Why cloud operators simplify infrastructure tasks in Apache Airflow - Challenge Your Understanding

Challenge - 5 Problems
🎖️
Cloud Operator Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
2:00 remaining
How do cloud operators reduce manual infrastructure management?

Cloud operators automate many infrastructure tasks. Which of the following best explains how they simplify these tasks?

A. They replace manual commands with automated workflows that run reliably without human intervention.
B. They require operators to write more manual scripts for each task to ensure control.
C. They increase the number of manual checks needed to maintain infrastructure health.
D. They remove all monitoring tools, so operators do not need to check system status.
Attempts: 2 left
💡 Hint

Think about how automation helps reduce repetitive work.

💻 Command Output
intermediate
2:00 remaining
Output of an Airflow task automating infrastructure setup

Given this Airflow DAG that provisions a server, what will the task print when it runs successfully?

Apache Airflow
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime
with DAG('infra_setup', start_date=datetime(2024, 1, 1), schedule_interval='@once') as dag:
    setup = BashOperator(
        task_id='provision_server',
        bash_command='echo Server provisioned successfully'
    )
A. Task failed due to missing bash command
B. Error: DAG not found
C. Server provisioned successfully
D. No output because the task is skipped
Attempts: 2 left
💡 Hint

Look at the bash_command in the BashOperator.

🔀 Workflow
advanced
3:00 remaining
Order of steps in a cloud operator's infrastructure automation workflow

Arrange the following steps in the correct order for a cloud operator automating infrastructure deployment using Airflow.

A. 1, 2, 3, 4
B. 1, 3, 2, 4
C. 2, 1, 3, 4
D. 3, 1, 2, 4
Attempts: 2 left
💡 Hint

Think about defining tasks before running them.

Troubleshoot
advanced
2:30 remaining
Identifying cause of Airflow task failure in infrastructure automation

An Airflow task that provisions cloud resources fails with the error: 'Permission denied'. What is the most likely cause?

A. The Airflow scheduler is not running.
B. The Airflow worker does not have the required cloud API permissions.
C. The task's bash command is missing from the operator.
D. The DAG file has a syntax error preventing execution.
Attempts: 2 left
💡 Hint

Consider what 'Permission denied' usually means in cloud operations.
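The triage logic behind this question can be sketched as a small helper. This is a hypothetical illustration, not part of Airflow's API: it only maps common error text to the most likely cause, the way an operator would when reading task logs.

```python
def diagnose(error_message: str) -> str:
    """Map common Airflow task errors to their most likely cause (illustrative only)."""
    msg = error_message.lower()
    if "permission denied" in msg:
        # Cloud APIs return this when the caller's identity lacks the required
        # role or policy -- here, the Airflow worker's cloud credentials.
        return "worker is missing the required cloud API permissions"
    if "dag not found" in msg:
        return "DAG file failed to parse or is not in the dags folder"
    return "unknown; check the task logs"

print(diagnose("Permission denied"))
# → worker is missing the required cloud API permissions
```

Note that a scheduler outage or a DAG syntax error would prevent the task from running at all, so neither would surface as a runtime 'Permission denied' from the cloud API.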

Best Practice
expert
3:00 remaining
Best practice for managing secrets in Airflow cloud operator workflows

Which approach is best for securely managing cloud credentials used by Airflow operators in infrastructure automation?

A. Hardcode credentials directly in the DAG Python files for easy access.
B. Email credentials to the team and paste them into the Airflow UI manually.
C. Save credentials in plain text files on the Airflow worker nodes.
D. Store credentials in Airflow variables encrypted with a secrets backend like HashiCorp Vault.
Attempts: 2 left
💡 Hint

Think about security and automation best practices.
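For reference, wiring a secrets backend into Airflow is a configuration change, not DAG code. A minimal sketch of the `[secrets]` section in `airflow.cfg`, assuming the `apache-airflow-providers-hashicorp` package is installed; the Vault URL, mount point, and paths below are placeholders for your own deployment:

```ini
[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"url": "http://127.0.0.1:8200", "connections_path": "connections", "variables_path": "variables", "mount_point": "airflow"}
```

With this in place, operators resolve connections and variables from Vault transparently, so no credentials appear in DAG files, worker filesystems, or the Airflow UI.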