Challenge - 5 Problems
Agent Autonomy Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate · 2:00 remaining
Understanding autonomy levels in agents
Which statement best describes the difference between an autonomous agent and a semi-autonomous agent?
Attempts: 2 left
💡 Hint
Think about how much freedom the agent has to act on its own.
✗ Incorrect
Autonomous agents act on their own without needing human input, while semi-autonomous agents still depend on humans for some decisions.
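The distinction can be sketched in a few lines of Python; the function names and the `needs_human` flag are illustrative assumptions, not part of the question:

```python
# Minimal sketch contrasting the two agent types (illustrative names only).

def autonomous_step(observation):
    # A fully autonomous agent maps observations straight to actions.
    return f"act on {observation}"

def semi_autonomous_step(observation, needs_human):
    # A semi-autonomous agent defers certain decisions to a human.
    if needs_human:
        return "request human decision"
    return f"act on {observation}"

print(autonomous_step("obstacle"))             # acts with no human input
print(semi_autonomous_step("obstacle", True))  # defers this decision
```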
❓ Model Choice
intermediate · 2:00 remaining
Choosing agent type for a delivery drone
You want to design a delivery drone that can navigate city streets and deliver packages. Which agent type is best if you want the drone to handle unexpected obstacles but still allow human override?
Attempts: 2 left
💡 Hint
Consider safety and flexibility in decision making.
✗ Incorrect
A semi-autonomous agent handles obstacles on its own but lets a human intervene when needed, balancing autonomy with control.
❓ Metrics
advanced · 2:00 remaining
Evaluating autonomy in agent performance
Which metric best measures how independently an agent completes tasks without human help?
Attempts: 2 left
💡 Hint
Look for a metric that reflects independence.
✗ Incorrect
The percentage of tasks completed without human intervention directly measures the agent's autonomy level.
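This metric is straightforward to compute; a minimal sketch, where the task-log format (a list of booleans) is an assumption for illustration:

```python
# Sketch: autonomy rate = tasks completed without human help / total tasks.
# The task-log format (list of booleans) is an illustrative assumption.

def autonomy_rate(task_log):
    """task_log: list of booleans, True if the task needed human help."""
    if not task_log:
        return 0.0
    independent = sum(1 for needed_help in task_log if not needed_help)
    return independent / len(task_log)

log = [False, False, True, False]   # one of four tasks needed help
print(f"{autonomy_rate(log):.0%}")  # → 75%
```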
🔧 Debug
advanced · 2:00 remaining
Identifying autonomy issue in agent behavior
An autonomous agent designed to navigate a maze keeps stopping and waiting for human input at every turn. What is the most likely cause?
Attempts: 2 left
💡 Hint
If it waits for input, it might not be fully autonomous.
✗ Incorrect
If the decision-making module is disabled, the agent cannot choose actions independently, so it falls back to waiting for human commands.
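The failure mode can be sketched with a simple flag; `decision_module_enabled` and the maze functions are hypothetical names for illustration:

```python
# Sketch: a disabled decision module forces the agent to wait for input.
# `decision_module_enabled` is an illustrative flag, not from the question.

def maze_step(decision_module_enabled, turn):
    if not decision_module_enabled:
        # Without its decision module, the agent stops at every choice point.
        return "wait for human command"
    # With the module on, the agent picks a direction itself.
    return f"turn {turn}"

print(maze_step(False, "left"))  # stops at every turn, as in the question
print(maze_step(True, "left"))   # acts independently
```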
❓ Predict Output
expert · 3:00 remaining
Output of semi-autonomous agent simulation code
What is the output of this Python code simulating a semi-autonomous agent's decision process?
class Agent:
    def __init__(self, autonomy_level):
        # 0 to 1, where 1 is fully autonomous
        self.autonomy_level = autonomy_level

    def decide(self, situation):
        if self.autonomy_level >= 0.8:
            return 'Act independently'
        elif 0.3 <= self.autonomy_level < 0.8:
            if situation == 'complex':
                return 'Request human input'
            else:
                return 'Act independently'
        else:
            return 'Wait for human command'

agent = Agent(0.5)
print(agent.decide('complex'))
Attempts: 2 left
💡 Hint
Check the autonomy_level and situation conditions carefully.
✗ Incorrect
With autonomy_level 0.5 and situation 'complex', the middle branch (0.3 <= 0.5 < 0.8) applies and the situation is complex, so the code prints 'Request human input'.