Agentic AI · ~20 mins

Function Calling in LLMs for Agentic AI - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Function Calling Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
Time limit: 1:30
How does function calling improve LLM responses?

Large Language Models (LLMs) can call external functions during a conversation. What is the main benefit of this feature?

A. It prevents the LLM from making any mistakes in grammar.
B. It allows the LLM to access real-time or external data beyond its training.
C. It makes the LLM generate longer text outputs automatically.
D. It reduces the model size by offloading computations to the function.
💡 Hint

Think about how LLMs can get information they don't know internally.
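To make the idea concrete, here is a minimal, hypothetical sketch of the pattern the question describes: instead of guessing at facts it cannot know from training data, the model emits a structured request, and the host program executes the matching function. The function name, arguments, and price data below are all illustrative, not a real API.

```python
def get_stock_price(symbol):
    # Stand-in for a live API call; the model cannot answer this from memory.
    prices = {"AAPL": 189.5}  # illustrative data
    return prices.get(symbol, 0.0)

# The model emits a structured call request instead of free text:
model_output = {"function_call": {"name": "get_stock_price",
                                  "arguments": {"symbol": "AAPL"}}}

# The host program dispatches the request to real code:
call = model_output["function_call"]
if call["name"] == "get_stock_price":
    result = get_stock_price(**call["arguments"])

print(result)  # 189.5
```

The external function supplies data the model was never trained on, which is the benefit the question is probing.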

Predict Output
intermediate
Time limit: 1:30
Output of LLM function call simulation

Given the following Python simulation of an LLM calling a function, what is the printed output?

def get_temperature(city):
    return f"The temperature in {city} is 22°C."

user_input = "What's the temperature in Paris?"

# Simulate LLM detecting function call
if "temperature" in user_input:
    city = user_input.split()[-1].strip('?')
    response = get_temperature(city)
else:
    response = "I don't know."

print(response)
A. The temperature in Paris is 22°C.
B. I don't know.
C. The temperature in Paris is 22F.
D. SyntaxError
💡 Hint

Look at how the city name is extracted and passed to the function.
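The extraction step is the crux of this problem, so here is the same logic isolated on its own (a sketch of what the snippet does, step by step):

```python
user_input = "What's the temperature in Paris?"

last_token = user_input.split()[-1]  # last whitespace-separated token: 'Paris?'
city = last_token.strip('?')         # str.strip removes the trailing '?': 'Paris'

print(city)  # Paris
```

Note that `str.strip('?')` removes `?` characters from both ends of the string, so the trailing question mark never reaches the function.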

Model Choice
advanced
Time limit: 2:00
Choosing the best LLM model for function calling

You want to build a chatbot that uses function calling to fetch live stock prices. Which model type is best suited for this?

A. A large LLM trained only on static text data without function calling.
B. A small LLM without function calling support but fine-tuned on finance data.
C. A rule-based chatbot with no machine learning.
D. A large LLM with built-in function calling and API integration capabilities.
💡 Hint

Consider which model can dynamically call external APIs for live data.

Hyperparameter
advanced
Time limit: 1:30
Effect of temperature setting on function calling outputs

When using an LLM with function calling, how does increasing the temperature hyperparameter affect the function call behavior?

A. It makes the LLM more likely to call functions randomly, even when not needed.
B. It makes the LLM more deterministic and less creative in calling functions.
C. It has no effect on function calling decisions, only on text generation style.
D. It disables function calling entirely.
💡 Hint

Think about how temperature affects randomness in LLM outputs.
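Temperature rescales the model's logits before sampling, so it shapes *every* token decision, including the tokens that trigger a function call. The sketch below (with made-up logits for three hypothetical next-token choices) shows how a higher temperature flattens the output distribution:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits, e.g. scores for "answer directly", "call a tool", "other".
logits = [2.0, 0.5, 0.1]

low_temp = softmax_with_temperature(logits, 0.5)
high_temp = softmax_with_temperature(logits, 2.0)

# At low temperature the top choice dominates; at high temperature the
# distribution flattens, so unlikely choices are sampled more often.
print(low_temp[0] > high_temp[0])  # True
```

Because a flatter distribution makes low-probability tokens more likely to be sampled, it also makes spurious or unnecessary function-call decisions more likely.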

🔧 Debug
expert
Time limit: 2:30
Why does this LLM function call fail to execute?

Consider this snippet where an LLM is supposed to call a function to get current time, but it never executes the function. What is the likely cause?

def get_current_time():
    from datetime import datetime
    return datetime.now().strftime('%H:%M:%S')

llm_response = '{"function_call": {"name": "get_current_time"}}'

# Code to parse and execute function call
import json
response_dict = json.loads(llm_response)

if response_dict.get('function_call'):
    func_name = response_dict['function_call']['name']
    if func_name == 'get_current_time':
        result = get_current_time
    else:
        result = None
else:
    result = None

print(result)
A. The JSON string is malformed and causes a parsing error.
B. The function get_current_time is not defined in the code.
C. The function get_current_time is assigned but not called, so result is the function object, not its output.
D. The print statement is missing parentheses causing a syntax error.
💡 Hint

Check if the function is called or just referenced.
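The distinction the hint points at can be demonstrated in isolation. In Python, a function name without parentheses is just a reference to the function object; only the parentheses trigger execution:

```python
from datetime import datetime

def get_current_time():
    return datetime.now().strftime('%H:%M:%S')

ref = get_current_time    # no parentheses: a reference to the function object
out = get_current_time()  # parentheses: the function actually runs

print(callable(ref))          # True  -> still just the function object
print(isinstance(out, str))   # True  -> an 'HH:MM:SS' string
```

Printing `ref` would show something like `<function get_current_time at 0x...>` rather than a timestamp, which is exactly the symptom of a call that never executed.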