
Function Calling in LLMs for Agentic AI - Model Pipeline Trace

Model Pipeline - Function calling in LLMs

This pipeline shows how a large language model (LLM) improves its responses by invoking external functions during text generation, a technique known as function calling (or tool use).

Data Flow - 4 Stages
Stage 1: User Input
  Input:  1 text string
  Action: Receive the user's question or command
  Output: 1 text string
  Example: "What is the weather in Paris today?"

Stage 2: LLM Processing
  Input:  1 text string
  Action: Analyze the input and decide if a function call is needed
  Output: 1 text string or a function call request
  Example: "Call weather_api(city='Paris')"

Stage 3: Function Call Execution
  Input:  Function call request with parameters
  Action: Invoke the external function (e.g., an API) and collect the result
  Output: Function output data
  Example: {"temperature": "18°C", "condition": "Cloudy"}

Stage 4: LLM Response Generation
  Input:  Function output data
  Action: Incorporate the function output into the final text response
  Output: 1 text string
  Example: "The weather in Paris today is 18°C and cloudy."
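The four stages above can be sketched as a small Python loop. This is a minimal illustration, not a real LLM integration: `weather_api` is a stub standing in for an external service, and `llm_decide` is a hypothetical stand-in for the model's decision step.

```python
# Hypothetical tool: in a real system this would call a live weather API.
def weather_api(city):
    """Stub standing in for an external weather service."""
    return {"temperature": "18°C", "condition": "Cloudy"}

TOOLS = {"weather_api": weather_api}

def llm_decide(user_input):
    """Stand-in for Stage 2: return a function call request if the
    model decides external data is needed, otherwise None."""
    if "weather" in user_input.lower():
        return {"name": "weather_api", "arguments": {"city": "Paris"}}
    return None

def run_pipeline(user_input):
    # Stages 1-2: receive the input; the model may request a function call.
    call = llm_decide(user_input)
    if call is None:
        return "I can answer that directly."
    # Stage 3: execute the requested function with its parameters.
    result = TOOLS[call["name"]](**call["arguments"])
    # Stage 4: fold the function output into the final text response.
    return (f"The weather in {call['arguments']['city']} today is "
            f"{result['temperature']} and {result['condition'].lower()}.")

print(run_pipeline("What is the weather in Paris today?"))
# → The weather in Paris today is 18°C and cloudy.
```

In production systems (e.g., OpenAI or Anthropic tool-use APIs), the decision step is made by the model itself, which emits a structured function call that the application executes before handing the result back.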
Training Trace - Epoch by Epoch
Loss
1.0 |\
0.9 | \
0.8 |  \
0.7 |   \
0.6 |    \
0.5 |     \
0.4 |      \
0.3 |       \
0.2 |        \
0.1 |         \
0.0 +----------
      1 2 3 4 5
      Epochs
Epoch | Loss ↓ | Accuracy ↑ | Observation
  1   |  0.85  |    0.60    | Model starts learning to detect when to call functions.
  2   |  0.65  |    0.72    | Improved accuracy in predicting function calls.
  3   |  0.50  |    0.81    | Better integration of function outputs into responses.
  4   |  0.38  |    0.88    | Model effectively calls functions and generates accurate answers.
  5   |  0.30  |    0.92    | Training converges with high accuracy and low loss.
Prediction Trace - 4 Layers
Layer 1: Input Text
Layer 2: Function Call Decision
Layer 3: Function Execution
Layer 4: Response Generation
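The four layers of the prediction trace can be logged explicitly. The sketch below is illustrative: the layer names come from the trace above, but `trace_prediction` and its decision rule are hypothetical, not from any real library.

```python
def trace_prediction(user_input):
    """Record which of the four prediction layers fired for one request.
    Returns a list of (layer name, value) pairs."""
    trace = [("Layer 1: Input Text", user_input)]
    # Hypothetical decision rule standing in for the model's judgment.
    needs_call = "weather" in user_input.lower()
    trace.append(("Layer 2: Function Call Decision",
                  "call weather_api" if needs_call else "answer directly"))
    if needs_call:
        # Stubbed function result; a real system would invoke the API here.
        trace.append(("Layer 3: Function Execution",
                      {"temperature": "18°C", "condition": "Cloudy"}))
    trace.append(("Layer 4: Response Generation", "final text response"))
    return trace

for layer, value in trace_prediction("What is the weather in Paris today?"):
    print(f"{layer}: {value}")
```

Note that Layer 3 is conditional: when the model decides no function call is needed, the trace skips straight from the decision to response generation.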
Model Quiz - 3 Questions
Test your understanding
Q1. What does the LLM do after receiving the user input?
  A. Decides if a function call is needed
  B. Immediately calls an external function
  C. Returns a random answer
  D. Ignores the input
Key Insight
Function calling lets LLMs fetch real-time data or perform actions by invoking external functions, grounding their answers in live information and making them more accurate and useful.