
Function Calling in LLMs for Agentic AI

Introduction
Function calling lets large language models (LLMs) invoke external tools or code to fetch precise data or perform tasks that go beyond generating text. It is useful in situations such as:
When you want the LLM to fetch real-time data like weather or stock prices.
When the LLM needs to perform calculations or access a database.
When you want the LLM to control smart devices or apps by calling specific functions.
When you want to keep answers accurate by letting the LLM ask a trusted function instead of guessing.
When building chatbots that can do tasks like booking tickets or setting reminders.
Syntax
Python
function_call = {
    "name": "function_name",
    "arguments": {
        "param1": "value1",
        "param2": "value2"
    }
}

response = llm.generate(prompt, function_call=function_call)
The function_call object tells the LLM which function to run and with what inputs.
The LLM can decide when to call a function based on the conversation or prompt.
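Once the model has produced a function_call object in the shape above, the application side has to look up the named function and run it with the given arguments. The sketch below is illustrative, assuming a hypothetical get_time function and a simple dict-based registry rather than any real LLM API.

```python
def get_time(timezone):
    # Stub: a real implementation would query a clock or time service.
    return f"time in {timezone}"

# Registry mapping function names (as the LLM sees them) to Python callables.
REGISTRY = {"get_time": get_time}

def execute_function_call(function_call):
    """Look up the requested function and call it with the given arguments."""
    name = function_call["name"]
    args = function_call.get("arguments", {})
    func = REGISTRY.get(name)
    if func is None:
        raise ValueError(f"Unknown function: {name}")
    return func(**args)

result = execute_function_call(
    {"name": "get_time", "arguments": {"timezone": "UTC"}}
)
print(result)  # time in UTC
```

The registry is the key design choice: the LLM only ever names functions, so the application decides exactly what each name is allowed to do.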
Examples
This tells the LLM to call the 'get_current_weather' function with the location set to New York.
Python
function_call = {
    "name": "get_current_weather",
    "arguments": {
        "location": "New York"
    }
}
This instructs the LLM to call a function that sums numbers 5, 10, and 15.
Python
function_call = {
    "name": "calculate_sum",
    "arguments": {
        "numbers": [5, 10, 15]
    }
}
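The function itself is ordinary code; the only LLM-specific part is that the inputs arrive as an arguments dict. A minimal sketch:

```python
def calculate_sum(numbers):
    # Plain Python does the arithmetic; the LLM only supplies the inputs.
    return sum(numbers)

function_call = {
    "name": "calculate_sum",
    "arguments": {"numbers": [5, 10, 15]},
}

total = calculate_sum(**function_call["arguments"])
print(total)  # 30
```

This is why function calling keeps answers accurate: the model delegates the calculation instead of guessing the result in text.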
Sample Model
This simple example shows how an LLM can call a function named 'get_greeting' with a name argument to return a personalized greeting.
Python
class SimpleLLM:
    def generate(self, prompt, function_call=None):
        if function_call:
            name = function_call.get("name")
            args = function_call.get("arguments", {})
            if name == "get_greeting":
                name_arg = args.get("name", "there")
                return f"Hello, {name_arg}!"
            else:
                return "Function not found."
        return "No function called."

llm = SimpleLLM()

# Call the function 'get_greeting' with argument 'name' = 'Alice'
function_call = {
    "name": "get_greeting",
    "arguments": {
        "name": "Alice"
    }
}

response = llm.generate("Say hello", function_call=function_call)
print(response)
Output
Hello, Alice!
Important Notes
Function calling lets LLMs do more than just chat; they can run code or get data.
You need to define the functions the LLM can call and how to handle their inputs and outputs.
Always validate that the function names and arguments the LLM produces match what your code actually defines.
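The validation point above can be sketched as a check run before dispatching any call. The schema format here is an illustrative assumption, not a standard; real systems often use JSON Schema for this.

```python
# Hypothetical per-function schemas listing required argument names.
SCHEMAS = {
    "get_greeting": {"required": ["name"]},
}

def validate_call(function_call):
    """Return (ok, message) for a proposed function call."""
    name = function_call.get("name")
    schema = SCHEMAS.get(name)
    if schema is None:
        return False, f"unknown function: {name}"
    args = function_call.get("arguments", {})
    missing = [p for p in schema["required"] if p not in args]
    if missing:
        return False, f"missing arguments: {missing}"
    return True, "ok"

ok, msg = validate_call({"name": "get_greeting", "arguments": {"name": "Alice"}})
print(ok, msg)  # True ok
```

Rejecting malformed calls early gives the model a clear error message to retry with, instead of crashing the application.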
Summary
Function calling connects LLMs to external tools or code for better answers.
It works by telling the LLM which function to run and with what inputs.
This helps build smarter apps like chatbots that can do real tasks.