How to Use OpenAI Function Calling for Agent Integration
Use function_call in OpenAI API requests to let the agent call predefined functions by specifying their name and parameters. Define each function with a JSON schema, then handle the model's function-call response: execute the function and send the result back to the model. Note that newer API versions expose the same capability via tools and tool_choice; the functions/function_call form shown here still works but is marked deprecated.

Syntax
The OpenAI function calling syntax involves defining functions with their names, descriptions, and parameter schemas, then including them in the functions field of the API request. The model can then respond with a function_call object specifying which function to call and with what arguments.
Key parts:
- functions: List of function definitions with JSON schemas for their parameters.
- function_call: Controls whether the model may call a function automatically ("auto") or not at all ("none").
- messages: Conversation messages, including user input and assistant responses.
```python
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state, e.g. San Francisco, CA"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
]

response = openai.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    functions=functions,
    function_call="auto"
)
```

Example
This example shows how to define a weather function, send a user message, let the model call the function, then execute the function and send the result back to the model for a final answer.
```python
import json

import openai

openai.api_key = "YOUR_API_KEY"

def get_current_weather(location, unit="celsius"):
    # Dummy weather data for demonstration
    return {"location": location, "temperature": 22, "unit": unit, "forecast": "sunny"}

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state, e.g. Boston, MA"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
]

messages = [
    {"role": "user", "content": "What's the weather like in Boston?"}
]

response = openai.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
    functions=functions,
    function_call="auto"
)

message = response.choices[0].message
if message.function_call:
    function_name = message.function_call.name
    arguments = json.loads(message.function_call.arguments)
    if function_name == "get_current_weather":
        weather = get_current_weather(**arguments)
        # Append the assistant's function call, then the function result
        messages.append({
            "role": "assistant",
            "content": None,
            "function_call": {
                "name": function_name,
                "arguments": message.function_call.arguments,
            },
        })
        messages.append({
            "role": "function",
            "name": function_name,
            "content": json.dumps(weather),
        })
        second_response = openai.chat.completions.create(
            model="gpt-4o-mini",
            messages=messages
        )
        print(second_response.choices[0].message.content)
```
Output
The current weather in Boston is 22 degrees celsius with sunny skies.
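When an agent exposes more than one function, a lookup table keeps the dispatch step tidy. The sketch below is illustrative, not part of the OpenAI SDK: the registry and dispatch_function_call helper are names chosen for this example.

```python
import json

def get_current_weather(location, unit="celsius"):
    # Dummy weather data for demonstration
    return {"location": location, "temperature": 22, "unit": unit}

# Hypothetical registry mapping function names from the model to local callables
FUNCTION_REGISTRY = {
    "get_current_weather": get_current_weather,
}

def dispatch_function_call(name, arguments_json):
    """Look up the function the model requested and call it with parsed args."""
    func = FUNCTION_REGISTRY.get(name)
    if func is None:
        raise ValueError(f"Model requested unknown function: {name}")
    arguments = json.loads(arguments_json)
    return func(**arguments)

result = dispatch_function_call("get_current_weather", '{"location": "Boston, MA"}')
print(result["temperature"])  # 22
```

The explicit ValueError for unknown names matters in practice: the model can occasionally request a function you never defined, and a silent failure is harder to debug than a raised error.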
Common Pitfalls
- Not defining the function schema correctly causes the model to ignore function calls.
- Forgetting to parse function_call.arguments as JSON leads to errors.
- Not sending the function response back to the model prevents it from generating a final answer.
- Setting function_call to "none" disables automatic function calling.
```python
# Wrong: missing required parameter in schema
functions = [
    {
        "name": "get_current_weather",
        "description": "Get weather",
        "parameters": {
            "type": "object",
            "properties": {
                "unit": {"type": "string"}
            },
            "required": []  # location missing here
        }
    }
]

# Right: include required parameters
functions = [
    {
        "name": "get_current_weather",
        "description": "Get weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "unit": {"type": "string"}
            },
            "required": ["location"]
        }
    }
]
```
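The arguments string the model returns is not guaranteed to be valid JSON, so it is worth parsing it defensively before calling anything. A minimal sketch; the helper name parse_function_arguments is illustrative, not an SDK function:

```python
import json

def parse_function_arguments(arguments_json):
    """Parse the model's arguments string, returning None when it is malformed."""
    try:
        return json.loads(arguments_json)
    except json.JSONDecodeError:
        return None

good = parse_function_arguments('{"location": "Boston, MA"}')  # dict
bad = parse_function_arguments("{not valid json")              # None
```

Returning None (or re-prompting the model) on malformed arguments avoids an unhandled exception taking down the whole agent loop.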
Quick Reference
- functions: List your callable functions with JSON schema.
- function_call: Use "auto" to let the model decide when to call functions.
- Parse arguments: Always parse function_call.arguments as JSON before use.
- Send function results: Add a message with role function and the result content back to the conversation.
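The last step in the list above, building the role-"function" message, can be wrapped in a small helper. This is a sketch; the helper name function_result_message is illustrative:

```python
import json

def function_result_message(function_name, result):
    """Serialize a function's return value into a role="function" message."""
    return {
        "role": "function",
        "name": function_name,
        "content": json.dumps(result),
    }

msg = function_result_message(
    "get_current_weather",
    {"location": "Boston, MA", "temperature": 22},
)
# msg is ready to append to the messages list before the follow-up API call
```

Serializing the result with json.dumps keeps the content field a string, which is what the API expects for function messages.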
Key Takeaways
- Define functions with clear JSON schemas to enable OpenAI to call them correctly.
- Set function_call to "auto" to let the model trigger function calls based on user input.
- Parse the function_call arguments as JSON before executing the function.
- Send the function's output back to the model as a function-role message for a complete response.
- Check your function schemas carefully to avoid missing required parameters.