LangChain framework · ~30 mins

LangChain vs. Direct API Calls: A Hands-On Comparison

LangChain vs Direct API Calls
📖 Scenario: You want to build a simple program that sends a question to an AI language model. You will set up the question, configure the API key, call the model directly through the OpenAI API, and then call the same model through LangChain. This will let you see how the two approaches differ.
🎯 Goal: Build a Python script that sends a question to an AI language model in two ways: first by calling the OpenAI API directly, and second through LangChain's ChatOpenAI wrapper. You will then compare how the two versions of the code look and work.
📋 What You'll Learn
Create a variable called question with the exact string: 'What is the capital of France?'
Create a variable called api_key with the exact string: 'test-api-key'
Write code to call the OpenAI API directly using openai.ChatCompletion.create with the question and api_key
Write code to call the OpenAI model using LangChain's ChatOpenAI class with the question and api_key
💡 Why This Matters
🌍 Real World
Developers often need to interact with AI models. They can call APIs directly or use frameworks like LangChain to simplify and organize their code.
💼 Career
Understanding both direct API calls and using frameworks like LangChain is valuable for AI developers, data scientists, and software engineers working with language models.
1
DATA SETUP: Create the question variable
Create a variable called question and set it to the string 'What is the capital of France?'
Need a hint?

Use a simple assignment like question = 'What is the capital of France?'

2
CONFIGURATION: Add the API key variable
Create a variable called api_key and set it to the string 'test-api-key'
Need a hint?

Use a simple assignment like api_key = 'test-api-key'

3
CORE LOGIC: Call OpenAI API directly
Write code to call the OpenAI API directly using openai.ChatCompletion.create with the question and api_key. Assign the response to a variable called direct_response. Use the model 'gpt-4' and pass the message as [{'role': 'user', 'content': question}]. Set the API key using openai.api_key = api_key.
Need a hint?

Remember to import openai and set openai.api_key = api_key before calling openai.ChatCompletion.create.
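Putting the hint together, the direct call might look like the sketch below. This uses the legacy (pre-1.0) `openai` SDK interface named in this step; note that the placeholder key `'test-api-key'` would be rejected by the real server, so the network call is deferred into a function rather than executed immediately.

```python
question = 'What is the capital of France?'
api_key = 'test-api-key'

# Chat-completions message payload: a list of role/content dictionaries.
messages = [{'role': 'user', 'content': question}]

def ask_directly():
    # Requires the openai package installed and a real API key;
    # with the placeholder key this call would raise an auth error.
    import openai
    openai.api_key = api_key
    return openai.ChatCompletion.create(model='gpt-4', messages=messages)
```

Calling `ask_directly()` with a valid key returns a response object whose answer text lives at `response['choices'][0]['message']['content']`.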

4
COMPLETION: Call OpenAI model using LangChain
Write code to call the OpenAI model using LangChain's ChatOpenAI class. Import ChatOpenAI from langchain.chat_models. Create an instance called llm with model_name='gpt-4' and openai_api_key=api_key. Then call llm(question) and assign the result to langchain_response.
Need a hint?

Import ChatOpenAI from langchain.chat_models. Create llm with the API key and model name. Use llm(question) to get the response.
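The LangChain version might look like the sketch below, following the older `langchain.chat_models` import path this step names. As in the direct-call step, the placeholder key cannot authenticate, so the call is wrapped in a function. Be aware that some LangChain versions expect `llm` to be called with a list of messages (e.g. `[HumanMessage(content=question)]`) rather than a bare string.

```python
question = 'What is the capital of France?'
api_key = 'test-api-key'

def ask_via_langchain():
    # Requires the langchain package and a valid OpenAI key.
    from langchain.chat_models import ChatOpenAI
    llm = ChatOpenAI(model_name='gpt-4', openai_api_key=api_key)
    # Invoke the wrapped model with the question, as this step instructs.
    return llm(question)
```

Compared with the direct call, LangChain hides the payload construction: you configure the model once when creating `llm`, then invoke it like a function, which keeps application code shorter when you chain multiple calls together.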