LangChain framework · ~15 mins

Setting up LangSmith tracing in LangChain - Try It Yourself

📖 Scenario: You are building a simple Python program that uses LangChain to call a language model. You want to track the calls your program makes to the model using LangSmith tracing, so you can see the inputs, outputs, and metadata of every call.
🎯 Goal: Set up LangSmith tracing in your LangChain program to automatically record all language model calls.
📋 What You'll Learn
Create a LangChain OpenAI language model instance
Create a LangSmith tracer instance
Attach the LangSmith tracer to the language model
Run the language model with a prompt and have tracing enabled
💡 Why This Matters
🌍 Real World
Tracing language model calls helps developers debug and understand how their AI programs behave. LangSmith tracing records inputs, outputs, and metadata for each call.
💼 Career
Many AI developer roles require setting up monitoring and tracing for language model applications to ensure reliability and improve performance.
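Before any of the steps below will actually upload traces, LangSmith needs credentials. A sketch of the usual environment setup, assuming the standard LangSmith environment variables (all values here are placeholders to replace with your own):

```shell
# Authenticates trace uploads to LangSmith.
export LANGCHAIN_API_KEY="<your-langsmith-api-key>"
# Optional: group runs under a named project in the LangSmith UI.
export LANGCHAIN_PROJECT="my-first-traces"
# Optional: trace every LangChain call globally, even without attaching
# a tracer by hand as we do in the steps below.
export LANGCHAIN_TRACING_V2=true
# Needed by the OpenAI LLM itself.
export OPENAI_API_KEY="<your-openai-api-key>"
```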
Progress0 / 4 steps
1
Create the OpenAI language model instance
Write code to import OpenAI from langchain.llms and create a variable called llm that is an instance of OpenAI with temperature=0.
LangChain
Need a hint?

Use from langchain.llms import OpenAI to import the model, then create llm = OpenAI(temperature=0).

2
Create the LangSmith tracer instance
Write code to import LangChainTracer from langchain.callbacks.tracers and create a variable called tracer that is an instance of LangChainTracer.
LangChain
Need a hint?

Import the tracer with from langchain.callbacks.tracers import LangChainTracer, then create it with tracer = LangChainTracer().

3
Attach the tracer to the language model
Write code to attach the tracer to the llm instance by setting its callbacks attribute to a list containing the tracer.
LangChain
Need a hint?

Assign llm.callbacks = [tracer] to enable tracing on every call.

4
Run the language model with tracing enabled
Write code to call llm with the prompt "Hello, LangSmith!" and assign the result to a variable called response.
LangChain
Need a hint?

Call llm.invoke("Hello, LangSmith!") and assign the result to response.