
Setting up LangSmith tracing in LangChain

Introduction

LangSmith tracing lets you watch what your LangChain app does step by step. It records the inputs, outputs, and intermediate calls of each run so you can understand and debug your app easily.

You want to see how your LangChain app processes inputs and outputs.
You need to find bugs or unexpected behavior in your LangChain workflows.
You want to save detailed logs of your app's actions for later review.
You are building a complex app and want to track each step clearly.
Syntax
LangChain
import os

# Enable LangSmith tracing before creating any LangChain components
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "lsv2_your-key-here"

# Use your LangChain components as usual - tracing is automatic
from langchain_openai import ChatOpenAI

llm = ChatOpenAI()

Set the LANGCHAIN_TRACING_V2="true" and LANGCHAIN_API_KEY environment variables to start tracing.

LangChain components automatically send traces to LangSmith when env vars are set.
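Since the two environment variables must be set before any traced calls run, it can help to wrap them in a small setup helper. The sketch below is one possible way to do this (the `enable_tracing` helper is not part of LangChain); it also sets the optional LANGCHAIN_PROJECT variable, which LangSmith uses to group traces (runs go to the "default" project when it is unset).

```python
import os

def enable_tracing(api_key, project=None):
    """Turn on LangSmith tracing for all LangChain calls in this process.

    Hypothetical helper: it only sets the documented environment variables.
    """
    os.environ["LANGCHAIN_TRACING_V2"] = "true"
    os.environ["LANGCHAIN_API_KEY"] = api_key
    if project:
        # Optional: group this run's traces under a named project
        os.environ["LANGCHAIN_PROJECT"] = project

enable_tracing("lsv2_your-key-here", project="my-first-traces")
```

Call the helper once at startup, before constructing any models or chains, so every subsequent invocation is traced.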

Examples
This example sets up tracing and runs a simple prompt with tracing enabled.
LangChain
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "lsv2_your-key-here"
from langchain_openai import ChatOpenAI

llm = ChatOpenAI()
response = llm.invoke("Hello!")
This example traces an entire chain, so the whole workflow is captured in a single trace automatically.
LangChain
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "lsv2_your-key-here"
from langchain_openai import ChatOpenAI
from langchain_core.prompts import PromptTemplate

llm = ChatOpenAI()
prompt = PromptTemplate.from_template("Echo: {input}")
chain = prompt | llm  # LCEL: pipe the prompt into the model
result = chain.invoke({"input": "Input text"}).content
Sample Program

This program sets up LangSmith tracing for a ChatOpenAI LLM by configuring environment variables. It runs a prompt asking for the capital of France and prints the response. View the trace in the LangSmith dashboard.

LangChain
import os
from langchain_openai import ChatOpenAI

# Set environment variables for LangSmith tracing
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "lsv2_your-key-here"

# Create a ChatOpenAI LLM with tracing enabled automatically
llm = ChatOpenAI()

# Run a prompt
response = llm.invoke("What is the capital of France?")

print("Response:", response.content)
Important Notes

Make sure you have langchain, langsmith, and langchain-openai (which installs the openai client) installed, and set OPENAI_API_KEY for the model itself. Get your LangSmith API key from LangSmith.

Tracing can slow down your app slightly because it records extra details.

You can view traces in LangSmith's dashboard or export them for analysis.
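Besides the dashboard, traces can be pulled programmatically with the langsmith SDK's `Client.list_runs`. The sketch below is one possible shape for this (the `fetch_recent_runs` helper is hypothetical; it assumes the langsmith package is installed and LANGCHAIN_API_KEY is set in the environment).

```python
def fetch_recent_runs(project_name="default", limit=5):
    """Sketch: fetch the most recent traced runs from a LangSmith project."""
    # Imported inside the function so the sketch loads even without the SDK
    from langsmith import Client

    client = Client()  # reads LANGCHAIN_API_KEY from the environment
    # list_runs yields Run objects for the given project
    return list(client.list_runs(project_name=project_name, limit=limit))
```

Each returned run carries the inputs, outputs, and timing that the dashboard shows, so you can feed them into your own analysis scripts.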

Summary

LangSmith tracing helps you watch your LangChain app's steps clearly.

Enable tracing by setting LANGCHAIN_TRACING_V2="true" and LANGCHAIN_API_KEY.

Use tracing to debug, understand, and improve your LangChain workflows.