LangChain framework · ~10 mins

Setting up LangSmith tracing in LangChain - Visual Walkthrough

Concept Flow - Setting up LangSmith tracing
Import LangChainTracer
Create tracer instance
Configure LangChain to use tracer
Run LangChain operations
Tracer collects and sends data
View traces in LangSmith UI
This flow shows how to import, create, and attach a LangChainTracer, then run LangChain code that sends trace data to LangSmith.
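As a side note, recent LangChain versions can also enable LangSmith tracing with no code changes to your chains, purely via environment variables. A minimal sketch (the key and project name below are placeholders, not real credentials):

```python
import os

# Placeholder values for illustration; substitute your real LangSmith API key.
os.environ["LANGCHAIN_TRACING_V2"] = "true"        # enable LangSmith tracing globally
os.environ["LANGCHAIN_API_KEY"] = "<your-api-key>" # authenticates against LangSmith
os.environ["LANGCHAIN_PROJECT"] = "my-project"     # optional: group traces under a project

# Any LangChain chain run after this point is traced automatically,
# with no explicit tracer object required.
```

With these variables set, the explicit tracer steps in the flow above become optional.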
Execution Sample
Python
from langchain_openai import OpenAI            # pip install langchain-openai
from langchain_core.tracers import LangChainTracer
from langchain.chains import LLMChain

tracer = LangChainTracer()   # requires LANGCHAIN_API_KEY to reach LangSmith
llm = OpenAI()
llm.callbacks = [tracer]     # attach the tracer to the LLM

chain = LLMChain(llm=llm, prompt=some_prompt)  # some_prompt: your PromptTemplate
result = chain.run("Hello")
This code creates a LangChainTracer, attaches it to an OpenAI LLM through its callbacks, then runs a chain; the traced call and its output are sent to LangSmith.
Execution Table
| Step | Action | Evaluation | Result |
|------|--------|------------|--------|
| 1 | Import LangChainTracer | Module found | LangChainTracer class ready |
| 2 | Create tracer instance | tracer = LangChainTracer() | tracer object created |
| 3 | Create OpenAI LLM instance | llm = OpenAI() | llm object created |
| 4 | Attach tracer to llm | llm.callbacks = [tracer] | llm now sends traces |
| 5 | Create LLMChain with llm | chain = LLMChain(llm=llm, prompt=some_prompt) | chain ready to run |
| 6 | Run chain with input 'Hello' | chain.run('Hello') | Output generated, trace sent |
| 7 | Tracer collects data | Trace data captured | Data sent to LangSmith UI |
| 8 | View traces | Open LangSmith UI | Trace visible for analysis |
💡 Tracing setup complete and data sent after chain run
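The mechanics behind steps 6 and 7 can be sketched without LangChain at all: a tracer is essentially a callback object that the chain notifies at the start and end of a model call. `ToyTracer` and `ToyChain` below are hypothetical stand-ins for illustration, not LangChain APIs:

```python
class ToyTracer:
    """Collects (event, payload) pairs, as a real tracer collects run data."""
    def __init__(self):
        self.events = []
    def on_start(self, prompt):
        self.events.append(("start", prompt))
    def on_end(self, output):
        self.events.append(("end", output))

class ToyChain:
    """Notifies its callbacks around a stand-in 'model call'."""
    def __init__(self, callbacks):
        self.callbacks = callbacks
    def run(self, text):
        for cb in self.callbacks:
            cb.on_start(text)
        output = text.upper()          # stand-in for an LLM call
        for cb in self.callbacks:
            cb.on_end(output)
        return output

tracer = ToyTracer()
chain = ToyChain(callbacks=[tracer])
result = chain.run("Hello")
print(result)          # HELLO
print(tracer.events)   # [('start', 'Hello'), ('end', 'HELLO')]
```

In the real library, the collected events are shipped to LangSmith instead of being kept in a local list.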
Variable Tracker
| Variable | Start | After Step 2 | After Step 3 | After Step 4 | After Step 5 | After Step 6 | Final |
|----------|-------|--------------|--------------|--------------|--------------|--------------|-------|
| tracer | undefined | LangChainTracer instance | LangChainTracer instance | LangChainTracer instance | LangChainTracer instance | LangChainTracer instance | LangChainTracer instance |
| llm | undefined | undefined | OpenAI instance | OpenAI instance with tracer | OpenAI instance with tracer | OpenAI instance with tracer | OpenAI instance with tracer |
| chain | undefined | undefined | undefined | undefined | LLMChain instance | LLMChain instance | LLMChain instance |
| result | undefined | undefined | undefined | undefined | undefined | Output string | Output string |
Key Moments - 2 Insights
Why do we attach the tracer to the LLM instead of the chain?
The tracer is attached to the LLM's callbacks because it is the LLM whose calls and responses we want to capture. The chain delegates to the LLM, so tracing at the LLM level covers the core operations. This is shown in execution table step 4, where llm.callbacks is set.
What happens if we run the chain without setting the tracer?
If no tracer is attached to the LLM, the chain still runs and produces output, but no trace data is collected or sent. This is reflected in the variable tracker, where tracing only becomes active after step 4.
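The second insight can be demonstrated with a small self-contained sketch (`fake_llm_call` is a made-up stand-in for a traced model call, not a LangChain function): output is produced either way, but events are only recorded when a callback is attached.

```python
collected = []  # stands in for the tracer's event buffer

def fake_llm_call(prompt, callbacks):
    # Notify callbacks before and after the "model" runs, as a tracer would.
    for cb in callbacks:
        cb(("start", prompt))
    output = f"echo: {prompt}"
    for cb in callbacks:
        cb(("end", output))
    return output

traced = fake_llm_call("Hello", callbacks=[collected.append])
untraced = fake_llm_call("Hello", callbacks=[])

print(traced == untraced)  # True — the output is identical either way
print(len(collected))      # 2  — but only the traced call recorded events
```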
Visual Quiz - 3 Questions
Test your understanding
Looking at the execution table, what happens at step 6 when chain.run('Hello') is called?
A. The OpenAI LLM is instantiated
B. The tracer is created
C. The chain generates output and sends trace data
D. The prompt is defined
💡 Hint
Refer to the execution table row for step 6, which describes the chain.run action and its result.
According to the variable tracker, when does the tracer variable get its instance value?
A. After Step 2
B. After Step 3
C. After Step 5
D. After Step 6
💡 Hint
Check the variable tracker row for tracer and see when it changes from undefined.
If we did not attach the tracer to the LLM's callbacks, what would change in the execution?
A. The chain would fail to run
B. Trace data would not be collected during chain.run
C. The tracer would not be created
D. The prompt would not be set
💡 Hint
Look at the key moments explaining why the tracer is attached to the LLM.
Concept Snapshot
Setting up LangSmith tracing:
1. Import LangChainTracer
2. Create tracer instance
3. Attach the tracer to your LLM's callbacks (e.g., OpenAI)
4. Run your LangChain code
5. Tracer collects and sends trace data
6. View traces in LangSmith UI
Full Transcript
To set up LangSmith tracing, first import the LangChainTracer class and create an instance of it. Next, create your language model object, such as OpenAI. Attach the tracer to the llm by adding it to the model's callbacks (llm.callbacks = [tracer]). After this, create your LangChain chain using the llm. When you run the chain with input, the tracer collects data about the model's calls and responses and sends it to LangSmith, where you can view detailed traces of your language model usage. Without a tracer attached (and with no environment-variable tracing enabled), the chain still runs successfully, but no trace data is collected.