LangChain framework (~20 mins)

Setting up LangSmith tracing in LangChain - Practice Exercises

Challenge - 5 Problems
🧩 Component Behavior
intermediate
What is the output when LangSmith tracing is enabled with a LangChain LLM?

Consider the following Python code snippet using LangChain with LangSmith tracing enabled. What will be the behavior or output when the LLM is called?

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
from langchain_experimental.langsmith import LangSmithTracer

tracer = LangSmithTracer()
tracer.start()

llm = ChatOpenAI(model_name="gpt-4", temperature=0)
response = llm.invoke([HumanMessage(content="Hello, LangSmith!")])

tracer.end()
A. The code runs, but the tracer.start() call disables the LLM from generating any response.
B. The code raises a runtime error because LangSmithTracer requires additional configuration before start().
C. The LLM response is printed to the console, but no tracing data is recorded or sent.
D. The LLM generates a response, and the interaction is recorded and sent to the LangSmith tracing dashboard.
💡 Hint

Think about what enabling tracing with LangSmithTracer does in LangChain.

📝 Syntax
intermediate
Which option correctly initializes LangSmithTracer for tracing?

Choose the correct way to create a LangSmithTracer instance for tracing LangChain calls.

A. tracer = LangSmithTracer.init()
B. tracer = LangSmithTracer()
C. tracer = LangSmithTracer.start()
D. tracer = LangSmithTracer(enable=True)
💡 Hint

Look for the constructor method to create an instance.

🔧 Debug
advanced
Why does this LangSmith tracing code fail to record traces?

Given the code below, why are no traces recorded in LangSmith?

from langchain_experimental.langsmith import LangSmithTracer
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

tracer = LangSmithTracer()

llm = ChatOpenAI(model_name="gpt-4")
response = llm.invoke([HumanMessage(content="Trace this")])

tracer.end()
A. Because tracer.start() was never called to activate tracing.
B. Because LangSmithTracer requires an API key argument during initialization.
C. Because the LLM model_name is invalid and causes a silent failure.
D. Because tracer.end() must be called before invoking the LLM.
💡 Hint

Check the tracer lifecycle methods needed to enable tracing.

🔄 State Output
advanced
What is the state of LangSmithTracer after calling start() and end()?

After running the following code, what is the state of the tracer object?

tracer = LangSmithTracer()
tracer.start()
# some LLM calls
tracer.end()
A. The tracer is destroyed and cannot be restarted.
B. The tracer remains active and continues recording traces.
C. The tracer is inactive and no longer recording traces.
D. The tracer resets and deletes all recorded trace data.
💡 Hint

Think about what end() means in a start/end lifecycle.
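To reason about this question, it can help to model the start/end lifecycle concretely. The sketch below is a hypothetical stand-in class (not the real LangSmithTracer API) that illustrates what a start/end lifecycle typically implies: end() deactivates recording but does not destroy the object or delete already-recorded data.

```python
# A minimal stand-in tracer illustrating a start()/end() lifecycle.
# Hypothetical sketch only; LifecycleTracer is not a real LangChain class.
class LifecycleTracer:
    def __init__(self):
        self.active = False   # tracer starts out inactive
        self.traces = []      # recorded trace data is kept, not deleted

    def start(self):
        self.active = True    # begin recording

    def record(self, event):
        if self.active:       # events are only captured while active
            self.traces.append(event)

    def end(self):
        self.active = False   # stop recording; existing traces remain

tracer = LifecycleTracer()
tracer.start()
tracer.record("llm call")
tracer.end()

print(tracer.active)   # False: inactive, no longer recording
print(tracer.traces)   # ['llm call']: prior traces are preserved
```

In this model, a call to record() after end() is silently ignored, which matches the idea that the tracer ends up inactive rather than destroyed or reset.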

🧠 Conceptual
expert
Which statement best describes LangSmith tracing integration with LangChain?

Choose the most accurate description of how LangSmith tracing works with LangChain.

A. LangSmithTracer hooks into LangChain's internal call stack to capture inputs, outputs, and metadata automatically during LLM calls.
B. LangSmithTracer requires manual instrumentation of every LangChain component to record traces.
C. LangSmithTracer replaces LangChain's LLM classes with its own versions to enable tracing.
D. LangSmithTracer only records errors and exceptions raised by LangChain components.
💡 Hint

Consider how tracing tools usually integrate with libraries to capture data.
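The hook-based integration pattern the question alludes to can be sketched in a few lines. Below is a hypothetical callback mechanism (Tracer and FakeLLM are invented names, not LangChain APIs): the framework notifies every registered callback on each call, so the tracer captures inputs and outputs automatically without per-call instrumentation.

```python
# Hypothetical sketch of callback-based tracing; not real LangChain/LangSmith APIs.
class Tracer:
    def __init__(self):
        self.runs = []

    def on_call(self, inputs, outputs):
        # Capture inputs and outputs for each call the framework reports.
        self.runs.append({"inputs": inputs, "outputs": outputs})

class FakeLLM:
    def __init__(self, callbacks=None):
        self.callbacks = callbacks or []

    def invoke(self, prompt):
        # The framework itself notifies every registered callback,
        # so callers never instrument individual calls by hand.
        output = f"echo: {prompt}"
        for cb in self.callbacks:
            cb.on_call(prompt, output)
        return output

tracer = Tracer()
llm = FakeLLM(callbacks=[tracer])
llm.invoke("Hello, LangSmith!")
print(tracer.runs)
# [{'inputs': 'Hello, LangSmith!', 'outputs': 'echo: Hello, LangSmith!'}]
```

This is the design choice behind option-style "automatic capture" claims: the tracer plugs into the framework's call path rather than wrapping or replacing its classes.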