Medium · Debug · Q6 of 15
LangChain - LangSmith Observability
You wrote this code but no tracing data appears in LangSmith dashboard. What is the likely error?
from langchain.llms import OpenAI
from langchain.callbacks import LangSmithTracer
tracer = LangSmithTracer()
llm = OpenAI()
llm('Hello', callbacks=[tracer])
A. You must call tracer.start() before using it.
B. Callbacks should be passed when creating the LLM, not during the call.
C. The OpenAI class does not support callbacks at all.
D. LangSmithTracer must be imported from langchain.tracing, not langchain.callbacks.
Step-by-Step Solution
  1. Check how callbacks are passed to the LLM: callbacks should be set when the LLM instance is created, not in the call method.
  2. Identify the error in the code: the callbacks argument in llm('Hello', callbacks=[tracer]) is ignored, so no trace data ever reaches the LangSmith dashboard. The correct usage is llm = OpenAI(callbacks=[tracer]).
  3. Final answer: callbacks should be passed when creating the LLM, not during the call. -> Option B
  4. Quick check: callback location matters, and only callbacks bound at construction fire. [OK]
Quick Trick: Pass callbacks when creating the LLM instance. [OK]
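The construction-time vs call-time distinction can be sketched with a small, self-contained toy (plain Python, no LangChain dependency; ToyLLM is a hypothetical stand-in that mimics the behavior the question describes, where call-time callbacks are silently ignored):

```python
class ToyLLM:
    """Hypothetical stand-in for an LLM wrapper that only honors
    callbacks bound at construction time."""

    def __init__(self, callbacks=None):
        # Callbacks attached here will fire on every call.
        self.callbacks = callbacks or []

    def __call__(self, prompt, **ignored_kwargs):
        # Call-time kwargs such as callbacks=[...] are silently dropped,
        # mirroring the bug in the question's snippet.
        for cb in self.callbacks:
            cb(prompt)
        return f"echo: {prompt}"


events = []

traced = ToyLLM(callbacks=[events.append])    # correct: bound at construction
traced("Hello")                               # callback fires, "Hello" recorded

untraced = ToyLLM()
untraced("Hello", callbacks=[events.append])  # ignored, like the buggy code

print(events)  # only the constructor-bound callback recorded anything
```

Running this shows exactly one recorded event: the call-time callbacks argument on the second instance never fires, which is the same symptom as an empty LangSmith dashboard.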
Common Mistakes:
  • Passing callbacks in the call method instead of the constructor
  • Importing the tracer from the wrong module
  • Expecting to start the tracer manually
