Challenge - 5 Problems
LangChain Debugging Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🔧 Debug
Intermediate · 2:00 remaining
Identify the error in this LangChain chain execution
Given the following LangChain code snippet, what error will it raise when executed?
LangChain
from langchain.chains import SimpleSequentialChain, LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0)

prompt1 = PromptTemplate.from_template("Repeat: {input}")
chain1 = LLMChain(llm=llm, prompt=prompt1)

prompt2 = PromptTemplate.from_template("Again: {text}")
chain2 = LLMChain(llm=llm, prompt=prompt2)

final_chain = SimpleSequentialChain(chains=[chain1, chain2], llm=llm)
output = final_chain.run("Hello")
Attempts: 2 left
💡 Hint
Check the constructor parameters for SimpleSequentialChain in LangChain.
✗ Incorrect
SimpleSequentialChain does not accept an 'llm' parameter in its constructor. Passing llm=llm raises a TypeError.
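This failure mode can be reproduced without LangChain installed: a plain Python constructor that does not declare an `llm` parameter rejects it with a TypeError. A minimal stand-in sketch (`FakeSequentialChain` is hypothetical, not a real LangChain class):

```python
class FakeSequentialChain:
    """Stand-in for SimpleSequentialChain; its constructor only declares `chains`."""

    def __init__(self, chains):
        self.chains = chains


chains = ["chain1", "chain2"]  # placeholders for real LLMChain objects

ok = FakeSequentialChain(chains=chains)  # fine: only declared parameters

try:
    FakeSequentialChain(chains=chains, llm="gpt")  # undeclared keyword
except TypeError as e:
    print(e)  # unexpected keyword argument 'llm'
```

The real classes validate constructor arguments the same way: any keyword the class does not declare is rejected before the chain ever runs.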
❓ Component Behavior
Intermediate · 1:30 remaining
What happens when a LangChain chain input key is missing?
Consider a LangChain chain expecting an input key 'question'. What happens if you run the chain with input {'query': 'What is AI?'} instead?
Attempts: 2 left
💡 Hint
Chains require specific input keys to function properly.
✗ Incorrect
LangChain chains expect specific input keys; running a chain without a required key raises a KeyError during execution.
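The mechanism is visible with the standard library alone: prompt templates are ultimately rendered with Python's `str.format`, which raises KeyError when a referenced placeholder is absent from the supplied keys. A stdlib-only sketch (no LangChain required; the template text is illustrative):

```python
template = "Answer the question: {question}"

# Correct input key: formatting succeeds.
print(template.format(question="What is AI?"))  # Answer the question: What is AI?

# Wrong input key: str.format raises KeyError('question')
# because the 'question' placeholder has no matching argument.
try:
    template.format(query="What is AI?")
except KeyError as e:
    print(f"KeyError: {e}")
```

Renaming the input dict's key to match the template's placeholder (or renaming the placeholder) resolves the error.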
❓ State Output
Advanced · 2:00 remaining
What is the output of this LangChain chain with a failing LLM call?
Given a LangChain chain that calls an LLM which raises an exception during execution, what will be the chain's output?
LangChain
from langchain.chains import LLMChain
from langchain.llms.base import LLM
from langchain.prompts import PromptTemplate

class FailingLLM(LLM):
    def _call(self, prompt, stop=None):
        raise RuntimeError('LLM failure')

    @property
    def _identifying_params(self):
        return {}

    @property
    def _llm_type(self):
        return 'failing'

failing_llm = FailingLLM()

prompt = PromptTemplate(input_variables=['text'], template='Echo: {text}')
chain = LLMChain(llm=failing_llm, prompt=prompt)

try:
    output = chain.run('test')
except Exception as e:
    output = e.args[0]
Attempts: 2 left
💡 Hint
Consider what happens when the LLM call raises an exception inside the chain.
✗ Incorrect
The LLM call raises a RuntimeError with message 'LLM failure'. The try-except captures it and sets output to the error message string.
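The control flow can be verified without LangChain: a hypothetical failing call wrapped in the same try/except yields the exception's message string, because `e.args[0]` is the first positional argument the exception was raised with.

```python
def failing_llm_call(prompt: str) -> str:
    """Stand-in for an LLM call that always fails, like FailingLLM._call above."""
    raise RuntimeError("LLM failure")


try:
    output = failing_llm_call("Echo: test")
except Exception as e:
    output = e.args[0]  # first positional argument of the exception

print(output)  # LLM failure
```

Note that `output` ends up as the plain string 'LLM failure', not the exception object itself, which is exactly what the quiz answer describes.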
📝 Syntax
Advanced · 1:30 remaining
Which option correctly initializes a LangChain SequentialChain with two chains?
You want to create a SequentialChain that runs two chains in order. Which code snippet is correct?
Attempts: 2 left
💡 Hint
Check the parameter names and types in SequentialChain constructor.
✗ Incorrect
SequentialChain expects a 'chains' parameter as a list, and 'input_variables' and 'output_variables' as lists of strings. Option B matches this signature.
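As a sketch of the shape being described (a hypothetical stand-in class mirroring the stated signature, not the real SequentialChain), the constructor takes a list of chains plus lists of input and output variable names:

```python
class FakeSequentialChain:
    """Stand-in mirroring the constructor signature described above."""

    def __init__(self, chains, input_variables, output_variables):
        # The described contract: chains is a list, and the variable
        # parameters are lists of strings naming the chain's inputs/outputs.
        assert isinstance(chains, list)
        assert all(isinstance(v, str) for v in input_variables)
        assert all(isinstance(v, str) for v in output_variables)
        self.chains = chains
        self.input_variables = input_variables
        self.output_variables = output_variables


seq = FakeSequentialChain(
    chains=["chain1", "chain2"],   # would be LLMChain objects in real code
    input_variables=["topic"],     # keys the overall chain expects
    output_variables=["summary"],  # keys the overall chain returns
)
```

The variable names (`topic`, `summary`) are illustrative; the point is the parameter names and list types, which is what distinguishes the correct option.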
🧠 Conceptual
Expert · 2:30 remaining
Why does a LangChain chain fail silently when using an incorrect input key mapping?
You have a chain expecting input key 'question' but you pass {'query': 'Hello'}. The chain runs but returns an empty string without error. Why?
Attempts: 2 left
💡 Hint
Consider how prompt templates use input variables and what happens if they don't match inputs.
✗ Incorrect
If the prompt template references a variable name that is not present in the inputs, it is substituted with an empty string, so the chain produces empty output without raising an error.
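One standard-library mechanism that produces exactly this silent behavior: rendering the template with `str.format_map` over a `collections.defaultdict(str)` fills any missing placeholder with an empty string instead of raising KeyError. A sketch of how such silent substitution can arise (the template text is illustrative):

```python
from collections import defaultdict

template = "Question: {question}"

# The caller supplies the wrong key ('query' instead of 'question').
inputs = defaultdict(str, {"query": "Hello"})

# format_map looks keys up in the mapping; the defaultdict returns ''
# for the missing 'question' key, so no KeyError is raised.
rendered = template.format_map(inputs)
print(repr(rendered))  # 'Question: '
```

Contrast this with plain `str.format`, which would raise KeyError here; the silent-empty-string failure mode appears only when missing keys are given a default, which is why it is easy to miss.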