LangChain framework · ~10 mins

Streaming responses in LangChain - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to enable streaming in the LangChain LLM call.

llm = OpenAI(streaming=[1])

A. True
B. False
C. "yes"
D. None

Common mistakes:
- Using a string like "yes" instead of the boolean True.
- Leaving streaming as False, which disables streaming.
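The completed line is `llm = OpenAI(streaming=True)`. A quick self-contained sketch of why the type matters (the `OpenAI` wrapper itself is not imported here):

```python
# Correct completion: streaming takes a boolean.
#   llm = OpenAI(streaming=True)
streaming = True
assert isinstance(streaming, bool)

# The string "yes" is truthy, so it can *appear* to work in a boolean
# context, but it is not the documented type for this parameter:
assert bool("yes") is True and not isinstance("yes", bool)
```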
Task 2: Fill in the blank (medium)

Complete the code to define a callback handler for streaming tokens.

class StreamHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs):
        print([1])

A. kwargs
B. token
C. self
D. None

Common mistakes:
- Printing kwargs instead of the token string.
- Printing self, which is the handler instance.
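The completed handler prints each token as it arrives. A minimal runnable sketch, with `BaseCallbackHandler` stubbed so it works without LangChain installed (in real code it comes from LangChain's callbacks module):

```python
class BaseCallbackHandler:
    """Stand-in for LangChain's BaseCallbackHandler."""

class StreamHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs):
        # Print each token without a trailing newline, so the streamed
        # text reads as one continuous response.
        print(token, end="", flush=True)

# Simulate an LLM emitting tokens one at a time:
handler = StreamHandler()
for tok in ["Hello", ",", " ", "world"]:
    handler.on_llm_new_token(tok)   # prints: Hello, world
```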
Task 3: Fill in the blank (hard)

Fix the error in attaching the callback handler to the LLM for streaming.

llm = OpenAI(streaming=True, callbacks=[[1]])

A. StreamHandler.on_llm_new_token
B. StreamHandler
C. StreamHandler.callback
D. StreamHandler()

Common mistakes:
- Passing the class name without parentheses.
- Passing a method instead of an instance.
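The fix is `callbacks=[StreamHandler()]`: the list expects handler *instances*. A sketch using the same stand-in base class (real code would pass this list to `OpenAI(streaming=True, callbacks=...)`):

```python
class BaseCallbackHandler:
    """Stand-in for LangChain's BaseCallbackHandler."""

class StreamHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs):
        print(token, end="")

callbacks = [StreamHandler()]   # correct: an instance, created with ()
assert isinstance(callbacks[0], BaseCallbackHandler)

# Wrong: StreamHandler without parentheses puts the *class object* in the
# list, and the class itself is not a handler instance:
assert not isinstance(StreamHandler, BaseCallbackHandler)
```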
Task 4: Fill in the blanks (hard)

Fill both blanks to create a streaming chain with a prompt and callback handler.

chain = LLMChain(llm=llm, prompt=[1], callbacks=[[2]])

A. prompt_template
B. StreamHandler()
C. llm
D. callback_handler

Common mistakes:
- Passing the LLM object as the prompt.
- Passing the callback handler class instead of an instance.
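The answer is `prompt=prompt_template, callbacks=[StreamHandler()]`. To see why each blank takes what it does, here is a minimal simulation; `FakeChain` and `FakePromptTemplate` are stand-ins invented for this sketch, not LangChain classes, and they only check each argument's role:

```python
class BaseCallbackHandler:
    """Stand-in for LangChain's base handler."""

class StreamHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs):
        print(token, end="")

class FakePromptTemplate:
    """Stand-in for a LangChain prompt template."""
    def __init__(self, template: str):
        self.template = template

class FakeChain:
    """Stand-in for LLMChain: validates each argument's role."""
    def __init__(self, llm, prompt, callbacks):
        assert isinstance(prompt, FakePromptTemplate), "prompt must be a template, not the LLM"
        assert all(isinstance(cb, BaseCallbackHandler) for cb in callbacks), \
            "callbacks must be handler *instances*"
        self.llm, self.prompt, self.callbacks = llm, prompt, callbacks

prompt_template = FakePromptTemplate("Answer the question: {question}")
llm = object()  # placeholder for the streaming LLM
# The quiz line, with both blanks filled:
chain = FakeChain(llm=llm, prompt=prompt_template, callbacks=[StreamHandler()])
```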
Task 5: Fill in the blanks (hard)

Fill all three blanks to start streaming and handle tokens with a callback.

llm = OpenAI(streaming=[1], callbacks=[[2]])
chain = LLMChain(llm=llm, prompt=[3])

A. False
B. StreamHandler()
C. prompt_template
D. True

Common mistakes:
- Setting streaming to False, which disables streaming.
- Passing the callback class instead of an instance.
- Passing the LLM object as the prompt.
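The answers are `streaming=True`, `callbacks=[StreamHandler()]`, and `prompt=prompt_template`. To show the whole flow end to end, here is a self-contained simulation; `FakeStreamingLLM` is a stand-in invented for this sketch (real code would use `OpenAI(...)` and `LLMChain(...)` with a prompt template), but the token-by-token callback flow is the same:

```python
class BaseCallbackHandler:
    """Stand-in for LangChain's base handler."""

class StreamHandler(BaseCallbackHandler):
    def __init__(self):
        self.tokens = []
    def on_llm_new_token(self, token: str, **kwargs):
        self.tokens.append(token)          # keep tokens for inspection
        print(token, end="", flush=True)   # and stream them to the console

class FakeStreamingLLM:
    """Stand-in LLM: emits a canned reply token by token when streaming=True."""
    def __init__(self, streaming: bool, callbacks):
        self.streaming, self.callbacks = streaming, callbacks
    def generate(self, prompt: str) -> str:
        reply = ["Streaming", " ", "works", "!"]
        if self.streaming:                 # False would skip the callbacks entirely
            for tok in reply:
                for cb in self.callbacks:
                    cb.on_llm_new_token(tok)
        return "".join(reply)

handler = StreamHandler()
llm = FakeStreamingLLM(streaming=True, callbacks=[handler])
answer = llm.generate("Say something")
assert answer == "Streaming works!"
assert handler.tokens == ["Streaming", " ", "works", "!"]
```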