LangChain framework · ~10 mins

Streaming in production in LangChain - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to enable streaming output in a LangChain LLM call.

LangChain
from langchain.llms import OpenAI
llm = OpenAI(streaming=[1])
response = llm("Hello, how are you?")
A. None
B. False
C. "yes"
D. True
Common Mistakes
Using a string instead of a boolean for streaming parameter.
Leaving streaming as False or None, which disables streaming.
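For reference, the blank takes the boolean True (option D). The sketch below shows why the flag must be a boolean rather than a string; it uses a hypothetical FakeLLM stand-in instead of the real OpenAI class, so it runs without an API key.

```python
# Hypothetical stand-in for an LLM client; NOT the real langchain OpenAI class.
class FakeLLM:
    def __init__(self, streaming=False):
        # A string like "yes" is truthy but is not a valid value for the flag.
        if not isinstance(streaming, bool):
            raise TypeError(f"streaming must be a boolean, got {streaming!r}")
        self.streaming = streaming

    def __call__(self, prompt):
        # A real client would emit tokens incrementally when streaming=True;
        # here we only report which mode was selected.
        mode = "streamed" if self.streaming else "buffered"
        return f"[{mode}] response to: {prompt}"

llm = FakeLLM(streaming=True)  # option D: the boolean True
print(llm("Hello, how are you?"))
```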
Task 2: Fill in the blank (medium)

Complete the code to define a callback handler for streaming tokens in LangChain.

LangChain
from langchain.callbacks.base import BaseCallbackHandler

class StreamHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs):
        print([1])
A. kwargs
B. self
C. token
D. None
Common Mistakes
Printing self or kwargs instead of the token string.
Forgetting to print anything inside the method.
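The blank takes token (option C): the handler receives each token as it arrives and should print that string. The sketch below runs offline by stubbing out BaseCallbackHandler and simulating the token stream with a plain loop; in real LangChain the LLM, not your code, calls on_llm_new_token.

```python
# Stub stand-in for langchain.callbacks.base.BaseCallbackHandler,
# included so this example runs without langchain installed.
class BaseCallbackHandler:
    def on_llm_new_token(self, token: str, **kwargs):
        pass

class StreamHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs):
        # Option C: print the token itself, not self or kwargs.
        print(token, end="", flush=True)

# Simulated stream: a real streaming LLM invokes the handler once per token.
handler = StreamHandler()
for tok in ["Hello", ",", " ", "world", "!"]:
    handler.on_llm_new_token(tok)
print()
```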
Task 3: Fill in the blank (hard)

Fix the error in attaching the streaming callback handler to the OpenAI LLM.

LangChain
llm = OpenAI(streaming=True, callbacks=[[1]])
A. StreamHandler()
B. StreamHandler
C. StreamHandler.on_llm_new_token
D. StreamHandler.callback
Common Mistakes
Passing the class name without parentheses.
Passing a method instead of an instance.
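The blank takes StreamHandler() (option A): the callbacks list expects handler instances, not the class object. The sketch below shows why, using a hypothetical dispatch helper that mirrors how a callback manager fans tokens out to each handler; the stub base class stands in for LangChain's so the example runs offline.

```python
class BaseCallbackHandler:  # stub stand-in for the LangChain base class
    def on_llm_new_token(self, token: str, **kwargs):
        pass

class StreamHandler(BaseCallbackHandler):
    def __init__(self):
        self.seen = []
    def on_llm_new_token(self, token: str, **kwargs):
        self.seen.append(token)

def dispatch(callbacks, token):
    # Hypothetical helper mirroring how a callback manager delivers a token.
    for cb in callbacks:
        cb.on_llm_new_token(token)

handler = StreamHandler()
dispatch([handler], "hi")            # option A: an instance works
print(handler.seen)

try:
    dispatch([StreamHandler], "hi")  # option B: the bare class breaks
except TypeError as e:
    print("class without () fails:", type(e).__name__)
```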
Task 4: Fill in the blank (hard)

Fill both blanks to create a streaming LLM with a callback handler and run a prompt.

LangChain
llm = OpenAI(streaming=[1], callbacks=[[2]])
result = llm("Tell me a joke.")
A. True
B. False
C. StreamHandler()
D. StreamHandler
Common Mistakes
Setting streaming to False disables streaming.
Passing the class name instead of an instance for callbacks.
Task 5: Fill in the blank (hard)

Fill all three blanks to define a streaming callback handler that collects tokens, then print the full response.

LangChain
class CollectHandler(BaseCallbackHandler):
    def __init__(self):
        self.text = ""
    def on_llm_new_token(self, token: str, **kwargs):
        self.text += [1]

handler = CollectHandler()
llm = OpenAI(streaming=[2], callbacks=[[3]])
response = llm("Say something nice.")
print(handler.text)
A. token
B. " "
C. handler
D. True
Common Mistakes
Adding a space string instead of the token.
Passing the class name instead of the instance as callback.
Setting streaming to False.
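The blanks take token (option A), True (option D), and handler (option C). The sketch below exercises the whole pattern end to end; the FakeStreamingLLM and its canned reply are assumptions standing in for the real OpenAI class, so the example runs without a network call or API key.

```python
class BaseCallbackHandler:  # stub stand-in for the LangChain base class
    def on_llm_new_token(self, token: str, **kwargs):
        pass

class CollectHandler(BaseCallbackHandler):
    def __init__(self):
        self.text = ""
    def on_llm_new_token(self, token: str, **kwargs):
        self.text += token  # blank 1: the token itself, not " "

class FakeStreamingLLM:
    """Hypothetical stand-in that 'streams' a canned reply token by token."""
    def __init__(self, streaming=False, callbacks=()):
        self.streaming = streaming      # blank 2: True enables streaming
        self.callbacks = list(callbacks)
    def __call__(self, prompt):
        reply = "You are doing great!"
        if self.streaming:
            for tok in reply.split(" "):
                for cb in self.callbacks:
                    cb.on_llm_new_token(tok + " ")
        return reply

handler = CollectHandler()
llm = FakeStreamingLLM(streaming=True, callbacks=[handler])  # blank 3: instance
response = llm("Say something nice.")
print(handler.text)
```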