LangChain - LLM and Chat Model Integration
You wrote this code but get no streaming output:
llm = OpenAI()
llm("Tell me a joke.")
What is the likely fix?

Answer: Setting streaming=True when creating the LLM enables streaming output.