Langchain · How-To · Beginner · 4 min read

How to Use Langchain with Streamlit for Interactive AI Apps

To use Langchain with Streamlit, import Langchain components such as LLMChain and build a Streamlit interface that interacts with the language model. Use Streamlit widgets to collect user input and display Langchain's AI-generated responses dynamically.

📝 Syntax

Here is the basic syntax to connect Langchain with Streamlit:

  • from langchain.llms import OpenAI: Import the language model.
  • from langchain.chains import LLMChain: Import the chain to run prompts.
  • import streamlit as st: Import Streamlit for UI.
  • Create an LLM instance, then an LLMChain with a prompt template.
  • Use Streamlit input widgets to get user text.
  • Call the chain with the input and display the output with Streamlit.
```python
from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
import streamlit as st

# Initialize the language model (reads OPENAI_API_KEY from the environment)
llm = OpenAI(temperature=0.7)

# Define a prompt template
prompt = PromptTemplate(
    input_variables=["question"],
    template="Answer this question: {question}"
)

# Create a chain
chain = LLMChain(llm=llm, prompt=prompt)

# Streamlit UI
st.title("Langchain with Streamlit")
user_input = st.text_input("Ask a question:")

if user_input:
    response = chain.run(user_input)
    st.write(response)
```

💻 Example

This example shows a simple Streamlit app that uses Langchain's OpenAI LLM to answer user questions interactively.

Users type a question, and the app displays the AI's answer below.

```python
from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
import streamlit as st

# Setup OpenAI LLM
llm = OpenAI(temperature=0)

# Prompt template for question answering
prompt = PromptTemplate(
    input_variables=["question"],
    template="You are a helpful assistant. Answer this: {question}"
)

# Create the chain
chain = LLMChain(llm=llm, prompt=prompt)

# Streamlit app
st.title("Ask Langchain")
question = st.text_input("Enter your question here")

if question:
    answer = chain.run(question)
    st.write("**Answer:**", answer)
```
Output
A Streamlit web app with a text input box labeled 'Enter your question here'. When a question is typed and submitted, the AI-generated answer appears below labeled 'Answer:'.

⚠️ Common Pitfalls

  • Not setting the API key for OpenAI (or your chosen LLM) causes authentication errors.
  • Forgetting to install the required packages: langchain, streamlit, and openai.
  • Launching the script with python app.py instead of streamlit run app.py will not start the app.
  • Making blocking LLM calls inside Streamlit without caching can make the UI feel unresponsive.
  • Not checking for empty input causes the chain to run unnecessarily.
```python
import streamlit as st

# (assumes `chain` is the LLMChain built earlier)

# Wrong: no input check
question = st.text_input("Ask:")
answer = chain.run(question)  # runs even if question is empty
st.write(answer)

# Right: check input before running
question = st.text_input("Ask:")
if question:
    answer = chain.run(question)
    st.write(answer)
```
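The first and fourth pitfalls can also be guarded against up front. Below is a minimal, framework-agnostic sketch, not Langchain's or Streamlit's own API: require_api_key and cached_answer are hypothetical helper names, lru_cache stands in for Streamlit's st.cache_data decorator, and the stub return value stands in for a real chain.run call.

```python
import os
from functools import lru_cache

def require_api_key(name: str = "OPENAI_API_KEY") -> str:
    """Fail fast with a clear message instead of a cryptic auth error later."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"{name} is not set; export it before launching the app.")
    return key

@lru_cache(maxsize=128)
def cached_answer(question: str) -> str:
    """Cache answers so repeated questions don't trigger repeated LLM calls.

    In a real app, replace the body with chain.run(question) and use
    Streamlit's st.cache_data instead of lru_cache, since Streamlit reruns
    the whole script on every interaction.
    """
    return f"(stub answer for: {question})"  # stand-in for chain.run(question)
```

Calling require_api_key at the top of the script surfaces a missing key immediately, and the cache means typing the same question twice costs only one API call.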

📊 Quick Reference

| Step | Description |
| --- | --- |
| Install packages | pip install langchain streamlit openai |
| Set API key | export OPENAI_API_KEY='your_key' or set it in code |
| Import modules | from langchain.llms import OpenAI; import streamlit as st |
| Create LLM and chain | llm = OpenAI(); chain = LLMChain(llm=llm, prompt=prompt) |
| Build Streamlit UI | Use st.text_input and st.write to interact |
| Run app | streamlit run your_app.py |

✅ Key Takeaways

  • Use Langchain's LLMChain with Streamlit input/output widgets to build interactive AI apps.
  • Always check user input before calling the Langchain chain to avoid unnecessary runs.
  • Set your OpenAI API key properly so Langchain can authenticate with the OpenAI LLM.
  • Launch your app with the streamlit run command to see the interface.
  • Install all required packages and keep Langchain and Streamlit up to date.