
LangChain ecosystem (LangSmith, LangGraph, LangServe)

Introduction

The LangChain ecosystem helps you build, debug, and deploy LLM applications. LangSmith traces and evaluates your app's runs, LangGraph models multi-step workflows as graphs, and LangServe serves your chains to users as REST APIs.

You want to keep track of how your language AI app performs and debug it.
You want to see a visual map of how your app processes language tasks step-by-step.
You want to quickly share your language AI app with others through an easy web interface.
You want to improve your app by analyzing past runs and data.
You want to build and deploy language AI apps without managing complex infrastructure.
Syntax
Python
from langsmith import Client
from langgraph.graph import StateGraph
from langserve import add_routes
from fastapi import FastAPI

# LangSmith client for tracing and datasets
client = Client(api_key="your_api_key")

# LangGraph graph builder (a state schema is required)
graph = StateGraph(dict)

# LangServe: expose a runnable as a REST API
app = FastAPI()
add_routes(app, chain, path="/chain")  # `chain` is any LangChain runnable

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, port=8000)

LangSmith: Use the Client for tracing runs.

LangGraph: Build stateful multi-actor apps with graphs.

LangServe: Deploy chains as REST APIs using FastAPI.

Install with: pip install langsmith langgraph langserve fastapi uvicorn
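Conceptually, tracing just records each call's inputs, outputs, and latency and ships them to the LangSmith backend. A dependency-free sketch of what a decorator like @traceable does (the TRACES list and this decorator are illustrative stand-ins, not LangSmith's actual implementation):

```python
import functools
import time

TRACES = []  # stands in for the LangSmith backend

def traceable(fn):
    """Minimal stand-in for langsmith.traceable: record inputs, output, latency."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        TRACES.append({
            "name": fn.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "latency_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@traceable
def shout(text):
    return text.upper()

print(shout("hello"))     # HELLO
print(TRACES[0]["name"])  # shout
```

The real @traceable works the same way from the caller's perspective: the decorated function behaves unchanged, while each run is logged for later inspection.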

Examples
Using the @traceable decorator from LangSmith to automatically track function runs.
Python
from langsmith import traceable

@traceable
def my_function(text: str) -> str:
    return text.upper()

result = my_function("hello")  # traced in LangSmith when LANGCHAIN_API_KEY is set
Building a simple LangGraph workflow.
Python
from langgraph.graph import StateGraph, END

def func(state: dict) -> dict:
    return {"text": state["text"].upper()}

graph = StateGraph(state_schema=dict)
graph.add_node("node", func)
graph.set_entry_point("node")
graph.add_edge("node", END)
app = graph.compile()
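Under the hood, the compiled app walks the graph from the entry point along its edges until it reaches END, threading the state through each node. A dependency-free sketch of that loop (MiniGraph is illustrative, not LangGraph's implementation):

```python
END = "__end__"

class MiniGraph:
    """Toy illustration of StateGraph's add_node / add_edge / compile flow."""
    def __init__(self):
        self.nodes, self.edges, self.entry = {}, {}, None

    def add_node(self, name, fn):
        self.nodes[name] = fn

    def set_entry_point(self, name):
        self.entry = name

    def add_edge(self, src, dst):
        self.edges[src] = dst

    def invoke(self, state):
        # Run each node on the state, then follow the edge to the next node.
        node = self.entry
        while node != END:
            state = self.nodes[node](state)
            node = self.edges[node]
        return state

g = MiniGraph()
g.add_node("uppercase", lambda s: s.upper())
g.set_entry_point("uppercase")
g.add_edge("uppercase", END)
print(g.invoke("hello"))  # HELLO
```

Real LangGraph adds conditional edges, cycles, and checkpointing on top of this basic traversal, which is what makes it suitable for agent loops.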
Deploying a LangChain runnable with LangServe.
Python
from langserve import add_routes
from fastapi import FastAPI
from langchain_core.prompts import ChatPromptTemplate

app = FastAPI()
chain = ChatPromptTemplate.from_template("Tell me about {topic}")  # a prompt template is itself a runnable
add_routes(app, chain, path="/template")
# Run with: uvicorn main:app --reload
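Once the server is running, add_routes exposes endpoints such as POST /template/invoke, with the runnable's input wrapped under an "input" key in the JSON body. A client-side sketch (the localhost URL assumes the server above):

```python
import json

def invoke_payload(topic: str) -> dict:
    # LangServe /invoke endpoints expect the runnable's input under "input"
    return {"input": {"topic": topic}}

# With the server running, any HTTP client can call it, e.g.:
#   import requests
#   requests.post("http://localhost:8000/template/invoke",
#                 json=invoke_payload("LangChain")).json()
print(json.dumps(invoke_payload("history")))  # {"input": {"topic": "history"}}
```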
Sample Program

Demonstrates a traceable chain with LangSmith and a simple LangGraph app. Note: requires the OPENAI_API_KEY and LANGCHAIN_API_KEY environment variables (for the LLM call and for tracing). LangServe setup is shown in the Syntax section.

Python
from langsmith import traceable
from langgraph.graph import StateGraph, END
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

class SimpleChain:
    def __init__(self):
        self.llm = ChatOpenAI(model="gpt-3.5-turbo")
        self.prompt = ChatPromptTemplate.from_template("Uppercase: {input}")
        self.chain = self.prompt | self.llm

    @traceable  # traceable wraps callables, so decorate the method, not the class
    def run(self, input_text):
        return self.chain.invoke({"input": input_text}).content

# LangGraph example: a single-node graph over a plain string state
graph = StateGraph(str)
graph.add_node("uppercase", lambda x: x.upper())
graph.set_entry_point("uppercase")
graph.add_edge("uppercase", END)
app = graph.compile()

simple_chain = SimpleChain()
input_text = "hello world"
output_text = simple_chain.run(input_text)
print(f"Input: {input_text}")
print(f"Output: {output_text}")
print(app.invoke("hello langgraph"))
Important Notes

LangSmith: Observability platform for tracing, testing, monitoring LLM apps.

LangGraph: Framework for building reliable, stateful AI agents with cycles, branching.

LangServe: Makes it easy to serve LangChain runnables as REST APIs.

All integrate seamlessly with core LangChain.

Summary

LangSmith: Tracing, datasets, evals for LLM apps.

LangGraph: Graphs for complex, cyclical agent workflows.

LangServe: Production-ready deployment of chains/runnables.