The LangChain ecosystem helps you build, test, and deploy LLM applications. It provides tools to trace and evaluate runs, model your application's logic as a graph, and serve your app to users over REST.
LangChain ecosystem (LangSmith, LangGraph, LangServe)
```python
from typing import TypedDict

from fastapi import FastAPI
from langserve import add_routes
from langsmith import Client
from langgraph.graph import StateGraph
from langchain_core.prompts import ChatPromptTemplate

# LangSmith: client for tracing and run tracking
client = Client(api_key="your_api_key")

# LangGraph: a graph builder needs a state schema
class State(TypedDict):
    text: str

graph = StateGraph(State)

# LangServe: deploy a runnable as a REST API
app = FastAPI()
chain = ChatPromptTemplate.from_template("Tell me about {topic}")
add_routes(app, chain, path="/chain")

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, port=8000)
```
LangSmith: Use the Client for tracing runs.
LangGraph: Build stateful multi-actor apps with graphs.
LangServe: Deploy chains as REST APIs using FastAPI.
Install with:

```shell
pip install langsmith langgraph langserve fastapi uvicorn
```
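Tracing also needs credentials in the environment. A typical setup looks like this (the key values are placeholders):

```shell
# Enable LangSmith tracing for all LangChain calls
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY="your_langsmith_key"
# Needed by ChatOpenAI in the examples below
export OPENAI_API_KEY="your_openai_key"
```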
```python
from langsmith import traceable

@traceable
def my_function(input: str) -> str:
    return input.upper()

result = my_function("hello")  # the run is logged to LangSmith
```
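Under the hood, `@traceable` behaves like a decorator that records each call's inputs, output, and timing, then ships that record to the LangSmith backend. A minimal stdlib-only sketch of the idea (not LangSmith's actual implementation; `TRACES` stands in for the backend):

```python
import functools
import time

TRACES = []  # stand-in for the LangSmith backend

def traceable_sketch(func):
    """Record the inputs, output, and duration of each call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        TRACES.append({
            "name": func.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "ms": (time.perf_counter() - start) * 1000,
        })
        return result
    return wrapper

@traceable_sketch
def my_function(text):
    return text.upper()

print(my_function("hello"))   # HELLO
print(TRACES[0]["name"])      # my_function
```

The real decorator additionally nests child runs under their parent, which is how LangSmith reconstructs a full trace tree from nested traced calls.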
```python
from typing import TypedDict

from langgraph.graph import StateGraph, END

class State(TypedDict):
    text: str

def func(state: State) -> State:
    return state  # node logic goes here

graph = StateGraph(State)
graph.add_node("node", func)
graph.set_entry_point("node")
graph.add_edge("node", END)
app = graph.compile()
```
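To see what `compile()` and `invoke()` do conceptually, here is a toy graph runner with a similar `add_node`/`add_edge` surface. This is a deliberate simplification (single successor per node, no branching or cycles), not LangGraph's API:

```python
END = "__end__"

class ToyGraph:
    """Minimal graph runner with a LangGraph-like surface."""
    def __init__(self):
        self.nodes = {}   # name -> function(state) -> state
        self.edges = {}   # name -> next node name
        self.entry = None

    def add_node(self, name, fn):
        self.nodes[name] = fn

    def set_entry_point(self, name):
        self.entry = name

    def add_edge(self, src, dst):
        self.edges[src] = dst

    def compile(self):
        def invoke(state):
            node = self.entry
            while node != END:        # walk edges until END
                state = self.nodes[node](state)
                node = self.edges[node]
            return state
        return invoke

g = ToyGraph()
g.add_node("node", lambda s: {**s, "text": s["text"].upper()})
g.set_entry_point("node")
g.add_edge("node", END)
runner = g.compile()
print(runner({"text": "hi"}))  # {'text': 'HI'}
```

LangGraph layers conditional edges, cycles, and per-key state merging on top of this basic "follow edges until END" loop.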
```python
from fastapi import FastAPI
from langserve import add_routes
from langchain_core.prompts import ChatPromptTemplate

app = FastAPI()
chain = ChatPromptTemplate.from_template("Tell me about {topic}")
add_routes(app, chain, path="/template")

# Run with: uvicorn main:app --reload
```
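`add_routes` registers standard endpoints (such as `/invoke`, `/batch`, `/stream`, and `/playground`) under the given path; conceptually it wires HTTP request bodies to the runnable's methods. A toy sketch of just the `/invoke` wiring, with a plain dict standing in for the FastAPI router and a format string standing in for the prompt chain:

```python
# Toy illustration of the routing LangServe sets up: POST <path>/invoke
# receives {"input": ...} and responds with {"output": <result>}.
ROUTES = {}

def add_routes_sketch(routes, runnable, path):
    routes[path + "/invoke"] = lambda body: {"output": runnable(body["input"])}

# Stand-in "runnable": formats a prompt the way the template chain would
template = "Tell me about {topic}"
chain = lambda inputs: template.format(**inputs)

add_routes_sketch(ROUTES, chain, "/template")
print(ROUTES["/template/invoke"]({"input": {"topic": "graphs"}}))
# {'output': 'Tell me about graphs'}
```

The real server also handles validation, streaming, and batching, but the request/response shape (`{"input": ...}` in, `{"output": ...}` out) matches what a deployed LangServe chain expects.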
The example below demonstrates a traceable chain with LangSmith alongside a simple LangGraph app. Note: it requires the OPENAI_API_KEY and LANGCHAIN_API_KEY environment variables for tracing; the LangServe setup was shown above.
```python
from typing import TypedDict

from langsmith import traceable
from langgraph.graph import StateGraph, END
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

class SimpleChain:
    def __init__(self):
        self.llm = ChatOpenAI(model="gpt-3.5-turbo")
        self.prompt = ChatPromptTemplate.from_template("Uppercase: {input}")
        self.chain = self.prompt | self.llm

    @traceable  # traceable decorates functions/methods, not classes
    def run(self, input_text: str) -> str:
        return self.chain.invoke({"input": input_text}).content

# LangGraph example (simple): a single-node graph over a typed state
class State(TypedDict):
    text: str

graph = StateGraph(State)
graph.add_node("uppercase", lambda state: {"text": state["text"].upper()})
graph.set_entry_point("uppercase")
graph.add_edge("uppercase", END)
app = graph.compile()

simple_chain = SimpleChain()
input_text = "hello world"
output_text = simple_chain.run(input_text)
print(f"Input: {input_text}")
print(f"Output: {output_text}")
print(app.invoke({"text": "hello langgraph"}))
```
LangSmith: Observability platform for tracing, testing, monitoring LLM apps.
LangGraph: Framework for building reliable, stateful AI agents with cycles, branching.
LangServe: Makes it easy to serve LangChain runnables as REST APIs.
All integrate seamlessly with core LangChain.
In short: LangSmith for tracing, datasets, and evals; LangGraph for complex, cyclical agent workflows; LangServe for production-ready deployment of chains and runnables.