Langchain · How-To · Beginner · 3 min read

How to Use HumanMessage in LangChain: Simple Guide

In LangChain, use the HumanMessage class to represent user input when interacting with language models. Create a HumanMessage instance with the user's text and pass it to the chat model inside a list to get a response.
📝

Syntax

The HumanMessage class wraps the text input from a user in LangChain's chat framework. It is imported from langchain.schema (newer releases also expose it from langchain_core.messages). You create it by passing the user's message as a string to the constructor.

Typical usage:

  • HumanMessage(content='Your message here'): creates a message object with the user's text.
  • Pass this object inside a list when calling the chat model.
python
from langchain.schema import HumanMessage

# Create a human message
message = HumanMessage(content='Hello, how are you?')
💻

Example

This example shows how to create a HumanMessage and send it to an OpenAI chat model using LangChain. The model replies based on the user's input.

python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# Initialize the chat model
chat = ChatOpenAI(temperature=0)

# Create a human message
user_message = HumanMessage(content='What is the capital of France?')

# Send the message to the model and get a response
response = chat([user_message])

# Print the model's reply
print(response.content)
Output
Paris is the capital of France.
⚠️

Common Pitfalls

Some common mistakes when using HumanMessage include:

  • Not wrapping the user input in a HumanMessage object, which causes errors when calling the chat model.
  • Passing a plain string instead of a list of messages to the chat model.
  • Forgetting to import HumanMessage from langchain.schema.

Always ensure messages are in a list and properly wrapped.

python
from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI(temperature=0)

# Wrong: passing string directly
# response = chat('Hello')  # This will raise an error

# Right: wrap in HumanMessage and list
from langchain.schema import HumanMessage
response = chat([HumanMessage(content='Hello')])
print(response.content)
Output
Hello! How can I assist you today?
📊

Quick Reference

  • Import: from langchain.schema import HumanMessage
  • Create message: HumanMessage(content='text')
  • Send to model: chat([message])
  • Always use a list of messages.
✅

Key Takeaways

  • Always wrap user input in a HumanMessage object before sending it to LangChain chat models.
  • Pass messages as a list when calling the chat model.
  • Import HumanMessage from langchain.schema to avoid import errors.
  • HumanMessage represents the user's text in a chat conversation.
  • Using HumanMessage correctly ensures smooth interaction with language models.