LangChain - LangSmith Observability
You have two prompt versions:
- prompt_v1: 'Summarize the text: {text}'
- prompt_v2: 'Give a brief summary of: {text}'
You want to determine which prompt produces the shorter summary from the model. How can you automate this comparison in LangChain?
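One approach: run both prompt templates over the same input, then score each output by its length. In practice this could be a LangSmith experiment with a custom evaluator that returns summary length over a dataset, but the core logic can be sketched in plain Python. Here `run_model` is a hypothetical stub standing in for a real LLM call (e.g. a chat model's `.invoke()`), so the harness runs without an API key:

```python
# Sketch: compare which prompt template yields the shorter summary.
# `run_model` is a hypothetical stub; in real use, swap in an actual
# LangChain model call (e.g. ChatOpenAI(...).invoke(prompt).content).

def run_model(prompt: str) -> str:
    # Stub LLM: returns a canned summary keyed off the prompt prefix.
    canned = {
        "Summarize the text:": "A long, fairly detailed summary of the input text.",
        "Give a brief summary of:": "A short summary.",
    }
    for prefix, summary in canned.items():
        if prompt.startswith(prefix):
            return summary
    return ""

def compare_prompts(prompts: dict, text: str) -> dict:
    """Format each template with `text`, run the model, record output lengths."""
    lengths = {
        name: len(run_model(template.format(text=text)))
        for name, template in prompts.items()
    }
    shorter = min(lengths, key=lengths.get)  # prompt with the shortest summary
    return {"lengths": lengths, "shorter": shorter}

prompts = {
    "prompt_v1": "Summarize the text: {text}",
    "prompt_v2": "Give a brief summary of: {text}",
}
report = compare_prompts(prompts, "LangSmith records traces for every run.")
print(report["shorter"])  # with the stub above, prompt_v2 wins
```

The same length metric plugs directly into a LangSmith custom evaluator, so each experiment run records the summary length as a score and the two prompt versions can be compared side by side in the LangSmith UI.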
