LangChain - LLM and Chat Model Integration
You want to create a LangChain `ChatOpenAI` instance that uses the "gpt-4" model with a temperature of 0.7 and a maximum output of 100 tokens. Which code snippet correctly sets all of these parameters?