Complete the code to enable simple in-memory caching in LangChain.
from langchain.cache import [1]

cache = [1]()
InMemoryCache is the built-in simple cache for LangChain that stores data in memory to reduce repeated calls.
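To see the behavior such a cache provides without depending on LangChain itself, here is a minimal dict-based sketch. `SimpleMemoryCache` is a hypothetical stand-in for illustration, not LangChain's `InMemoryCache` class:

```python
# Minimal sketch of what an in-memory cache does: store results in a dict
# so repeated calls with the same key skip the expensive computation.
# SimpleMemoryCache is a hypothetical stand-in, not LangChain's InMemoryCache.
class SimpleMemoryCache:
    def __init__(self):
        self._store = {}  # maps a key (e.g. a prompt string) to a cached result

    def lookup(self, key):
        # Return the cached value, or None on a cache miss
        return self._store.get(key)

    def update(self, key, value):
        # Record a result so future lookups hit the cache
        self._store[key] = value

cache = SimpleMemoryCache()
cache.update("What is 2+2?", "4")
print(cache.lookup("What is 2+2?"))  # -> 4
```

The same lookup/update shape is what lets a caching layer short-circuit repeated LLM calls.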
Complete the code to set the cache for LangChain's global cache system.
import langchain
from langchain.cache import InMemoryCache

langchain.[1] = InMemoryCache()

The global cache instance is set by assigning to langchain.llm_cache.
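The idea behind a global cache slot can be sketched in plain Python: a module-level attribute holds the active cache, and callers consult it before doing real work. The names below (`llm_cache`, `cached_call`) are illustrative, not LangChain's API surface:

```python
# Sketch of the global-cache pattern: a module-level slot holds the active
# cache; assigning a cache object to it enables caching for every caller.
# llm_cache and cached_call are illustrative names, not LangChain's API.
llm_cache = None  # no cache configured yet

def cached_call(prompt, compute):
    # Check the global cache before invoking the expensive computation.
    if llm_cache is not None:
        hit = llm_cache.get(prompt)
        if hit is not None:
            return hit
    result = compute(prompt)
    if llm_cache is not None:
        llm_cache[prompt] = result
    return result

# Enable caching by assigning to the global slot, analogous to setting
# LangChain's global cache attribute.
llm_cache = {}
```

After the assignment, repeated calls with the same prompt are served from the dict instead of recomputing.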
Fix the error in the code to use RedisCache properly for caching.
import redis
from langchain.cache import RedisCache

cache = RedisCache(redis_=redis.Redis.from_url('[1]'))

RedisCache wraps a Redis client rather than taking a URL directly; the client is built from a connection URL starting with 'redis://', including host and port (e.g. 'redis://localhost:6379').
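The expected URL shape can be checked with the standard library alone; this sketch (the helper `is_valid_redis_url` is hypothetical, introduced here for illustration) parses a candidate URL and confirms the scheme, host, and port are present:

```python
from urllib.parse import urlparse

def is_valid_redis_url(url):
    # A Redis connection URL uses the 'redis://' scheme followed by
    # host and port, e.g. 'redis://localhost:6379'.
    # is_valid_redis_url is an illustrative helper, not a library function.
    parsed = urlparse(url)
    return parsed.scheme == "redis" and bool(parsed.hostname) and parsed.port is not None

print(is_valid_redis_url("redis://localhost:6379"))  # -> True
print(is_valid_redis_url("localhost:6379"))          # -> False (missing scheme)
```

Validating the URL up front gives a clearer error than letting a client constructor fail later.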
Fill both blanks to create a cache key function that uses the input text length and a prefix.
def cache_key_function(input_text):
    return f"[1]_[2]"
The function returns a string combining a prefix and the length of the input text to create a cache key.
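One plausible way to fill the blanks, assuming a fixed prefix string such as "cache" (the particular prefix is an assumption, not given in the question):

```python
def cache_key_function(input_text):
    # Combine a fixed prefix (assumed here to be "cache") with the
    # length of the input text to form the cache key.
    return f"cache_{len(input_text)}"

print(cache_key_function("hello"))  # -> cache_5
```

Note that a length-based key collides for different inputs of equal length, so it only suits caching where that is acceptable.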
Fill all three blanks to create a dictionary comprehension caching only results with cost below a threshold.
costs = {'a': 10, 'b': 5, 'c': 20}
cached = {k: v for k, v in costs.items() if v [1] [2] and k != [3]}

This comprehension caches items with cost less than 15 and excludes key 'c'.
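With the blanks filled per the explanation (operator `<`, threshold 15, excluded key 'c'), the comprehension evaluates as follows:

```python
costs = {'a': 10, 'b': 5, 'c': 20}

# Keep only entries cheaper than the threshold, and skip key 'c' explicitly.
cached = {k: v for k, v in costs.items() if v < 15 and k != 'c'}
print(cached)  # -> {'a': 10, 'b': 5}
```

Here 'c' would already be filtered by its cost of 20; the `k != 'c'` clause makes the exclusion explicit regardless of cost.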