set_llm_cache
- langchain_core.globals.set_llm_cache(value: BaseCache | None) → None
Set a new LLM cache, overwriting the previous value, if any.
- Parameters:
value (BaseCache | None) – The new LLM cache to use. If None, the LLM cache is disabled.
- Return type:
None