update_cache
- langchain_core.language_models.llms.update_cache(cache: BaseCache | bool | None, existing_prompts: dict[int, list], llm_string: str, missing_prompt_idxs: list[int], new_results: LLMResult, prompts: list[str]) → dict | None
Update the cache and get the LLM output.
- Parameters:
  - cache (BaseCache | bool | None) – Cache object, True to use the globally configured cache, or False/None to skip caching.
  - existing_prompts (dict[int, list]) – Dictionary of prompts already resolved from the cache, keyed by prompt index.
  - llm_string (str) – String representation of the LLM's parameters, used as part of the cache key.
  - missing_prompt_idxs (list[int]) – Indexes of the prompts that were not found in the cache.
  - new_results (LLMResult) – New results returned by the LLM for the missing prompts.
  - prompts (list[str]) – List of all prompts.
- Returns:
LLM output.
- Raises:
ValueError – If cache is True but no global cache has been set.
- Return type:
dict | None
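
A minimal sketch of how this function might be called after a batch generation, assuming an InMemoryCache and a hand-built LLMResult; the prompt strings, the llm_string value, and the generation texts are illustrative placeholders rather than output from a real model call.

```python
from langchain_core.caches import InMemoryCache
from langchain_core.language_models.llms import update_cache
from langchain_core.outputs import Generation, LLMResult

cache = InMemoryCache()
prompts = ["Tell me a joke", "Tell me a fact"]

# Assume neither prompt was found in the cache, so both indexes are
# "missing" and the model was called for both.
existing_prompts: dict[int, list] = {}
missing_prompt_idxs = [0, 1]
new_results = LLMResult(
    generations=[[Generation(text="A joke.")], [Generation(text="A fact.")]],
    llm_output={"token_usage": {}},
)

# Writes the new generations into both the cache and existing_prompts,
# then returns the LLM output from new_results.
llm_output = update_cache(
    cache=cache,
    existing_prompts=existing_prompts,
    llm_string="fake-llm-params",  # illustrative cache-key component
    missing_prompt_idxs=missing_prompt_idxs,
    new_results=new_results,
    prompts=prompts,
)
print(llm_output)  # {'token_usage': {}}
```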