LLMInputOutputAdapter

class langchain_aws.llms.bedrock.LLMInputOutputAdapter

Adapter class to prepare the inputs from LangChain into the format that the LLM model expects.

It also provides helper functions to extract the generated text from the model response.

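A minimal round-trip sketch, assuming an Anthropic Claude model and the boto3 bedrock-runtime client: prepare_input builds the provider-specific request body, and prepare_output parses the raw invoke_model response. The model ID and region are placeholders, and the keys of the returned dict vary by provider and library version.

import json

import boto3

from langchain_aws.llms.bedrock import LLMInputOutputAdapter

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Build the Anthropic messages-API request body from LangChain-style inputs.
body = LLMInputOutputAdapter.prepare_input(
    provider="anthropic",
    model_kwargs={},
    messages=[{"role": "user", "content": "Give me one fun fact about Oregon."}],
    max_tokens=256,
    temperature=0.2,
)

# Placeholder model ID; use any Claude model enabled in your account.
raw = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)

# Parse the raw Bedrock response into a plain dict (generated text plus
# usage / stop-reason metadata; exact keys vary by provider).
result = LLMInputOutputAdapter.prepare_output(provider="anthropic", response=raw)
print(result)
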
Attributes

provider_to_output_key_map

Methods

aprepare_output_stream(provider, response[, ...])

prepare_input(provider, model_kwargs[, ...])

prepare_output(provider, response)

prepare_output_stream(provider, response[, ...])

classmethod aprepare_output_stream(
provider: str,
response: Any,
stop: List[str] | None = None,
messages_api: bool = False,
coerce_content_to_string: bool = False,
) → AsyncIterator[GenerationChunk | AIMessageChunk]
Parameters:
  • provider (str)

  • response (Any)

  • stop (List[str] | None)

  • messages_api (bool)

  • coerce_content_to_string (bool)

Return type:

AsyncIterator[GenerationChunk | AIMessageChunk]

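A hedged sketch of consuming the async iterator. Obtaining the streaming response requires an async Bedrock client (for example via aioboto3), which is not shown here; the coroutine only illustrates how the yielded chunks are handled. With messages_api=False the chunks are GenerationChunks exposing .text; with messages_api=True they are AIMessageChunks exposing .content.

from langchain_aws.llms.bedrock import LLMInputOutputAdapter


async def print_stream(response) -> None:
    # `response` is assumed to be the raw result of an asynchronous
    # invoke_model_with_response_stream call for a legacy prompt-style request.
    async for chunk in LLMInputOutputAdapter.aprepare_output_stream(
        provider="anthropic",
        response=response,
        stop=["\n\nHuman:"],
        messages_api=False,
    ):
        # GenerationChunk.text carries the incremental completion text.
        print(chunk.text, end="", flush=True)
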
classmethod prepare_input(
provider: str,
model_kwargs: Dict[str, Any],
prompt: str | None = None,
system: str | None = None,
messages: List[Dict] | None = None,
tools: List[AnthropicTool] | None = None,
*,
max_tokens: int | None = None,
temperature: float | None = None,
) → Dict[str, Any]
Parameters:
  • provider (str)

  • model_kwargs (Dict[str, Any])

  • prompt (str | None)

  • system (str | None)

  • messages (List[Dict] | None)

  • tools (List[AnthropicTool] | None)

  • max_tokens (int | None)

  • temperature (float | None)

Return type:

Dict[str, Any]

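prepare_input also supports the older single-prompt path. The sketch below builds a text-completion request body for the Amazon Titan provider; the keys of the resulting dict are provider-specific, so treat the printed structure as illustrative rather than a contract.

from langchain_aws.llms.bedrock import LLMInputOutputAdapter

# Classic single-prompt request; model_kwargs are passed through into the
# provider-specific generation config.
body = LLMInputOutputAdapter.prepare_input(
    provider="amazon",
    model_kwargs={"temperature": 0.5},
    prompt="Summarize the water cycle in two sentences.",
)
print(body)  # Provider-specific dict, ready for json.dumps + invoke_model.
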
classmethod prepare_output(
provider: str,
response: Any,
) → dict
Parameters:
  • provider (str)

  • response (Any)

Return type:

dict

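A minimal parsing sketch, again with a Titan text model: the raw invoke_model response is handed straight to prepare_output, which returns a plain dict bundling the generated text with response metadata. The model ID is a placeholder, and the exact dict keys are an implementation detail that can change between releases, so inspect them rather than hard-coding.

import json

import boto3

from langchain_aws.llms.bedrock import LLMInputOutputAdapter

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = LLMInputOutputAdapter.prepare_input(
    provider="amazon", model_kwargs={}, prompt="Name three rivers in Europe."
)

# Placeholder Titan model ID; substitute one enabled in your account.
raw = client.invoke_model(
    modelId="amazon.titan-text-express-v1", body=json.dumps(body)
)

result = LLMInputOutputAdapter.prepare_output(provider="amazon", response=raw)
print(result)  # Plain dict with the generated text and response metadata.
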
classmethod prepare_output_stream(
provider: str,
response: Any,
stop: List[str] | None = None,
messages_api: bool = False,
coerce_content_to_string: bool = False,
) → Iterator[GenerationChunk | AIMessageChunk]
Parameters:
  • provider (str)

  • response (Any)

  • stop (List[str] | None)

  • messages_api (bool)

  • coerce_content_to_string (bool)

Return type:

Iterator[GenerationChunk | AIMessageChunk]
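
The synchronous counterpart pairs naturally with invoke_model_with_response_stream. A minimal sketch, assuming an Anthropic messages-API request; the model ID is a placeholder, and the behaviour of coerce_content_to_string (keeping .content as plain text rather than a list of content blocks) is inferred from the flag name.

import json

import boto3

from langchain_aws.llms.bedrock import LLMInputOutputAdapter

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = LLMInputOutputAdapter.prepare_input(
    provider="anthropic",
    model_kwargs={},
    messages=[{"role": "user", "content": "Write a haiku about rain."}],
    max_tokens=128,
)

# Placeholder model ID; any streaming-capable Claude model works here.
response = client.invoke_model_with_response_stream(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)

# messages_api=True yields AIMessageChunks; coerce_content_to_string=True is
# assumed to keep .content a plain string instead of a list of content blocks.
for chunk in LLMInputOutputAdapter.prepare_output_stream(
    provider="anthropic",
    response=response,
    messages_api=True,
    coerce_content_to_string=True,
):
    print(chunk.content, end="", flush=True)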