class langgraph_agent_toolkit.core.models.factory.CompletionModelFactory[source]

Bases: object

Factory for creating model instances.

static init_chat_model(model=None, *, model_provider=None, configurable_fields=None, config_prefix=None, **kwargs)[source]

Initialize and return a chat model instance.

Parameters:
Parameters:
  • model (str | None)

  • model_provider (str | None)

  • configurable_fields (Literal['any'] | List[str] | Tuple[str, ...] | None)

  • config_prefix (str | None)

  • kwargs (Any)

Return type:

BaseChatModel | _ConfigurableModel
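This mirrors LangChain's init_chat_model helper, where the model name and provider can be given separately or combined in a single "provider:model" string. A minimal sketch of splitting such a combined spec into the (model, model_provider) pair (the helper name and the combined-string format are assumptions, not guaranteed by this API):

```python
# Hypothetical helper: split a combined "provider:model" spec into the
# (model, model_provider) pair that init_chat_model accepts separately.
def split_model_spec(spec: str):
    provider, sep, name = spec.partition(":")
    # No separator: the whole string is the model name and the provider
    # must be inferred or passed explicitly via model_provider.
    return (name, provider) if sep else (spec, None)

print(split_model_spec("openai:gpt-4o"))  # provider given inline
print(split_model_spec("gpt-4o"))         # provider left for inference
```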

static create(model_provider, model_name=None, configurable_fields=None, config_prefix=None, model_parameter_values=None, **kwargs)[source]

Create and return a model instance.

Parameters:
  • model_provider (ModelProvider) – The model provider to use. This should be one of the supported model providers.

  • model_name (str | None) – The name of the model to use. If not provided, the default model name will be used.

  • configurable_fields (Literal['any'] | List[str] | Tuple[str, ...] | None) – The fields that are configurable. If not provided, the default fields will be used.

  • config_prefix (str | None) – The prefix to use for the configuration. If not provided, the default prefix will be used.

  • model_parameter_values (Tuple[Tuple[str, Any], ...] | None) – The values for the model parameters as a tuple of (key, value) pairs. If not provided, the default values will be used.

  • **kwargs (Any) – Additional keyword arguments to pass to the model.

Returns:

An instance of the requested model

Raises:

ValueError – If the requested model is not supported

Return type:

FakeToolModel | _ConfigurableModel | BaseChatModel
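The model_parameter_values format described above can be sketched in plain Python; the parameter names below are illustrative, not defaults of the library:

```python
# model_parameter_values is a tuple of (key, value) pairs; a factory can
# merge it with **kwargs overrides via ordinary dict operations.
model_parameter_values = (
    ("temperature", 0.7),
    ("max_tokens", 512),
)
params = dict(model_parameter_values)  # tuple of pairs -> mapping
params.update(top_p=0.9)               # extra **kwargs win on conflict
print(params)
```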

classmethod get_model_from_config(config, **override_params)[source]

Create a model from a configuration dictionary.

Parameters:
  • config (Dict[str, Any]) – Model configuration dictionary

  • **override_params – Parameters to override from the configuration

Returns:

A BaseChatModel instance

Return type:

BaseChatModel

Example

>>> config = {"provider": "openai", "name": "gpt-4", "temperature": 0.7}
>>> model = CompletionModelFactory.get_model_from_config(config)
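A plausible reading of how **override_params interact with the config dict, sketched with plain dict unpacking; the later-wins merge order is an assumption, not confirmed by this reference:

```python
# Config dict as in the example above, with one value overridden.
config = {"provider": "openai", "name": "gpt-4", "temperature": 0.7}
override_params = {"temperature": 0.2}

# Dict-unpacking merge: keys in override_params take precedence.
effective = {**config, **override_params}
print(effective)
```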

class langgraph_agent_toolkit.core.models.factory.EmbeddingModelFactory[source]

Bases: object

Factory for creating embedding model instances.

static create(model_provider, model_name=None, model_parameter_values=None, **kwargs)[source]

Create and return an embedding model instance.

Parameters:
  • model_provider (ModelProvider) – The model provider to use. This should be one of the supported model providers.

  • model_name (str | None) – The name of the model to use. If not provided, an error will be raised.

  • model_parameter_values (Tuple[Tuple[str, Any], ...] | None) – The values for the model parameters as a tuple of (key, value) pairs. If not provided, an empty dict will be used.

  • **kwargs (Any) – Additional keyword arguments to pass to the model.

Returns:

An instance of the requested embedding model

Raises:

ValueError – If the requested model is not supported or model_name is not provided

Return type:

Embeddings

Examples

>>> model = EmbeddingModelFactory.create(
...     model_provider=ModelProvider.OPENAI,
...     model_name="text-embedding-3-small",
...     openai_api_key="sk-..."
... )

classmethod get_model_from_config(config, **override_params)[source]

Create an embedding model from a configuration dictionary.

Parameters:
  • config (Dict[str, Any]) – Model configuration dictionary

  • **override_params – Parameters to override from the configuration

Returns:

An Embeddings instance

Return type:

Embeddings

Example

>>> config = {"provider": "openai", "name": "text-embedding-3-small", "api_key": "sk-..."}
>>> model = EmbeddingModelFactory.get_model_from_config(config)