- class langgraph_agent_toolkit.core.models.factory.ModelFactory[source]
Bases: object
Factory for creating model instances.
- static init_chat_model(model=None, *, model_provider=None, configurable_fields=None, config_prefix=None, **kwargs)[source]
Initialize a chat model instance.
- static create(model_provider, model_name=None, configurable_fields=None, config_prefix=None, model_parameter_values=None, **kwargs)[source]
Create and return a model instance.
- Parameters:
model_provider (ModelProvider) – The model provider to use. This should be one of the supported model providers.
model_name (str | None) – The name of the model to use. If not provided, the default model name will be used.
configurable_fields (Literal['any'] | List[str] | Tuple[str, ...] | None) – The fields that are configurable. If not provided, the default fields will be used.
config_prefix (str | None) – The prefix to use for the configuration. If not provided, the default prefix will be used.
model_parameter_values (Tuple[Tuple[str, Any], ...] | None) – The values for the model parameters as a tuple of (key, value) pairs. If not provided, the default values will be used.
**kwargs (Any) – Additional keyword arguments to pass to the model.
- Returns:
An instance of the requested model.
- Raises:
ValueError – If the requested model is not supported.
- Return type:
FakeToolModel | _ConfigurableModel | BaseChatModel
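The dispatch-by-provider pattern behind create() can be sketched as follows. This is a minimal, self-contained illustration, not the toolkit's actual implementation: FakeChatModel and the _PROVIDERS registry are hypothetical stand-ins for the real model classes.

```python
# Illustrative sketch of a provider-dispatch factory, assuming a simple
# registry mapping provider names to model classes. FakeChatModel and
# _PROVIDERS are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import Any, Dict, Optional, Tuple


@dataclass
class FakeChatModel:
    model_name: str
    params: Dict[str, Any] = field(default_factory=dict)


# Hypothetical registry of supported providers.
_PROVIDERS = {"openai": FakeChatModel, "anthropic": FakeChatModel}


def create(
    model_provider: str,
    model_name: Optional[str] = None,
    model_parameter_values: Optional[Tuple[Tuple[str, Any], ...]] = None,
    **kwargs: Any,
) -> FakeChatModel:
    """Look up the provider's model class and instantiate it."""
    try:
        model_cls = _PROVIDERS[model_provider]
    except KeyError:
        # Mirrors the documented ValueError for unsupported providers.
        raise ValueError(f"Unsupported model provider: {model_provider}")
    # Merge tuple-of-pairs parameter values with extra keyword arguments.
    params = dict(model_parameter_values or ())
    params.update(kwargs)
    return model_cls(model_name or "default-model", params)
```

Usage follows the same shape as the documented signature, e.g. `create("openai", "gpt-4", temperature=0.7)`.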
- classmethod get_model_from_config(config, **override_params)[source]
Create a model from a configuration dictionary.
- Parameters:
config (dict) – Configuration dictionary describing the model (e.g. provider, name, and model parameters).
**override_params (Any) – Keyword arguments that override values from the configuration.
- Returns:
A BaseChatModel instance
- Return type:
BaseChatModel
Example
>>> config = {"provider": "openai", "name": "gpt-4", "temperature": 0.7}
>>> model = ModelFactory.get_model_from_config(config)