langgraph_agent_toolkit.agents.components.creators.create_react_agent.create_react_agent(model, tools, *, prompt=None, response_format=None, pre_model_hook=None, state_schema=None, config_schema=None, checkpointer=None, store=None, interrupt_before=None, interrupt_after=None, debug=False, version='v1', name=None, immediate_step_threshold=5, immediate_generation_prompt=None)[source]

Create a graph that works with a chat model that uses tool calling, with an additional router node.

This implementation extends the original create_react_agent by adding a router node that checks the number of remaining steps: it routes to the agent node under normal conditions, or to an immediate generation node when the remaining steps fall below a threshold.
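The routing decision can be sketched in plain Python (a minimal illustration; the node names "agent" and "immediate_generation" are assumed labels, not necessarily the toolkit's actual node identifiers):

```python
def route_by_remaining_steps(remaining_steps: int, threshold: int = 5) -> str:
    """Pick the next node based on how many steps the graph may still take.

    When few steps remain, jump straight to immediate generation so the
    model produces a direct final answer instead of starting another
    tool-calling loop it may not have the budget to finish.
    """
    if remaining_steps < threshold:
        return "immediate_generation"  # force a direct answer
    return "agent"  # continue the normal ReAct loop


# With the default threshold of 5:
# route_by_remaining_steps(10) -> "agent"
# route_by_remaining_steps(3)  -> "immediate_generation"
```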

Parameters:
  • model (str | Runnable[PromptValue | str | Sequence[BaseMessage | list[str] | tuple[str, str] | str | dict[str, Any]], BaseMessage | str]) – A LangChain chat model that supports tool calling, or a string identifier for one.

  • tools (Sequence[BaseTool | Callable] | ToolNode) – A list of tools or a ToolNode instance.

  • prompt (SystemMessage | str | Callable[[Any], PromptValue | str | Sequence[BaseMessage | list[str] | tuple[str, str] | str | dict[str, Any]]] | Runnable[Any, PromptValue | str | Sequence[BaseMessage | list[str] | tuple[str, str] | str | dict[str, Any]]] | None) – An optional prompt for the LLM.

  • response_format (dict | type[BaseModel] | tuple[str, dict | type[BaseModel]] | None) – An optional schema for the final agent output.

  • pre_model_hook (Runnable[Input, Output] | Callable[[Input], Output] | Callable[[Input], Awaitable[Output]] | Callable[[Iterator[Input]], Iterator[Output]] | Callable[[AsyncIterator[Input]], AsyncIterator[Output]] | _RunnableCallableSync[Input, Output] | _RunnableCallableAsync[Input, Output] | _RunnableCallableIterator[Input, Output] | _RunnableCallableAsyncIterator[Input, Output] | Mapping[str, Any] | _RunnableWithWriter[Input, Output] | _RunnableWithStore[Input, Output] | _RunnableWithWriterStore[Input, Output] | _RunnableWithConfigWriter[Input, Output] | _RunnableWithConfigStore[Input, Output] | _RunnableWithConfigWriterStore[Input, Output] | None) – An optional node to add before the agent node (e.g., for message trimming or summarization).

  • state_schema (Type[Any] | None) – An optional state schema that defines graph state.

  • config_schema (Type[Any] | None) – An optional schema for configuration.

  • checkpointer (None | bool | BaseCheckpointSaver) – An optional checkpoint saver for persisting graph state.

  • store (BaseStore | None) – An optional store object.

  • interrupt_before (list[str] | None) – An optional list of node names to interrupt before.

  • interrupt_after (list[str] | None) – An optional list of node names to interrupt after.

  • debug (bool) – A flag indicating whether to enable debug mode.

  • version (Literal['v1', 'v2']) – Determines the version of the graph to create ('v1' or 'v2').

  • name (str | None) – An optional name for the CompiledStateGraph.

  • immediate_step_threshold (int) – Number of remaining steps below which the router routes to immediate generation instead of the agent.

  • immediate_generation_prompt (str | None) – Optional custom prompt for immediate generation mode. If not provided, a default prompt is used that instructs the model to generate a direct answer.

Returns:

A compiled LangChain runnable that can be used for chat interactions.

Return type:

CompiledGraph