langgraph_agent_toolkit.client package
- class langgraph_agent_toolkit.client.AgentClient(base_url='http://0.0.0.0', agent=None, timeout=None, get_info=True, verify=False)[source]
Bases:
object
Client for interacting with the agent service.
Initialize the client.
- Parameters:
base_url (str) – The base URL of the agent service.
agent (str) – The name of the default agent to use.
timeout (float, optional) – The timeout for requests.
get_info (bool, optional) – Whether to fetch agent information on init. Default: True
verify (bool, optional) – Whether to verify the agent information. Default: False
- __init__(base_url='http://0.0.0.0', agent=None, timeout=None, get_info=True, verify=False)[source]
Initialize the client.
- Parameters:
base_url (str) – The base URL of the agent service.
agent (str) – The name of the default agent to use.
timeout (float, optional) – The timeout for requests.
get_info (bool, optional) – Whether to fetch agent information on init. Default: True
verify (bool, optional) – Whether to verify the agent information. Default: False
- Return type:
None
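A minimal construction-and-invoke sketch; the URL, agent name, and input schema below are illustrative assumptions, not values fixed by this API:

```python
from langgraph_agent_toolkit.client import AgentClient

# Point the client at a running agent service (URL and agent name are examples).
client = AgentClient(
    base_url="http://localhost:8080",
    agent="chatbot",
    timeout=30.0,
)

# Invoke the default agent; only the final message is returned.
response = client.invoke({"message": "Tell me a joke?"})
print(response)
```
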
- async aadd_messages(messages, thread_id=None, user_id=None)[source]
Add messages to chat history asynchronously.
- Parameters:
messages – The messages to add to the chat history.
thread_id (str, optional) – Thread ID for the conversation.
user_id (str, optional) – User ID for identifying the user.
- async aclear_history(thread_id=None, user_id=None)[source]
Clear chat history asynchronously.
- Parameters:
thread_id (str, optional) – Thread ID for the conversation to clear.
user_id (str, optional) – User ID for identifying the user.
- async acreate_feedback(run_id, key, score, kwargs={}, user_id=None)[source]
Create a feedback record for a run.
- Parameters:
run_id (str) – The ID of the run to attach feedback to.
key (str) – The feedback key.
score (float) – The feedback score.
kwargs (dict, optional) – Additional fields for the feedback record. Default: {}
user_id (str, optional) – User ID for identifying the user.
- add_messages(messages, thread_id=None, user_id=None)[source]
Add messages to chat history.
- Parameters:
messages – The messages to add to the chat history.
thread_id (str, optional) – Thread ID for the conversation.
user_id (str, optional) – User ID for identifying the user.
- async aget_history(thread_id, user_id=None)[source]
Get chat history asynchronously.
- Parameters:
thread_id (str) – Thread ID for the conversation.
user_id (str, optional) – User ID for identifying the user.
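The history helpers compose as sketched below; the thread ID and the message shape passed to aadd_messages are assumptions, since the expected message type is not stated here:

```python
import asyncio

from langgraph_agent_toolkit.client import AgentClient

async def main():
    client = AgentClient("http://localhost:8080", agent="chatbot")

    # Fetch the conversation recorded under a thread ID.
    history = await client.aget_history(thread_id="thread-123")
    print(history)

    # Append messages to the thread (message shape is a guess), then clear it.
    await client.aadd_messages(
        [{"type": "human", "content": "Hi"}], thread_id="thread-123"
    )
    await client.aclear_history(thread_id="thread-123")

asyncio.run(main())
```
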
- async ainvoke(input, model_name=None, model_provider=None, model_config_key=None, thread_id=None, user_id=None, agent_config=None, recursion_limit=None)[source]
Invoke the agent asynchronously. Only the final message is returned.
- Parameters:
input (Dict[str, Any]) – The input to send to the agent
model_name (str, optional) – LLM model to use for the agent
model_provider (str, optional) – LLM model provider to use for the agent
model_config_key (str, optional) – Key for predefined model configuration
thread_id (str, optional) – Thread ID for continuing a conversation
user_id (str, optional) – User ID for identifying the user
agent_config (dict[str, Any], optional) – Additional configuration to pass through to the agent
recursion_limit (int, optional) – Recursion limit for the agent
- Returns:
The response from the agent
- Return type:
ChatMessage
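An async invocation sketch; the URL, agent name, and input schema are assumed for illustration:

```python
import asyncio

from langgraph_agent_toolkit.client import AgentClient

async def main():
    client = AgentClient("http://localhost:8080", agent="chatbot")
    # Reuse a thread_id across calls to continue the same conversation.
    response = await client.ainvoke(
        {"message": "What is LangGraph?"},
        thread_id="thread-123",
    )
    print(response)

asyncio.run(main())
```
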
- async astream(input, model_name=None, model_provider=None, model_config_key=None, thread_id=None, user_id=None, agent_config=None, recursion_limit=None, stream_tokens=True)[source]
Stream the agent’s response asynchronously.
Each intermediate message of the agent process is yielded as a ChatMessage. If stream_tokens is True (the default value), the response will also yield content tokens from streaming models as they are generated.
- Parameters:
input (Dict[str, Any]) – The input to send to the agent
model_name (str, optional) – LLM model to use for the agent
model_provider (str, optional) – LLM model provider to use for the agent
model_config_key (str, optional) – Key for predefined model configuration
thread_id (str, optional) – Thread ID for continuing a conversation
user_id (str, optional) – User ID for identifying the user
agent_config (dict[str, Any], optional) – Additional configuration to pass through to the agent
recursion_limit (int, optional) – Recursion limit for the agent
stream_tokens (bool, optional) – Stream tokens as they are generated. Default: True
- Returns:
The response from the agent
- Return type:
AsyncGenerator[ChatMessage | str, None]
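When consuming the stream, str items are token deltas and everything else is a full ChatMessage. A small helper to reassemble them (the commented-out usage and its input schema are assumptions):

```python
import asyncio

async def collect_text(events):
    """Concatenate streamed str tokens; collect the remaining ChatMessage items."""
    text, messages = "", []
    async for item in events:
        if isinstance(item, str):
            text += item
        else:
            messages.append(item)
    return text, messages

# With a live service you would pass the astream generator directly:
# client = AgentClient("http://localhost:8080", agent="chatbot")
# text, messages = asyncio.run(collect_text(client.astream({"message": "Hi"})))
```
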
- clear_history(thread_id=None, user_id=None)[source]
Clear chat history.
- Parameters:
thread_id (str, optional) – Thread ID for the conversation to clear.
user_id (str, optional) – User ID for identifying the user.
- create_feedback(run_id, key, score, kwargs={}, user_id=None)[source]
Create a feedback record for a run.
- Parameters:
run_id (str) – The ID of the run to attach feedback to.
key (str) – The feedback key.
score (float) – The feedback score.
kwargs (dict, optional) – Additional fields for the feedback record. Default: {}
user_id (str, optional) – User ID for identifying the user.
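A feedback sketch; the key name, score scale, and placeholder run ID are illustrative, since their semantics depend on the feedback backend:

```python
from langgraph_agent_toolkit.client import AgentClient

client = AgentClient("http://localhost:8080", agent="chatbot")

# Attach a score to a previous run (run ID is a placeholder).
client.create_feedback(
    run_id="<run-id-from-a-previous-call>",
    key="human-feedback-stars",
    score=0.8,
)
```
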
- invoke(input, model_name=None, model_provider=None, model_config_key=None, thread_id=None, user_id=None, agent_config=None, recursion_limit=None)[source]
Invoke the agent synchronously. Only the final message is returned.
- Parameters:
input (Dict[str, Any]) – The input to send to the agent
model_name (str, optional) – LLM model to use for the agent
model_provider (str, optional) – LLM model provider to use for the agent
model_config_key (str, optional) – Key for predefined model configuration
thread_id (str, optional) – Thread ID for continuing a conversation
user_id (str, optional) – User ID for identifying the user
agent_config (dict[str, Any], optional) – Additional configuration to pass through to the agent
recursion_limit (int, optional) – Recursion limit for the agent
- Returns:
The response from the agent
- Return type:
ChatMessage
- stream(input, model_name=None, model_provider=None, model_config_key=None, thread_id=None, user_id=None, agent_config=None, recursion_limit=None, stream_tokens=True)[source]
Stream the agent’s response synchronously.
Each intermediate message of the agent process is yielded as a ChatMessage. If stream_tokens is True (the default value), the response will also yield content tokens from streaming models as they are generated.
- Parameters:
input (Dict[str, Any]) – The input to send to the agent
model_name (str, optional) – LLM model to use for the agent
model_provider (str, optional) – LLM model provider to use for the agent
model_config_key (str, optional) – Key for predefined model configuration
thread_id (str, optional) – Thread ID for continuing a conversation
user_id (str, optional) – User ID for identifying the user
agent_config (dict[str, Any], optional) – Additional configuration to pass through to the agent
recursion_limit (int, optional) – Recursion limit for the agent
stream_tokens (bool, optional) – Stream tokens as they are generated. Default: True
- Returns:
The response from the agent
- Return type:
Generator[ChatMessage | str, None, None]
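A synchronous consumer can print token deltas as they arrive and keep the full ChatMessage items for later; the commented-out client usage is an assumption:

```python
def render_stream(events):
    """Print str tokens as they arrive; return the non-str (ChatMessage) items."""
    messages = []
    for item in events:
        if isinstance(item, str):
            print(item, end="", flush=True)
        else:
            messages.append(item)
    print()
    return messages

# With a live service you would feed it the stream generator, e.g.:
# client = AgentClient("http://localhost:8080", agent="chatbot")
# messages = render_stream(client.stream({"message": "Hi"}))
```
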
- exception langgraph_agent_toolkit.client.AgentClientError[source]
Bases:
Exception
- __init__(*args, **kwargs)
- add_note()
Exception.add_note(note) – add a note to the exception
- args
- with_traceback()
Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.
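Since client calls can raise AgentClientError (the exact triggers, such as network failures or error responses, are an assumption here), a small generic retry wrapper can be useful. This helper is not part of the toolkit:

```python
def call_with_retries(fn, *, attempts=3, exc=Exception):
    """Call fn(); retry up to `attempts` times if it raises `exc`."""
    last_error = None
    for _ in range(attempts):
        try:
            return fn()
        except exc as error:
            last_error = error
    raise last_error

# Usage with the client, passing AgentClientError as the exception type:
# result = call_with_retries(
#     lambda: client.invoke({"message": "Hi"}),
#     exc=AgentClientError,
# )
```
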