# Usage Guide

## Setup and Usage
1. Clone the repository:

   ```shell
   git clone https://github.com/kryvokhyzha/langgraph-agent-toolkit
   cd langgraph-agent-toolkit
   ```

2. Set up your environment (see the Environment Setup section)
3. Run the service (with Python or Docker)
## Building Your Own Agent

To customize the agent:

1. Add your agent to `langgraph_agent_toolkit/agents/blueprints/`
2. Register it in the `AGENT_PATHS` list in `langgraph_agent_toolkit/core/settings.py`
3. Optionally customize the Streamlit interface in `streamlit_app.py`
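As a rough illustration, entries in a list like `AGENT_PATHS` are usually dotted import paths pointing at an agent object. The `"module:attribute"` convention and the blueprint name below are assumptions for illustration; check `core/settings.py` for the actual format used by the toolkit:

```python
# Hypothetical illustration of a dotted agent path and how it could be split
# into a module and an attribute. The real AGENT_PATHS format may differ.
AGENT_PATHS = [
    "langgraph_agent_toolkit.agents.blueprints.my_agent:agent",  # hypothetical entry
]


def split_agent_path(path: str) -> tuple[str, str]:
    """Split a 'module:attribute' path into its two parts."""
    module, _, attribute = path.partition(":")
    return module, attribute


module, attribute = split_agent_path(AGENT_PATHS[0])
print(module)     # langgraph_agent_toolkit.agents.blueprints.my_agent
print(attribute)  # agent
```

A registry of string paths (rather than direct imports) lets the service import agents lazily and lets you add agents without touching core code.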
## Docker Setup

The `docker-compose.yaml` defines these services with enhanced security:

- `backend-agent-service`: FastAPI service
- `frontend-streamlit-app`: Streamlit chat interface
- `postgres`: Database storage
- `redis`: Cache and message broker
- `minio`: Object storage
- `clickhouse`: Analytics database
- `langfuse-web` & `langfuse-worker`: Observability
- `litellm`: LLM proxy server
Using `docker compose watch` enables live reloading:

1. Ensure Docker and Docker Compose (>= 2.23.0) are installed
2. Launch services:

   ```shell
   docker compose watch
   ```

3. Access endpoints:

   - Streamlit app: http://0.0.0.0:8501
   - Agent API: http://0.0.0.0:8080
   - API docs: http://0.0.0.0:8080/docs
   - Langfuse dashboard: http://0.0.0.0:3000
   - LiteLLM API: http://0.0.0.0:4000 (accessible from any host)

4. Stop services:

   ```shell
   docker compose down
   ```
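`docker compose watch` reacts to `develop.watch` rules declared per service in the compose file. A hypothetical excerpt showing the two common rule types (the repository's actual `docker-compose.yaml` may differ):

```yaml
# Hypothetical excerpt, not the repository's actual compose file.
services:
  backend-agent-service:
    build: .
    develop:
      watch:
        - action: sync          # copy changed source into the running container
          path: ./langgraph_agent_toolkit
          target: /app/langgraph_agent_toolkit
        - action: rebuild       # rebuild the image when dependencies change
          path: pyproject.toml
```

`sync` gives fast feedback for code edits, while `rebuild` handles changes (like dependency updates) that a file copy cannot pick up.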
> **Note:** If you modify `pyproject.toml` or `uv.lock`, rebuild with:
>
> ```shell
> docker compose up --build
> ```
## Using the AgentClient

The toolkit includes `AgentClient` for interacting with the agent service:

```python
from client import AgentClient

client = AgentClient()

response = client.invoke({"message": "Tell me a brief joke?"})
response.pretty_print()
# ================================== Ai Message ==================================
#
# A man walked into a library and asked the librarian, "Do you have any books on Pavlov's dogs and Schrödinger's cat?"
# The librarian replied, "It rings a bell, but I'm not sure if it's here or not."
```

See `langgraph_agent_toolkit/run_client.py` for more examples.
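Under the hood, a client like this is a thin wrapper over HTTP calls to the agent service. A minimal stdlib sketch of the idea, where the `/invoke` path and the payload shape are assumptions for illustration, not the toolkit's documented API:

```python
# Minimal sketch of an HTTP agent client. The endpoint path and payload
# shape below are hypothetical; see the real AgentClient for the actual API.
import json
from urllib import request


class MiniAgentClient:
    def __init__(self, base_url: str = "http://0.0.0.0:8080"):
        self.base_url = base_url

    def build_payload(self, message: str) -> bytes:
        # Serialize the user message as the JSON request body.
        return json.dumps({"message": message}).encode()

    def invoke(self, message: str) -> dict:
        req = request.Request(
            f"{self.base_url}/invoke",  # hypothetical endpoint
            data=self.build_payload(message),
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            return json.load(resp)


client = MiniAgentClient()
print(client.build_payload("Tell me a brief joke?").decode())
```

The real `AgentClient` additionally handles things like streaming responses and pretty-printing, which a sketch like this omits.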
## Development with LangGraph Studio

The project works with LangGraph Studio:

1. Install LangGraph Studio
2. Add your `.env` file to the root directory
3. Launch LangGraph Studio pointing at the project root
4. Customize `langgraph.json` as needed
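For orientation, a `langgraph.json` generally follows the LangGraph CLI schema (`dependencies`, `graphs`, `env`). The graph name and path below are hypothetical examples, not this repository's actual configuration:

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./langgraph_agent_toolkit/agents/blueprints/my_agent.py:agent"
  },
  "env": ".env"
}
```

Each entry under `graphs` maps a graph name shown in Studio to a `file:attribute` path that resolves to a compiled graph.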
## Local Development Without Docker

1. Set up a Python environment:

   ```shell
   pip install uv
   uv sync --frozen
   source .venv/bin/activate
   ```

2. Create and configure your `.env` file
3. Run the FastAPI server:

   ```shell
   python langgraph_agent_toolkit/run_service.py
   ```

4. Run the Streamlit app in another terminal:

   ```shell
   streamlit run langgraph_agent_toolkit/streamlit_app.py
   ```

5. Access the Streamlit interface (usually at http://localhost:8501)
## Key Features

### LangGraph Integration

- Latest LangGraph v0.3 features
- Human-in-the-loop with `interrupt()`
- Flow control with `Command` and `langgraph-supervisor`
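The human-in-the-loop pattern behind `interrupt()` can be sketched without the LangGraph API as a resumable generator: the flow pauses, surfaces state for review, and resumes once a human decision arrives. This is a conceptual illustration only, not the toolkit's code:

```python
# Conceptual sketch of human-in-the-loop: a generator that pauses for review.
# LangGraph's interrupt() provides this pattern with persistence and routing.
def approval_flow(draft: str):
    # Pause execution and surface the draft for a human decision.
    decision = yield {"interrupt": "review", "draft": draft}
    if decision == "approve":
        return f"SENT: {draft}"
    return "DISCARDED"


flow = approval_flow("hello")
paused = next(flow)          # flow pauses, exposing the draft for review
try:
    flow.send("approve")     # human decision resumes the flow
except StopIteration as done:
    print(done.value)        # SENT: hello
```

LangGraph adds what this toy version lacks: checkpointing, so the paused state survives process restarts, and `Command` for routing the resumed flow to a specific node.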
### API Service

- FastAPI with streaming and non-streaming endpoints
- Support for both token-based and message-based streaming
- Multiple agent support with URL path routing
- Available agents and models listed at the `/info` endpoint
- Supports different runners: uvicorn, gunicorn, mangum (AWS Lambda), Azure Functions
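The difference between the two streaming modes can be sketched conceptually: token-based streaming emits output incrementally as it is generated, while message-based streaming emits each complete message as a unit. This is an illustration of the distinction, not the service's actual implementation:

```python
# Conceptual sketch: token-based vs message-based streaming.
def stream_tokens(text: str):
    # Emit output piece by piece, as an LLM would generate it.
    for token in text.split():
        yield token


def stream_messages(text: str):
    # Emit one complete message object at a time.
    yield {"type": "message", "content": text}


print(list(stream_tokens("hello streaming world")))
print(list(stream_messages("hello streaming world")))
```

Token streaming gives lower perceived latency in chat UIs; message streaming is simpler for clients that only need finished turns.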
### Developer Experience

- Asynchronous design with async/await
- Docker configuration with live reloading
- Comprehensive testing suite
### Enterprise Components

- Configurable PostgreSQL/SQLite connection pools
- Observability via Langfuse and LangSmith
- User feedback system
- Prompt management system
- LiteLLM proxy integration