SuperAgent makes it easy to configure and deploy LLM Agents to production.
NOTE: The roadmap is ordered by priority.
- Bring your own DB
- Authentication
- ChatGPT clone
- Built-in memory
- REST API
- Support for multiple LLMs
- Streaming support
- Built-in vectorstore
- Built-in document retrieval
- Q&A Agents
- Tools
- ReAct Agents with Tools
- Plan-solve Agents with Tools
- Prompt management
- Bring your own LLM
- Usage quotas and tracking
- Python SDK
- Typescript SDK
- SuperAgent CLI
- One-click deploy (GCP, Amazon, Digitalocean)
- Clone the repo into a public GitHub repository (or fork https://github.com/homanp/superagent/fork). If you plan to distribute the code, keep the source code public.

  ```sh
  git clone https://github.com/homanp/superagent.git
  ```
- Create and activate a virtual environment.

  ```sh
  virtualenv venv
  source venv/bin/activate
  ```
- Install dependencies with Poetry.

  ```sh
  poetry install
  ```
- Set up your .env file
  - Duplicate .env.example to .env
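Once your .env file is in place, the app reads those values from the environment at runtime. As a minimal sketch (pure standard library; the variable names `OPENAI_API_KEY` and `DATABASE_URL` are illustrative assumptions, not the project's authoritative list — check .env.example for the actual keys):

```python
# Minimal sketch of reading configuration from the environment, the way an
# app configured via a .env file typically does once the file is loaded.
# The specific variable names used below are illustrative assumptions only.
import os
from typing import Optional

def get_setting(name: str, default: Optional[str] = None) -> str:
    """Return an environment variable, a supplied default, or fail loudly."""
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Hypothetical usage:
# openai_key = get_setting("OPENAI_API_KEY")
# db_url = get_setting("DATABASE_URL", "sqlite:///./local.db")
```

Failing fast on a missing required variable makes a misconfigured .env obvious at startup rather than deep inside a request handler.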
- Run the project

  ```sh
  uvicorn app.main:app --reload
  ```
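The `app.main:app` target tells uvicorn to import the module `app.main` and serve the ASGI application bound to the name `app`. For illustration only, here is the smallest ASGI app with the same calling convention (the real project exposes a FastAPI app at that path; this stand-in uses no third-party packages):

```python
# Illustrative stand-in for app/main.py: the smallest ASGI application
# uvicorn could serve as "app.main:app". It answers every HTTP request
# with a plain-text 200 OK. The actual project exposes a FastAPI app here.
async def app(scope, receive, send):
    assert scope["type"] == "http"  # only handle plain HTTP requests
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"OK"})
```

The `--reload` flag restarts the server when source files change, which is convenient during development but should be dropped in production.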
Our mission is to make it easy for anyone to create and run LLM Agents in production. We are happy to receive any contributions you would like to make: create new features, fix bugs, or improve the infrastructure.
You can read more on how to contribute here.