Try the default Toolkit application yourself by deploying it in a container locally. You will need Docker and Docker Compose >= 2.22 installed.
docker run -e COHERE_API_KEY='>>YOUR_API_KEY<<' -p 8000:8000 -p 4000:4000 ghcr.io/cohere-ai/cohere-toolkit:latest
If you need to use community features, you can run the container with the following command:
docker run -e INSTALL_COMMUNITY_DEPS='true' -e COHERE_API_KEY='>>YOUR_API_KEY<<' -p 8000:8000 -p 4000:4000 ghcr.io/cohere-ai/cohere-toolkit:latest
Go to localhost:4000 in your browser and start chatting with the model. This will use the model hosted on Cohere's platform. If you want to add your own tools or use another model, follow the instructions below to fork the repository.
Clone the repo and run
make first-run
Follow the instructions to configure the model - either AWS SageMaker, Bedrock, Azure, or Cohere's platform. This can also be done by running `make setup` (see Option 2 below), which will help generate the file for you, or by manually creating a `configuration.yaml` file and copying in the contents of the provided `configuration.template.yaml`, then replacing the values with the correct ones.
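The manual option above can also be scripted. A minimal sketch (an illustrative helper, not part of the Toolkit; it assumes the template files sit in the repository root, as the file names suggest):

```python
# Sketch: copy each *.template.yaml to its live counterpart, skipping any
# file that already exists. The repository-root location is an assumption.
from pathlib import Path
import shutil

def init_config(repo_root=".") -> list:
    """Create configuration.yaml / secrets.yaml from their templates."""
    created = []
    root = Path(repo_root)
    for name in ("configuration", "secrets"):
        template = root / f"{name}.template.yaml"
        target = root / f"{name}.yaml"
        if template.exists() and not target.exists():
            shutil.copyfile(template, target)
            created.append(target.name)
    return created
```

After copying, open each generated file and replace the placeholder values with your own.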
For Windows systems see the detailed setup below.
Windows
- Install docker
- Install [git](https://git-scm.com/download/win)
- In PowerShell (Terminal), install scoop. After installing, run the following commands:
scoop bucket add extras
- Install miniconda using
scoop install miniconda3
conda init cmd.exe
- Restart PowerShell
- Install the following:
scoop install postgresql
scoop install make
- Create a new virtual environment with Python 3.11 using CMD terminal
conda create -n toolkit python=3.11
conda activate toolkit
- Install poetry == 1.7.1 using
pip install poetry==1.7.1
- Clone the repo
- Alternatively to `make win-first-run` or `make win-setup`, run:
poetry install --with setup,community --verbose
poetry run python src/backend/cli/main.py
make migrate
make dev
- Navigate to http://localhost:4000 in your browser
- If you encounter an error related to `llama-cpp-python` when running `poetry install`, please run the following commands:
poetry source add llama-cpp-python https://abetlen.github.io/llama-cpp-python/whl/cpu
poetry source add pypi
poetry lock
and then rerun the `poetry install` command above. For more information and additional installation instructions, see the llama-cpp-python documentation.
MacOS
- Install Xcode. This can be done from the App Store or terminal
xcode-select --install
- Install docker desktop
- Install homebrew
- Install pipx. This is useful for installing poetry later.
brew install pipx
pipx ensurepath
- Install postgres using `brew install postgresql`
- Install conda using miniconda
- Use your environment manager to create a new virtual environment with Python 3.11
conda create -n toolkit python=3.11
- Install poetry >= 1.7.1
pipx install poetry
To test whether poetry has been installed correctly, run:
conda activate toolkit
poetry --version
You should see the version of poetry (e.g. 1.8.2). If poetry is not found, try
export PATH="$HOME/.local/bin:$PATH"
and then retry `poetry --version`.
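The PATH troubleshooting above can also be checked programmatically. A small sketch (illustrative only, using the standard library) that reports whether an executable such as `poetry` is reachable, optionally after prepending a directory like `$HOME/.local/bin`:

```python
# Sketch: check whether an executable is reachable on PATH, mirroring the
# `export PATH="$HOME/.local/bin:$PATH"` fix above.
import os
import shutil

def on_path(executable, extra_dir=None) -> bool:
    """Return True if `executable` resolves on PATH (plus optional extra_dir)."""
    path = os.environ.get("PATH", "")
    if extra_dir:
        path = extra_dir + os.pathsep + path
    return shutil.which(executable, path=path) is not None
```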
- Clone the repo and run `make first-run`
- Navigate to http://localhost:4000 in your browser
Environment variables
- `COHERE_API_KEY`: If your application will interface with Cohere's API, you will need to supply an API key. Not required if using AWS SageMaker or Azure. Sign up at https://dashboard.cohere.com/ to create an API key.
- `NEXT_PUBLIC_API_HOSTNAME`: The backend URL which the frontend will communicate with. Defaults to http://backend:8000 for use with docker compose.
- `FRONTEND_HOSTNAME`: The URL for the frontend client. Defaults to http://localhost:4000.
- `DATABASE_URL`: Your PostgreSQL database connection string for SQLAlchemy; should follow the format postgresql+psycopg2://USER:PASSWORD@HOST:PORT.
- `REDIS_URL`: Your Redis connection string; should follow the format redis://USER:PASSWORD@HOST:PORT.
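The connection-string formats above can be sanity-checked with a short script (a hypothetical helper, not part of the Toolkit) that only verifies the scheme, host, and port are parseable:

```python
# Sketch: validate that a connection string has the expected scheme and a
# parseable host and port, per the formats described above.
from urllib.parse import urlsplit

def check_conn_string(url, expected_scheme) -> bool:
    parts = urlsplit(url)
    return (parts.scheme == expected_scheme
            and parts.hostname is not None
            and parts.port is not None)
```

For example, `postgresql+psycopg2://user:pass@localhost:5432` passes the check, while a bare `localhost:5432` does not.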
To use the toolkit with AWS SageMaker, you will first need the Cohere model (a Command version) that powers chat deployed in SageMaker. Follow Cohere's guide and notebooks to deploy a Command model and create an endpoint that can then be used with the toolkit.
Then you will need to set up authorization; see more details here. The default toolkit setup uses the AWS configuration file (created after `aws configure sso`) with the following environment variables:
- `SAGE_MAKER_REGION_NAME`: The region you configured for the model.
- `SAGE_MAKER_ENDPOINT_NAME`: The name of the endpoint which you created in the notebook.
- `SAGE_MAKER_PROFILE_NAME`: Your AWS profile name.
- `BEDROCK_ACCESS_KEY`: Your Bedrock access key.
- `BEDROCK_SECRET_KEY`: Your Bedrock secret key.
- `BEDROCK_SESSION_TOKEN`: Your Bedrock session token.
- `BEDROCK_REGION_NAME`: The region you configured for the model.
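Before launching, a quick sanity check can confirm the Bedrock variables above are actually set in your environment. A minimal sketch (an illustrative helper, not part of the Toolkit):

```python
# Sketch: report which of the required Bedrock variables are missing or empty.
import os

REQUIRED = ["BEDROCK_ACCESS_KEY", "BEDROCK_SECRET_KEY",
            "BEDROCK_SESSION_TOKEN", "BEDROCK_REGION_NAME"]

def missing_vars(env=None) -> list:
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]
```

An empty return value means all four variables are present.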
- `PYTHON_INTERPRETER_URL`: URL to the Python interpreter container. Defaults to http://localhost:8080.
- `TAVILY_API_KEY`: If you want to enable internet search, you will need to supply a Tavily API key. Not required.
Once your environment variables are set, you're ready to deploy the Toolkit locally! Pull the Docker images from GitHub's Container Registry or build them from source. See the `Makefile` for all available commands.
Requirements:
Ensure your shell is authenticated with GHCR.
Pull the single-container image from GitHub's Container Registry:
docker pull ghcr.io/cohere-ai/cohere-toolkit:latest
Run the images locally:
docker run --name=cohere-toolkit -itd -e COHERE_API_KEY='Your Cohere API key here' -p 8000:8000 -p 4000:4000 ghcr.io/cohere-ai/cohere-toolkit
Run `make first-run` to start the CLI, which will generate the `configuration.yaml` and `secrets.yaml` files for you. This will also run all the DB migrations and start the containers:
make first-run
Run `make setup` to start the CLI, which will generate the `configuration.yaml` and `secrets.yaml` files for you:
make setup
Then run:
make migrate
make dev
If you did not change the default port, visit http://localhost:4000/ in your browser to chat with the model.
Use this setup when configuring and adding new retrieval chains.
Install your development dependencies:
poetry install --with dev
If you also need to install the community features, run:
poetry install --with community
The codebase is formatted and linted using Ruff.
To check for linter and formatter errors, run
make lint
To apply automatic fixes, run
make lint-fix
Run type checker:
- See docs for pyright
- Install with
conda install pyright
- Run with
pyright
- Configure in pyproject.toml under
[tool.pyright]
- Install the Ruff VSCode Extension
- Copy the contents of `.vscode/settings.default.json` into `.vscode/settings.json`
Please confirm that you have at least one provider configured: Cohere Platform, SageMaker, Bedrock, or Azure.
You have two methods to set up the environment variables:
- Run `make setup` and follow the instructions to configure it.
- Copy the contents of the `configuration.template.yaml` and `secrets.template.yaml` files to new `configuration.yaml` and `secrets.yaml` files.
The docker-compose file should spin up a local `db` container with a PostgreSQL server. The first time you set up this project, and whenever new migrations are added, you will need to run:
make migrate
This will apply all existing database migrations and ensure your DB schema is up to date.
If you ever run into issues with Alembic, such as being out of sync, and your DB does not contain any data you'd like to preserve, you can run:
make reset-db
make migrate
make dev
This will delete the existing `db` container volumes, restart the containers, and reapply all migrations.
Install the Python dependencies:
make install
Spin up the Test DB required by the tests:
make test-db
Run the tests:
make run-tests
When making changes to any of the database models, such as adding new tables, modifying or removing columns, you will need to create a new Alembic migration. You can use the following Make command:
make migration message="Your migration message"
Important: If adding a new table, make sure to add the import to the model/__init__.py
file! This will allow Alembic to import the models and generate migrations accordingly.
This should generate a migration on the Docker container and be copied to your local /alembic
folder. Make sure the new migration gets created.
Then you can migrate the changes to the PostgreSQL Docker instance using:
make migrate