
update llama example to use ChatML format for Mistral model #400

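The change referenced by the PR title is not shown on this page, only the CI workflow that ran against it. Purely as illustration of what the title refers to, a ChatML-style prompt for a chat model is typically assembled as in the following sketch; the helper name and message contents are hypothetical and not taken from the PR.

    # Hypothetical sketch of ChatML-style prompt construction.
    # The actual example updated by PR #400 is not part of this page.

    def to_chatml(messages):
        """Render a list of {"role", "content"} dicts in ChatML format."""
        prompt = ""
        for msg in messages:
            prompt += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
        # Leave the assistant turn open so the model generates the reply.
        prompt += "<|im_start|>assistant\n"
        return prompt

    print(to_chatml([
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ]))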

Workflow file for this run

name: PyCape CI
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
jobs:
  pycape-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v4
        with:
          python-version: "3.10"
          cache: 'pip'
      - name: Install dev dependencies.
        run: make pydep
      - name: Install PyCape package.
        run: make pylib
      - name: Lint PyCape.
        run: make lint
      - name: Test PyCape.
        run: pytest pycape
  release-notes:
    permissions:
      contents: write
      pull-requests: read
    runs-on: ubuntu-latest
    steps:
      - uses: release-drafter/release-drafter@v5
        with:
          disable-autolabeler: true
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
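The pycape-check job runs four commands in sequence: make pydep, make pylib, make lint, and pytest pycape. A minimal Python sketch for reproducing those steps locally, assuming a checkout of the repository that provides the same Makefile targets, could look like this:

    # Run the same commands as the pycape-check CI job, stopping on the
    # first failure. Assumes the repo's Makefile targets exist locally.
    import subprocess
    import sys

    STEPS = [
        ["make", "pydep"],     # Install dev dependencies.
        ["make", "pylib"],     # Install PyCape package.
        ["make", "lint"],      # Lint PyCape.
        ["pytest", "pycape"],  # Test PyCape.
    ]

    def main() -> int:
        for cmd in STEPS:
            print("Running:", " ".join(cmd))
            result = subprocess.run(cmd)
            if result.returncode != 0:
                return result.returncode
        return 0

    if __name__ == "__main__":
        sys.exit(main())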