This is a simple conversational-UI RAG (retrieval-augmented generation) chatbot based on the Swiss Code of Obligations.
It was created as a starting point for the Ginetta Challenge at the women++ Hack'n'Lead hackathon in November 2023.
Ana R Correia
Karin Sim
Sanaz Reinhardt
Sirinya Richardson
Yaiza Aragonés-Soria
We improved the initial chatbot by introducing the following functionalities:
- The user is asked three onboarding questions (language, familiarity with the law, and location) so that the chatbot's answers are tailored to their profile.
- The user can interact with the chatbot in English, German, or French.
- Redesigned user interface.
Demo video: `Emilie.2.mp4`
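The onboarding answers can be folded into the prompt sent to the model. The sketch below illustrates the idea; the function name, prompt wording, and sample article text are illustrative assumptions, not the project's actual code:

```typescript
// Illustrative sketch (not the project's actual implementation):
// combine retrieved articles with the user's onboarding profile
// to build a prompt tailored to their language and background.
type Profile = {
  language: "English" | "German" | "French";
  lawFamiliarity: "layperson" | "expert";
  location: string;
};

function buildPrompt(question: string, articles: string[], profile: Profile): string {
  // Number the retrieved articles so the model can cite them.
  const context = articles.map((a, i) => `[${i + 1}] ${a}`).join("\n");
  return [
    `Answer in ${profile.language}.`,
    profile.lawFamiliarity === "layperson"
      ? "Explain in plain language, avoiding legal jargon."
      : "You may use precise legal terminology.",
    `The user is located in ${profile.location}.`,
    "Base your answer only on these articles of the Swiss Code of Obligations:",
    context,
    `Question: ${question}`,
  ].join("\n");
}

const prompt = buildPrompt(
  "Can I cancel my apartment lease early?",
  ["Art. 266a: Open-ended leases may be terminated subject to notice periods..."],
  { language: "German", lawFamiliarity: "layperson", location: "Zurich" }
);
console.log(prompt);
```

The actual app would substitute the articles retrieved from Qdrant and the profile collected during onboarding.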
- Use this repository as a template (or fork it)
- Add your team members as contributors
- Put your presentation in the `docs/` folder
- This repository must be open source (and licensed) in order to submit
- Add the tag `hack-n-lead` to the repo description
There are two different ways to set up this project:
- Option 1: Install Ollama & Qdrant locally (the Ollama desktop app is currently only available for macOS and Linux). Ollama will take advantage of your GPU to run the model.
- Option 2: Use the Docker Compose file to run Ollama & Qdrant in containers. This is the easier setup, but Ollama will run on the CPU. In a terminal in the project directory, run:

```shell
# Pull & run the containers
docker compose up -d
# Download & install the Mistral model
docker compose exec ollama ollama run mistral
```
- 🦙 Download Ollama and install it locally, then download and install the model (requires a 4.1 GB download and 8 GB of RAM):

```shell
ollama run mistral
```

- Open http://localhost:11434 to check if Ollama is running
- Pull and run Qdrant:

```shell
docker pull qdrant/qdrant
docker run -p 6333:6333 qdrant/qdrant
```
Both Option 1 and 2 continue with the following setup:
- Open the Qdrant dashboard console http://localhost:6333/dashboard#/console
- Create a new collection by running this:

```
PUT collections/swiss-or
{
  "vectors": {
    "size": 384,
    "distance": "Cosine"
  }
}
```
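The collection is configured for 384-dimensional vectors compared by cosine distance (384 matches the gte-small embedding model referenced in the snapshot filename). Qdrant computes this metric internally; purely as an illustration of what it measures, cosine similarity can be sketched as:

```typescript
// Illustration only: Qdrant computes this internally.
// Cosine similarity compares the direction of two vectors,
// ignoring their magnitude; Qdrant's "Cosine" distance ranks
// stored points by this score relative to the query vector.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // identical direction → 1
console.log(cosineSimilarity([1, 0], [0, 1])); // orthogonal → 0
```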
- Download the snapshot file
- Unzip the file using the terminal (⚠️ not with Finder on Mac ⚠️) with `unzip <file_name>`
- Upload the snapshot using the following command. Adapt the fields accordingly and run it from the directory where your snapshot lies:

```shell
curl -X POST 'http://localhost:6333/collections/swiss-or/snapshots/upload' \
  -H 'Content-Type:multipart/form-data' \
  -F 'snapshot=@swiss-code-of-obligations-articles-gte-small-2023-10-18-12-13-25.snapshot'
```
- Copy the file `.env.local.example` in the project and rename it to `.env`. Verify that all environment variables are correct.
- Run `yarn install` to install the required dependencies.
- Run `yarn dev` to launch the development server.
- Go to http://localhost:3000 and try out the app.
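With everything running, the retrieval step of a RAG pipeline like this one conceptually amounts to ranking the stored article vectors by similarity to the question's embedding and handing the best matches to the model. A toy sketch, with made-up 3-dimensional vectors standing in for the real 384-dimensional embeddings:

```typescript
// Toy sketch of RAG retrieval: the ids and 3-dimensional vectors
// below are made up for illustration; the real collection holds
// 384-dimensional embeddings of Code of Obligations articles.
type Article = { id: string; vector: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Return the ids of the k articles most similar to the query vector.
function topK(query: number[], articles: Article[], k: number): string[] {
  return [...articles]
    .sort((x, y) => cosineSimilarity(query, y.vector) - cosineSimilarity(query, x.vector))
    .slice(0, k)
    .map((a) => a.id);
}

const articles: Article[] = [
  { id: "OR Art. 1", vector: [0.9, 0.1, 0.0] },
  { id: "OR Art. 184", vector: [0.1, 0.9, 0.1] },
  { id: "OR Art. 319", vector: [0.0, 0.2, 0.9] },
];
console.log(topK([0.1, 0.8, 0.2], articles, 2)); // most similar articles first
```

In the app this ranking is done by Qdrant's search API rather than in application code; the sketch only shows the idea behind it.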
To learn more about LangChain, Ollama, Qdrant, Next.js, and the Vercel AI SDK, take a look at the following resources:
- LangChain Documentation - learn about LangChain
- Ollama - learn about Ollama features, models, and API
- Qdrant Documentation - learn about Qdrant
- Vercel AI SDK docs - learn more about the Vercel AI SDK
- Next.js Documentation - learn about Next.js features and API