Important disclaimer:
⚠️ This project is not production-ready and is intended for local environments at this early stage. We built it quickly to validate the idea, so please excuse any shortcomings in the code. You may come across areas that need improvement, and we truly appreciate your support in opening issues, submitting pull requests, and sharing suggestions.
OpenChat is an everyday-user chatbot console that simplifies the use of large language models. With the rapid advancement of AI, installing and using these models has become overwhelming. OpenChat addresses this by providing a two-step setup process for a comprehensive chatbot console, serving as a central hub for managing multiple customized chatbots.
Currently, OpenChat supports GPT models, and we are actively working on incorporating various open-source drivers that can be activated with a single click.
You can try it out at openchat.so
- Create unlimited local chatbots based on GPT-3 (and GPT-4 if available).
- Customize your chatbots by providing PDF files, websites, and soon, integrations with platforms like Notion, Confluence, and Office 365.
- Each chatbot has unlimited memory capacity, enabling seamless interaction with large files such as a 400-page PDF.
- Embed chatbots as widgets on your website or internal company tools.
- Use your entire codebase as a data source for your chatbots (pair programming mode).
- And much more!
- Create unlimited chatbots
- Share chatbots via URL
- Integrate chatbots on any website using JS (as a widget on the bottom right corner)
- Support GPT-3 models
- Support vector database to provide chatbots with larger memory
- Accept websites as a data source
- Accept PDF files as a data source
- Support multiple data sources per chatbot
- Support ingesting an entire codebase via the GitHub API and using it as a data source (pair programming mode)
- Support pre-defined messages with a single click
- Support Slack integration (allow users to connect chatbots with their Slack workspaces)
- Support Intercom integration (enable users to sync chat conversations with Intercom)
- Support offline open-source models (e.g., Alpaca, LLM drivers)
- Support Vertex AI and Palm as LLMs
- Support Confluence, Notion, Office 365, and Google Workspace
- Refactor the codebase to be API ready
- Create a new UI designer for website-embedded chatbots
- Support custom input fields for chatbots
- Support offline usage: a major feature; at this stage OpenChat will operate fully offline, with no internet connection required (offline LLMs, offline vector DBs)
We love hearing from you! Got any cool ideas or requests? We're all ears! So, if you have something in mind, give us a shout!
- Make sure you have Docker installed.
- To begin, clone this Git repository:
git clone [email protected]:openchatai/OpenChat.git
- Update common.env with your keys:
OPENAI_API_KEY=# you can get it from your account at openai.com
PINECONE_API_KEY=# you can get it from the "API Keys" tab in Pinecone
PINECONE_ENVIRONMENT=# shown after you create your index in Pinecone
PINECONE_INDEX_NAME=# the name you chose when creating your index in Pinecone
Note: for the Pinecone index, make sure the dimension is set to 1536 (the size of OpenAI's embedding vectors).
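Put together, a filled-in common.env might look like the sketch below. All values here are placeholders for illustration only; substitute your own credentials, and keep any other variables already present in the file:

```env
# common.env — illustrative values only; replace with your own credentials
OPENAI_API_KEY=sk-your-openai-key
PINECONE_API_KEY=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
PINECONE_ENVIRONMENT=us-east1-gcp
PINECONE_INDEX_NAME=openchat-index
```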
- Navigate to the repository folder and run the following command (macOS or Linux):
make install
or, if you are using Windows:
make.bat
Once the installation is complete, you can access the OpenChat console at: http://localhost:8000
Documentation available here
We do our best not to introduce breaking changes; so far, whenever there is a new update, you only need to run git pull followed by make install.
- Thanks to @mayooear for his work and tutorial on chatting with PDF files; we used a lot of his code in the LLM server.
This project is licensed under the MIT License.
Thanks goes to these wonderful people (emoji key):
Ikko Eltociear Ashimine 🤔 💻 |
Joshua Sindy 🐛 |
Erjan Kalybek 📖 |
WoahAI 🐛 💻 |
This project follows the all-contributors specification. Contributions of any kind welcome!