LLMs and AI in general have been advancing massively since late 2022. Most notably, combining your own data with LLMs unlocks incredible use cases.
Yet this now comes at a price.
- The price of privacy. When you search your documents or ask a question, your data is sent to external servers and models, which can be prohibitive for many use cases.
- The price of control. The answers you receive come from a model and embeddings you probably haven't chosen.
native's goal is to give both privacy and control back to users.
We offer a UI to run LLM apps on your desktop, privately and on your terms. We run on Linux, macOS, and Windows.
Download it directly from GitHub or from native.site and get started 🔥
- Private: Your embeddings and your models stay on your computer by default.
- Powerful: Run the latest models and LLM libraries.
- Easy: A simple UI lets everyone use the latest models.
- Free & Open: Download and run it. Nothing else.
- Customizable: Built to be hackable.
native uses Tauri alongside Next.js and FastAPI to run cross-platform.
You can run the app in development mode from the root of the repository:

```bash
tauri dev
```
To build the desktop app:
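Assuming the standard Tauri CLI workflow, the bundler is invoked with:

```bash
# Produces platform-specific installers/bundles for the current OS
tauri build
```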
More information is available here.
Running the UI and the server separately is also possible.
Run the frontend from the /frontend folder:
```bash
npm run dev
```
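By default, the Next.js dev server is then available at http://localhost:3000.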
Run the backend from the /backend folder:
```bash
python3 main.py
```
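FastAPI serves interactive API documentation at /docs out of the box. As a quick sanity check, assuming the server binds to uvicorn's default port 8000 (the actual port is set in main.py), you can verify the backend is up:

```bash
# Assumes uvicorn's default port 8000; adjust to match main.py
curl http://localhost:8000/docs
```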
Currently under construction. The project is moving fast; it will be up soon.
- Add an image generation (ControlNet) task
- Add BabyAGI and AutoGPT support
- Bug fixes and performance optimizations
We warmly welcome contributions to our open-source project.
You can contribute in various ways such as adding a new feature, enhancing functionalities, or improving documentation.
Please refer to this link for comprehensive contribution guidelines.
Got questions or want to help? Join the conversation on our Discord.