Welcome to Ollama Chat, an interface for the official Ollama CLI that makes it easier to chat. It includes features such as:
- Multiple conversations 💬
- Detect which models are available to use 📋
- Auto check if ollama is running ⏰
- Change the host where Ollama is running 🖥️
- Persistence 📀
- Import & Export Chats 🚛
- Light & Dark Theme 🌗
- Clone the repo and build the app:

  ```sh
  git clone git@github.com:ollama-interface/Ollama-Gui.git
  cd Ollama-Gui
  pnpm i
  pnpm build:app:silicon
  ```

  Use `:silicon`, `:intell`, or `:universal` depending on your machine.

- Go to `/src-tauri/target/release/bundle/dmg/*.dmg` and install the program with the `.dmg` file.
You also need to install Ollama. After installing it, you can run your local server with this command:

```sh
OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve
```
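Features like model detection and the "is Ollama running" check work by talking to the server's REST API. As a minimal sketch (assuming the server is on `127.0.0.1:11435` as configured above; the `/api/tags` endpoint is part of Ollama's public API), here is how a client can probe the server and list available models:

```python
import json
import urllib.error
import urllib.request


def list_models(host="http://127.0.0.1:11435"):
    """Return the names of models on a running Ollama server,
    or None if the server is unreachable."""
    try:
        # GET /api/tags lists the locally available models.
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: Ollama is not running here.
        return None


if __name__ == "__main__":
    models = list_models()
    if models is None:
        print("Ollama is not running on 127.0.0.1:11435")
    else:
        print("Available models:", models)
```

A `None` result distinguishes "server down" from "server up but no models pulled yet" (an empty list), which is the same distinction the GUI's auto-check needs to make.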
For any questions, please contact Twan Luttik (Twitter / X).