Kunzite

Kunzite is a lightweight, bloat-free GUI application for local AI chat using the lugia inference engine (built on Apple's MLX framework).

[Screenshot: Kunzite GUI]

Features

  • Local AI: Run AI chat completions entirely on your device
  • MLX Inference Engine: Utilize Apple's MLX framework for fast, efficient inference
  • On-Device Inference: Ensure privacy and offline functionality
  • Bloat-Free GUI: Minimalist interface for distraction-free work

Quick Start

  1. Clone the repository
  2. Install the dependencies: imgui and lugia
  3. Run make to build the project
  4. Execute ./kunzite to start the application (example commands below)
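
A typical sequence of commands looks like the following. This is a sketch, not an exact recipe: it assumes imgui and lugia are already installed where the Makefile can find them, and uses the clone URL taken from the vovw/kunzite repository page.

```sh
# Assumes imgui and lugia are installed and discoverable by the Makefile.
git clone https://github.com/vovw/kunzite.git
cd kunzite
make          # build the project
./kunzite     # start the application
```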

Architecture Diagram

  • Kunzite uses a simple client-server architecture
  • GUI: Implemented with Dear ImGui for a lightweight, cross-platform interface
  • LLM Client: Handles communication with the local inference server (a minimal client sketch follows this list)
  • MLX Server: Runs the language model via the lugia inference engine, built on Apple's MLX framework
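
To illustrate the LLM Client piece, here is a minimal C++ sketch of sending a chat request to a local inference server over HTTP using libcurl. The port, endpoint path, and JSON shape are assumptions made for the example, not kunzite's actual protocol; adjust them to whatever the local server exposes.

```cpp
// Minimal sketch of an LLM client talking to a local inference server.
// Assumption: the server listens on localhost:8080 and accepts an
// OpenAI-style JSON chat request at /v1/chat/completions.
// Build (assuming libcurl is installed): g++ client.cpp -lcurl -o client
#include <curl/curl.h>
#include <iostream>
#include <string>

// libcurl write callback: append response bytes into a std::string.
static size_t write_cb(char *data, size_t size, size_t nmemb, void *userp) {
    auto *out = static_cast<std::string *>(userp);
    out->append(data, size * nmemb);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    const std::string body =
        R"({"messages":[{"role":"user","content":"Hello from kunzite"}]})";
    std::string response;

    struct curl_slist *headers = nullptr;
    headers = curl_slist_append(headers, "Content-Type: application/json");

    // Endpoint and port are assumptions for this example.
    curl_easy_setopt(curl, CURLOPT_URL, "http://localhost:8080/v1/chat/completions");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);

    CURLcode rc = curl_easy_perform(curl);
    if (rc != CURLE_OK) {
        std::cerr << "request failed: " << curl_easy_strerror(rc) << "\n";
    } else {
        std::cout << response << "\n";  // raw JSON completion from the server
    }

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return rc == CURLE_OK ? 0 : 1;
}
```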

License

This project is licensed under the MIT License.
