# AItoComplete

AItoComplete is an Emacs package that provides a convenient interface to Ollama, allowing you to access LLM capabilities directly from your Emacs workflow.
## Requirements

- Emacs 26.1 or higher
- Ollama installed and running locally
- The `curl` command-line tool
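A quick way to confirm all three requirements from inside Emacs (a sketch, not part of the package; Ollama listens on `http://localhost:11434` by default):

```elisp
;; Returns non-nil when Emacs is new enough, curl is on PATH,
;; and a local Ollama server answers on its default port.
(and (version<= "26.1" emacs-version)
     (executable-find "curl")
     (zerop (call-process "curl" nil nil nil
                          "-sf" "http://localhost:11434")))
```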
## Installation

### Manual

- Download `aitocomplete.el` to a directory in your Emacs load path.
- Add the following to your `init.el`:

```elisp
(require 'aitocomplete)
```
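If that directory is not already on your `load-path`, add it first (the path below is only an example):

```elisp
;; Adjust the path to wherever you saved aitocomplete.el.
(add-to-list 'load-path
             (expand-file-name "~/.emacs.d/site-lisp/aitocomplete/"))
(require 'aitocomplete)
```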
### With straight.el

```elisp
(use-package aitocomplete
  :straight (:host github :repo "hipml/aitocomplete")
  :bind (("C-c s" . aitocomplete-send-region)
         ("C-c a" . aitocomplete-menu)))
```
### With package.el

```elisp
(use-package aitocomplete
  :ensure t)
```
## Configuration

The package works out of the box, but you can customize it to your preferences:
```elisp
;; Set your preferred model
(setq aitocomplete-model "llama3.2")  ; default model

;; Customize the chat buffer name
(setq aitocomplete-chat-buffer "*My AI Assistant*")

;; Set the number of columns in the menu
(setq aitocomplete-menu-columns 4)

;; Custom keybinding for the menu
(global-set-key (kbd "C-c a") #'aitocomplete-menu)
```
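If the command reads `aitocomplete-model` at call time (the usual behavior for a customizable variable), you can also override the model for a single call; `codellama` below is just an example model name:

```elisp
;; One-off override of the model for a single send (sketch).
(let ((aitocomplete-model "codellama"))
  (call-interactively #'aitocomplete-send-region))
```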
## Usage

### Quick Start

- Start the Ollama server on your local machine
- Select a region of text in any buffer
- Press `C-c s` to send the region to the AI
- View the AI's response in the chat buffer
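The same flow can be driven from Lisp; a sketch, assuming `aitocomplete-send-region` acts on the active region like most region commands:

```elisp
;; Send an entire buffer's contents non-interactively (sketch).
(with-current-buffer "notes.txt"   ; example buffer name
  (push-mark (point-min) t t)      ; mark the start, activate the region
  (goto-char (point-max))          ; extend the region to end of buffer
  (call-interactively #'aitocomplete-send-region))
```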
### Commands

- `M-x aitocomplete-menu`: Display the main menu with all options
- `M-x aitocomplete-send-region`: Send the selected region to the AI
- `M-x aitocomplete-chat`: Open or switch to the AI chat buffer
- `M-x aitocomplete-test-response`: Test the AI connection with a simple prompt
### In the Chat Buffer

- Type your message
- Press `C-c C-c` to send it to the AI
- Press `?` to display the menu
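If you prefer a different send key, you can rebind it in the chat buffer's keymap. A sketch, assuming the package follows Emacs convention and names its map `aitocomplete-chat-mode-map` (that name is a guess, hence the `boundp` guard):

```elisp
;; Rebind RET to send in the chat buffer; guarded because the
;; keymap name is assumed, not confirmed by the package docs.
(with-eval-after-load 'aitocomplete
  (when (boundp 'aitocomplete-chat-mode-map)
    (define-key aitocomplete-chat-mode-map
                (kbd "RET") #'aitocomplete-send-message)))
```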
### Keybindings

| Keybinding | Command | Description |
|---|---|---|
| `C-c s` | `aitocomplete-send-region` | Send selected text to AI |
| `C-c C-c` | `aitocomplete-send-message` | (In chat buffer) Send message |
| `?` | `aitocomplete-menu` | (In chat buffer) Display menu |
### Menu Options

Press `M-x aitocomplete-menu` or `?` in the chat buffer to access these options:

- `[o]` Open chat buffer
- `[s]` Send region
- `[m]` Change model
- `[t]` Test response
- `[q]` Quit menu
## Models

By default, the package supports whatever models you have available in your local Ollama installation. You can check available models by opening the menu.
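Outside the menu, you can query Ollama's documented `GET /api/tags` endpoint yourself, using the `curl` dependency listed above; `my/ollama-list-models` is a hypothetical helper, not part of the package:

```elisp
(require 'json)

(defun my/ollama-list-models ()
  "Return the names of models known to the local Ollama server."
  (let* ((out (shell-command-to-string
               "curl -s http://localhost:11434/api/tags"))
         (data (json-read-from-string out)))
    ;; The response looks like {"models": [{"name": "llama3.2", ...}, ...]}.
    (mapcar (lambda (m) (cdr (assq 'name m)))
            (cdr (assq 'models data)))))
```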
## Troubleshooting

If you see "NOT RUNNING" in the menu, make sure:

- Ollama is installed on your system
- The Ollama server is running
- It's accessible at http://localhost:11434
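To confirm the last point from inside Emacs, you can probe the server root directly; a plain GET to `http://localhost:11434` returns "Ollama is running" when the server is up (a sketch using the built-in `url` library):

```elisp
(require 'url)

;; Echo the raw HTTP response from the server root, or the error if
;; the connection fails (e.g. the server is not running).
(condition-case err
    (with-current-buffer
        (url-retrieve-synchronously "http://localhost:11434" t t 5)
      (message "Ollama says: %s" (buffer-string))
      (kill-buffer))
  (error (message "Ollama unreachable: %s" err)))
```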
Enable the Emacs debugger to get a backtrace when something fails:

```elisp
(setq debug-on-error t)
```
## Contributing

Contributions are welcome! Feel free to open issues or submit pull requests on GitHub.
## License

This project is licensed under the MIT License; see the LICENSE file for details.
## Acknowledgments

AItoComplete was inspired by similar projects that integrate AI capabilities into Emacs.