
Commit

Merge branch 'textgenui' into dev
huynle committed Jan 9, 2024
2 parents 4ac7404 + 868809f commit 67abad6
Showing 22 changed files with 776 additions and 348 deletions.
49 changes: 35 additions & 14 deletions README.md
@@ -20,21 +20,36 @@ For a comprehensive understanding of the extension's functionality, you can watc
+ [ ] clean up documentation
+ [x] original functionality of OGPT.nvim with Ollama
+ [x] Custom settings per session
+ [ ] Add/remove settings as Ollama [request options](https://github.com/jmorganca/ollama/blob/main/docs/api.md#request-with-options)
+ [ ] Change Settings -> [Parameters](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#parameter)
+ [x] Add/remove settings as Ollama [request options](https://github.com/jmorganca/ollama/blob/main/docs/api.md#request-with-options)
+ [x] Change Settings -> [Parameters](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#parameter)
+ [ ] Another Windows for [Template](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#template), [System](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#system)
+ [ ] Query and Select model from Ollama
+ [x] Query and Select model from Ollama
+ [ ] Support model creation on the fly

Change the model by opening the Parameters panel (default `<C-o>`), Tab your way to the model
field, and press Enter (`<cr>`) on it to change the model. It will list all the available models on
your Ollama server.
![Change Model](assets/images/change_model.png)
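The keybindings above come from the plugin defaults; a minimal sketch of overriding them in `setup()`, assuming the `edit_with_instructions.keymaps` option path shown in this commit's `config.lua` diff (verify against your installed version):

```lua
require("ogpt").setup({
  edit_with_instructions = {
    keymaps = {
      toggle_parameters = "<C-o>", -- opens the Parameters panel
      cycle_windows = "<Tab>",     -- Tab between panels, e.g. to the model field
    },
  },
})
```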

As with changing the model, add and delete parameters by using the keys "a" and "d", respectively.
![Additional Ollama Parameters](assets/images/addl_params.png)


## OGPT Enhancement from Original ChatGPT.nvim
+ [x] additional actions can be added to config options
+ [x] running `OGPTRun` shows telescope picker
+ [x] for `type="chat"` and `strategy="display"`, "r" and "a" can be used to replace the
  highlighted text or append after the highlighted text, respectively. Otherwise, "esc" or
  "ctrl-c" exits the popup
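Custom actions slot into the same config; a sketch of a `type = "chat"` action with `strategy = "display"`, modeled on the `code_completion` action in this commit's default config — the action name, system prompt, and model here are illustrative, not part of the plugin:

```lua
require("ogpt").setup({
  actions = {
    explain_code = { -- hypothetical action name; run with :OGPTRun explain_code
      type = "chat",
      opts = {
        system = "You are a helpful assistant. Explain the given code concisely.",
        strategy = "display", -- show output in a popup; "r" replaces, "a" appends
        params = {
          model = "deepseek-coder:6.7b",
        },
      },
    },
  },
})
```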


## Installation

`OGPT` is a Neovim plugin that lets you use the Ollama API effortlessly, generating natural
language responses from your Ollama models directly within the editor.

![preview image](https://github.com/jackMort/OGPT.nvim/blob/media/preview-2.png?raw=true)

- Make sure you have `curl` installed.
- Have a local instance of Ollama running.

@@ -90,8 +105,15 @@ model.

![preview image](https://github.com/jackmort/ChatGPT.nvim/blob/media/preview-3.png?raw=true)

#### `OGPTEditWithInstructions`
The `OGPTEditWithInstructions` command opens an interactive window to edit the selected text or the whole buffer using the `codellama:13b` model (Code Llama, fine-tuned for coding).
#### `OGPTRun edit_with_instructions`
The `OGPTRun edit_with_instructions` command opens an interactive window to edit the selected text
or the whole buffer using the `deepseek-coder:6.7b` model; you can change this in your config options.

#### `OGPTRun edit_code_with_instructions`
This command opens an interactive window to edit the selected text or the entire buffer using the
`deepseek-coder:6.7b` model. You can modify this in your config options. The Ollama response will
be parsed for its code content; if it doesn't contain any code block, it falls back to
the full response.

You can map it using the Lua API, e.g. using `which-key.nvim`:
```lua
@@ -164,12 +186,11 @@ The `edit` strategy shows the output side by side with the input, keeping it
available for further editing requests.
For now, the `edit` strategy is implemented for the `chat` type only.

The `display` strategy shows the output in a float window.

`append` and `replace` modify the text directly in the buffer.
The `display` strategy shows the output in a float window.
`append` and `replace` modify the text directly in the buffer with "a" or "r".
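These strategies are wired up by the built-in actions added to the default config in this commit; an excerpt-style sketch (field names follow this commit's `config.lua` diff):

```lua
-- sketch: two default actions, each pairing a strategy with a model
actions = {
  edit_with_instructions = {
    type = "chat",
    opts = {
      strategy = "edit", -- output side by side with input, open to further edits
      delay = true,
      params = { model = "mistral:7b" },
    },
  },
  edit_code_with_instructions = {
    type = "chat",
    opts = {
      strategy = "edit_code", -- like "edit", but extracts the code block from the response
      delay = true,
      params = { model = "deepseek-coder:6.7b" },
    },
  },
},
```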

### Interactive popup
When using `OGPT` and `OGPTEditWithInstructions`, the following
When using `OGPT`, the following
keybindings are available:
- `<C-Enter>` [Both] to submit.
- `<C-y>` [Both] to copy/yank last answer.
@@ -197,8 +218,8 @@ Add these to your [whichkey](https://github.com/folke/which-key.nvim) plugin map
```lua
c = {
name = "OGPT",
c = { "<cmd>OGPT<CR>", "OGPT" },
e = { "<cmd>OGPTEditWithInstruction<CR>", "Edit with instruction", mode = { "n", "v" } },
e = { "<cmd>OGPTRun edit_with_instructions<CR>", "Edit with instruction", mode = { "n", "v" } },
c = { "<cmd>OGPTRun edit_code_with_instructions<CR>", "Edit code with instruction", mode = { "n", "v" } },
g = { "<cmd>OGPTRun grammar_correction<CR>", "Grammar Correction", mode = { "n", "v" } },
t = { "<cmd>OGPTRun translate<CR>", "Translate", mode = { "n", "v" } },
k = { "<cmd>OGPTRun keywords<CR>", "Keywords", mode = { "n", "v" } },
Binary file added assets/images/addl_params.png
Binary file added assets/images/change_model.png
77 changes: 74 additions & 3 deletions lua/ogpt.lua
@@ -2,7 +2,13 @@
local api = require("ogpt.api")
local module = require("ogpt.module")
local config = require("ogpt.config")
local conf = require("telescope.config").values
local signs = require("ogpt.signs")
local pickers = require("telescope.pickers")
local Utils = require("ogpt.utils")
local Config = require("ogpt.config")
local action_state = require("telescope.actions.state")
local actions = require("telescope.actions")

local M = {}

@@ -26,6 +32,53 @@ M.setup = function(options)
signs.setup()
end

--- select Action
local finder = function(action_definitions)
return setmetatable({
close = function() end,
}, {
__call = function(_, prompt, process_result, process_complete)
for key, action_opts in pairs(action_definitions) do
process_result({
value = key,
display = key,
ordinal = key,
})
end
process_complete()
end,
})
end

function M.select_action(opts)
opts = opts or {}

local ActionFlow = require("ogpt.flows.actions")
local action_definitions = ActionFlow.read_actions()
pickers
.new(opts, {
sorting_strategy = "ascending",
layout_config = {
height = 0.5,
},
results_title = "Select Ollama action",
prompt_prefix = Config.options.popup_input.prompt,
selection_caret = Config.options.chat.answer_sign .. " ",
prompt_title = "actions",
finder = finder(action_definitions),
sorter = conf.generic_sorter(),
attach_mappings = function(prompt_bufnr)
actions.select_default:replace(function()
actions.close(prompt_bufnr)
local selection = action_state.get_selected_entry()
opts.cb(selection.display, selection.value)
end)
return true
end,
})
:find()
end

--
-- public methods for the plugin
--
@@ -34,16 +87,34 @@ M.openChat = function()
module.open_chat()
end

M.focusChat = function()
module.focus_chat()
end

M.selectAwesomePrompt = function()
module.open_chat_with_awesome_prompt()
end

M.edit_with_instructions = function()
module.edit_with_instructions()
M.edit_with_instructions = function(opts)
module.edit_with_instructions(nil, nil, nil, opts)
end

M.run_action = function(opts)
module.run_action(opts)
if opts.args == "" then
M.select_action({
cb = function(key, value)
local _opts = vim.tbl_extend("force", opts, {
args = key,
fargs = {
key,
},
})
module.run_action(_opts)
end,
})
else
module.run_action(opts)
end
end

M.complete_code = module.complete_code
14 changes: 8 additions & 6 deletions lua/ogpt/api.lua
@@ -11,12 +11,14 @@ function Api.completions(custom_params, cb)
Api.make_call(Api.COMPLETIONS_URL, params, cb)
end

function Api.chat_completions(custom_params, cb, should_stop)
function Api.chat_completions(custom_params, cb, should_stop, opts)
local params = vim.tbl_extend("keep", custom_params, Config.options.api_params)
local stream = params.stream or false
local ctx = {}
-- add params before conform
ctx.params = params
local _model = params.model
params.model = nil
if stream then
params = Utils.conform_to_ollama(params)
local raw_chunks = ""
@@ -30,7 +32,7 @@ function Api.chat_completions(custom_params, cb, should_stop)
"--silent",
"--show-error",
"--no-buffer",
Api.CHAT_COMPLETIONS_URL,
Utils.update_url_route(Api.CHAT_COMPLETIONS_URL, _model),
"-H",
"Content-Type: application/json",
"-H",
@@ -86,7 +88,7 @@ end

function Api.edits(custom_params, cb)
local params = vim.tbl_extend("keep", custom_params, Config.options.api_edit_params)
params.stream = false
params.stream = params.stream or false
Api.make_call(Api.CHAT_COMPLETIONS_URL, params, cb)
end

@@ -275,9 +277,9 @@ function Api.setup()
loadApiHost("OLLAMA_API_HOST", "OLLAMA_API_HOST", "api_host_cmd", function(value)
Api.OLLAMA_API_HOST = value
Api.MODELS_URL = ensureUrlProtocol(Api.OLLAMA_API_HOST .. "/api/tags")
Api.COMPLETIONS_URL = ensureUrlProtocol(Api.OLLAMA_API_HOST .. "/api/generate")
Api.CHAT_COMPLETIONS_URL = ensureUrlProtocol(Api.OLLAMA_API_HOST .. "/api/generate")
end, "http://localhost:11434/api/generate")
Api.COMPLETIONS_URL = ensureUrlProtocol(Api.OLLAMA_API_HOST .. "/")
Api.CHAT_COMPLETIONS_URL = ensureUrlProtocol(Api.OLLAMA_API_HOST .. "/")
end, "http://localhost:11434")

loadApiKey("OLLAMA_API_KEY", "OLLAMA_API_KEY", "api_key_cmd", function(value)
Api.OLLAMA_API_KEY = value
25 changes: 1 addition & 24 deletions lua/ogpt/common/preview_window.lua
@@ -4,30 +4,7 @@ local Config = require("ogpt.config")
local PreviewWindow = Popup:extend("PreviewWindow")

function PreviewWindow:init(options)
options = vim.tbl_deep_extend("keep", options or {}, {
position = 1,
size = {
width = "40%",
height = 10,
},
padding = { 1, 1, 1, 1 },
enter = true,
focusable = true,
zindex = 50,
border = {
style = "rounded",
},
buf_options = {
modifiable = false,
readonly = true,
filetype = "markdown",
},
win_options = {
wrap = true,
linebreak = true,
winhighlight = "Normal:Normal,FloatBorder:FloatBorder",
},
})
options = vim.tbl_deep_extend("keep", options or {}, Config.options.preview_window)

PreviewWindow.super.init(self, options)
end
82 changes: 72 additions & 10 deletions lua/ogpt/config.lua
@@ -18,7 +18,7 @@ function M.defaults()
toggle_diff = "<C-d>",
toggle_parameters = "<C-o>",
cycle_windows = "<Tab>",
use_output_as_input = "<C-i>",
use_output_as_input = "<C-u>",
},
},
chat = {
@@ -30,8 +30,8 @@
border_right_sign = "|",
max_line_length = 120,
sessions_window = {
active_sign = " ",
inactive_sign = " ",
active_sign = " 󰄵 ",
inactive_sign = " 󰄱 ",
current_line_sign = "",
border = {
style = "rounded",
@@ -139,22 +139,84 @@
winhighlight = "Normal:Normal,FloatBorder:FloatBorder",
},
},
preview_window = {
position = 1,
size = {
width = "40%",
height = 10,
},
padding = { 1, 1, 1, 1 },
enter = true,
focusable = true,
zindex = 50,
border = {
style = "rounded",
},
buf_options = {
modifiable = false,
readonly = true,
filetype = "markdown",
},
win_options = {
wrap = true,
linebreak = true,
winhighlight = "Normal:Normal,FloatBorder:FloatBorder",
},
},

api_params = {
model = "mistral:7b",
-- max_tokens = 300,
model = "mixtral-8-7b-moe-instruct-tgi-predictor-ai-factory",
temperature = 0.8,
top_p = 1,
-- n = 1,
top_p = 0.9,
},
api_edit_params = {
model = "codellama:13b",
model = "mistral:7b",
frequency_penalty = 0,
presence_penalty = 0,
temperature = 0.5,
top_p = 1,
-- n = 1,
top_p = 0.9,
},
use_openai_functions_for_edits = false,
actions = {

code_completion = {
type = "chat",
opts = {
system = [[You are a CoPilot; a tool that uses natural language processing (NLP)
techniques to generate and complete code based on user input. You help developers write code more quickly and efficiently by
generating boilerplate code or completing partially written code. Respond with only the resulting code snippet. This means:
1. Do not include the code context that was given
2. Only place comments in the code snippets
]],
strategy = "display",
params = {
model = "deepseek-coder:6.7b",
},
},
},

edit_code_with_instructions = {
type = "chat",
opts = {
strategy = "edit_code",
delay = true,
params = {
model = "deepseek-coder:6.7b",
},
},
},

edit_with_instructions = {
type = "chat",
opts = {
strategy = "edit",
delay = true,
params = {
model = "mistral:7b",
},
},
},
},
actions_paths = {},
show_quickfixes_cmd = "Trouble quickfix",
predefined_chat_gpt_prompts = "https://raw.githubusercontent.com/f/awesome-chatgpt-prompts/main/prompts.csv",
