Commit
Merge branch 'dev' of https://github.com/huynle/ogpt.nvim into dev
huynle committed Feb 6, 2024
2 parents 7ec31e2 + 7a9df3d commit d4395cf
Showing 4 changed files with 142 additions and 110 deletions.
145 changes: 91 additions & 54 deletions README.md
@@ -10,37 +10,12 @@
- **Mix-match Provider**: a default provider is used, but you can mix and match different providers AND specific models for different actions.
- **Interactive Q&A**: Engage in interactive question-and-answer sessions with the powerful gpt model (OGPT) using an intuitive interface.
- **Persona-based Conversations**: Explore various perspectives and have conversations with different personas by selecting prompts from Awesome ChatGPT Prompts.
- **Customizable Actions**: Execute a range of actions utilizing the gpt model, such as grammar
correction, translation, keyword generation, docstring creation, test addition, code
optimization, summarization, bug fixing, code explanation, and code readability
analysis. Additionally, you can define your own custom actions using a JSON file or just through
plugin configurations.
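
As a sketch of the configuration-based route (the field names mirror the `actions` examples later in this README; the `fix_spelling` action name is made up for illustration):

```lua
require("ogpt").setup({
  actions = {
    fix_spelling = {
      type = "popup",
      template = "Fix the spelling and grammar in the following text:\n\n{{input}}",
      strategy = "replace", -- write the answer back over the selection
      params = {
        temperature = 0.3,
      },
    },
  },
})
```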

For a comprehensive understanding of the extension's functionality, you can watch a plugin showcase [video](https://www.youtube.com/watch?v=7k0KZsheLP4)

## OGPT Specific Features:
+ [x] Use default provider, but can be overridden at any time for a specific action
+ [x] original functionality of ChatGPT.nvim to work with Ollama, TextGenUI (huggingface), OpenAI via `providers`
  + Look at the "default_provider" in the `config.lua`; default is `ollama`
  + look at "providers" for the provider default options
+ [x] Choose different provider and model for "edit" and "chat"
+ [x] Custom settings per session
+ [x] Add/remove parameters in Chat and Edit
+ [x] Choose provider, as well as model for Chat and Edit
+ [x] Customizable actions, with specific provider and model
+ [ ] Another Windows for [Template](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#template), [System](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#system)
+ [x] Framework to add more providers
+ [x] clean up documentation
+ [x] additional actions can be added to config options, or additional json. Look in "config.actions", and "config.actions_paths"
+ [x] running `OGPTRun` shows telescope picker
+ [x] for `type="popup"` and `strategy="display" -- or append, prepend, replace, quick_fix`, "r" and "a" can be used to "replace the highlighted text" or "append after the highlighted text", respectively. Otherwise, "esc" or "ctrl-c" would exit the popup. You can update the mapping in your config options.
+ [x] model alias for each provider


## Installation

@@ -72,7 +47,7 @@ your endpoint.

## Configuration

`OGPT.nvim` comes with the following defaults; you can override any of the fields by passing a config table to the setup function.

https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua
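
For example, a minimal override might look like this (a sketch; only the fields you pass change, everything else keeps the defaults linked above):

```lua
require("ogpt").setup({
  default_provider = "ollama",
  providers = {
    ollama = {
      api_host = os.getenv("OLLAMA_API_HOST"),
      model = "mistral:7b",
    },
  },
})
```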

@@ -101,8 +76,6 @@ Plugin exposes following commands:
#### `OGPTActAs`
`OGPTActAs` opens a prompt selection from [Awesome ChatGPT Prompts](https://github.com/f/awesome-chatgpt-prompts) to be used with the `mistral:7b` model.


#### `OGPTRun edit_with_instructions`
Opens an interactive window to edit the selected text or the whole buffer using the `deepseek-coder:6.7b` model. You
can change this in your config options; the model is defined in `config.api_edit_params`.
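
Per the note above, swapping the edit model would be a config change along these lines (a sketch; the `temperature` value is illustrative):

```lua
require("ogpt").setup({
  api_edit_params = {
    model = "deepseek-coder:6.7b",
    temperature = 0.2,
  },
})
```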
@@ -121,7 +94,7 @@ specific action.
##### Using Actions.json
`OGPTRun [action]` runs specific actions -- see [`actions.json`](./lua/ogpt/flows/actions/actions.json) for a detailed list of the available actions.

It is possible to define custom actions using a JSON file. Please see the example at [`actions.json`](./lua/ogpt/flows/actions/) for reference. The path to custom actions can be configured (see `actions_paths` field in the config example above).
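
A sketch of pointing the plugin at an extra actions file (the path below is illustrative):

```lua
require("ogpt").setup({
  -- each entry is a JSON file containing additional action definitions
  actions_paths = { vim.fn.expand("~/.config/nvim/ogpt-actions.json") },
})
```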

An example of a custom action may look like this (`#` marks comments):
```python
@@ -147,7 +120,7 @@
}
```
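
A complete entry in such a file might look roughly like this (a sketch; the action name is made up, and the fields mirror the Lua `actions` examples below):

```python
{
  "translate_to_english": {
    "type": "popup",
    "template": "Translate the following text to English:\n\n{{input}}",
    "strategy": "display",
    "params": {
      "temperature": 0.3  # illustrative parameter
    }
  }
}
```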

##### Using Configuration Options (preferred)
```lua

--- config options lua
@@ -182,6 +155,16 @@ available for further editing requests
The `display` strategy shows the output in a float window.
`append` and `replace` modify the text directly in the buffer with "a" or "r".
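
A sketch contrasting the strategies (action names and templates are illustrative):

```lua
actions = {
  explain_code = {
    type = "popup",
    template = "Explain the following code:\n\n{{input}}",
    strategy = "display", -- show the answer in a float window
  },
  add_docstring = {
    type = "popup",
    template = "Add a docstring to the following code:\n\n{{input}}",
    strategy = "prepend", -- insert the answer before the selection
  },
}
```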

### Interactive Chat Parameters

* Change the model by opening the Parameter panel (defaults to `<C-o>`), or `<Tab>` your way to it,
then press Enter (`<cr>`) on the model field to change it. It should list all the available models
from your LLM provider.
![Change Model](assets/images/change_model.png)

* In the **Parameter** panel, add and delete parameters by using the keys "a" and "d", respectively
![Additional Parameters](assets/images/addl_params.png)

### Interactive Chat Popup
When using `OGPT`, the following
keybindings are available under `config.chat.keymaps`
@@ -198,7 +181,7 @@ https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua#L174-L181
When the setting window is opened (with `<C-o>`), settings can be modified by
pressing `Enter` on the related config. Settings are saved across sessions.

### Example Comprehensive Lazy Configuration


```lua
@@ -266,25 +249,38 @@ return {

opts = {
default_provider = "ollama",
-- default edgy flag
-- set this to true if you prefer to use edgy.nvim (https://github.com/folke/edgy.nvim) instead of floating windows
edgy = false,
providers = {
ollama= {
api_host = os.getenv("OLLAMA_API_HOST"),
-- default model
model = "mistral:7b",
-- model definitions
models = {
-- alias to actual model name, helpful to define same model name across multiple providers
coder = "deepseek-coder:6.7b",
-- nested alias
cool_coder = "coder",
general_model = "mistral:7b",
custom_coder = {
name = "deepseek-coder:6.7b",
modify_url = function(url)
-- completely modify the URL of a model, if necessary. This function is called
-- right before making the REST request
return url
end,
-- custom conform function
-- custom conform function. Each provider has a dedicated conform function where all
-- of the OGPT chat info is passed in to be massaged into the correct format that the
-- provider is expecting. This function, if provided, will override the provider's
-- default conform function.
-- conform_fn = function(ogpt_params)
-- return provider_specific_params
-- end,
},
},
-- default model params for all 'actions'
api_params = {
model = "mistral:7b",
temperature = 0.8,
@@ -333,6 +329,7 @@ return {
},
yank_register = "+",
edit = {
edgy = nil, -- use global default, override if defined
diff = false,
keymaps = {
close = "<C-c>",
@@ -344,16 +341,41 @@
},
},
popup = {
edgy = nil, -- use global default, override if defined
position = 1,
size = {
width = "40%",
height = 10,
},
padding = { 1, 1, 1, 1 },
enter = true,
focusable = true,
zindex = 50,
border = {
style = "rounded",
},
buf_options = {
modifiable = false,
readonly = false,
filetype = "ogpt-popup",
syntax = "markdown",
},
win_options = {
wrap = true,
linebreak = true,
winhighlight = "Normal:Normal,FloatBorder:FloatBorder",
},
keymaps = {
close = { "<C-c>", "q" },
accept = "<C-CR>",
append = "a",
prepend = "p",
yank_code = "c",
yank_to_register = "y",
},
},
chat = {
edgy = nil, -- use global default, override if defined
welcome_message = WELCOME_MESSAGE,
loading_text = "Loading, please wait ...",
question_sign = "", -- 🙂
@@ -399,6 +421,7 @@ return {
},
},

-- {{input}} is always available as the selected/highlighted text
actions = {
grammar_correction = {
type = "popup",
@@ -436,7 +459,7 @@ return {
template = "Extract the main keywords from the following text to be used as document tags.\n\n```{{input}}```",
strategy = "display",
params = {
model = "general_model", -- use of model alias, generally, this model alias should be available to all providers in use
temperature = 0.5,
frequency_penalty = 0.8,
},
@@ -456,6 +479,7 @@ return {
quick_question = {
type = "popup",
args = {
-- template expansion
question = {
type = "string",
optional = "true",
@@ -508,8 +532,8 @@ return {

### Advanced setup

`config.params.model`, `api_params.model`, and `api_chat_params.model` can take a table instead
of a string.

#### Modify model REST URL

```lua
-- advanced model, can take the following structure
-- (the body below mirrors the `custom_coder` entry shown earlier in this README)
local advanced_model = {
  -- the actual model name known to the provider
  name = "deepseek-coder:6.7b",
  modify_url = function(url)
    -- completely modify the URL of a model, if necessary. This function is called
    -- right before making the REST request
    return url
  end,
  -- custom conform function; if provided, overrides the provider default
  -- conform_fn = function(ogpt_params)
  --   return provider_specific_params
  -- end,
}
```

After defining the advanced model, you can place it directly where the model was previously
specified in the configuration.
#### Modify Conform Function
TBD
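
While this section is TBD, the commented-out `conform_fn` in the configuration above suggests the rough shape (a hedged sketch; the fields read from `ogpt_params` are assumptions, not a documented API):

```lua
custom_coder = {
  name = "deepseek-coder:6.7b",
  -- if provided, this overrides the provider's default conform function; it receives
  -- OGPT's chat info and must return the payload the provider expects
  conform_fn = function(ogpt_params)
    local provider_specific_params = {
      model = ogpt_params.model,
      messages = ogpt_params.messages, -- assumed field name
      options = {
        temperature = ogpt_params.temperature,
      },
    }
    return provider_specific_params
  end,
}
```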





```lua
opts = {
...
api_params = {
-- so now all actions will use this model, unless explicitly overridden in the action itself
model = advanced_model,
temperature = 0.8,
top_p = 0.9,
},
}
```

# Credits
Thank you to the author of `jackMort/ChatGPT.nvim` for creating a seamless framework
to interact with OGPT in neovim!

Buy Me a Coffee
[!["Buy Me A Coffee"](https://www.buymeacoffee.com/assets/img/custom_images/orange_img.png)](https://www.buymeacoffee.com/huynle)

86 changes: 32 additions & 54 deletions lua/ogpt/api.lua
@@ -100,7 +100,11 @@ function Api:edits(custom_params, cb)
self:chat_completions(params, cb)
end

function Api:make_call(url, params, cb, ctx, raw_chunks, state)
ctx = ctx or {}
raw_chunks = raw_chunks or ""
state = state or "START"

TMP_MSG_FILENAME = os.tmpname()
local f = io.open(TMP_MSG_FILENAME, "w+")
if f == nil then
@@ -135,35 +139,34 @@ end
end

local result = table.concat(response:result(), "\n")
local ok, json = pcall(vim.json.decode, result)
if ok then
if json.error ~= nil then
local error_msg = {
"OGPT ERROR:",
self.provider.name,
vim.inspect(json.error) or "",
"Something went wrong.",
}
table.insert(error_msg, vim.inspect(params))
-- local error_msg = "OGPT ERROR: " .. (json.error.message or "Something went wrong")
cb(table.concat(error_msg, " "), "ERROR", ctx)
return
end
ctx, raw_chunks, state = self.provider.process_line({ json = json, raw = result }, ctx, raw_chunks, state, cb)
return
end

for line in result:gmatch("[^\n]+") do
local raw_json = string.gsub(line, "^data:", "")
local _ok, _json = pcall(vim.json.decode, raw_json)
if _ok then
ctx, raw_chunks, state =
self.provider.process_line({ json = _json, raw = line }, ctx, raw_chunks, state, cb)
else
ctx, raw_chunks, state =
self.provider.process_line({ json = _json, raw = line }, ctx, raw_chunks, state, cb)
end
end
end),
@@ -246,31 +249,6 @@ local function loadApiKey(envName, configName, optionName, callback, defaultValu
end
end

local function loadAzureConfigs()
loadApiKey("OPENAI_API_BASE", "OPENAI_API_BASE", "azure_api_base_cmd", function(value)
self.OPENAI_API_BASE = value
end)
loadApiKey("OPENAI_API_AZURE_ENGINE", "OPENAI_API_AZURE_ENGINE", "azure_api_engine_cmd", function(value)
self.OPENAI_API_AZURE_ENGINE = value
end)
loadApiHost("OPENAI_API_AZURE_VERSION", "OPENAI_API_AZURE_VERSION", "azure_api_version_cmd", function(value)
self.OPENAI_API_AZURE_VERSION = value
end, "2023-05-15")

if Api["OPENAI_API_BASE"] and Api["OPENAI_API_AZURE_ENGINE"] then
self.COMPLETIONS_URL = self.OPENAI_API_BASE
.. "/openai/deployments/"
.. self.OPENAI_API_AZURE_ENGINE
.. "/completions?api-version="
.. self.OPENAI_API_AZURE_VERSION
self.CHAT_COMPLETIONS_URL = self.OPENAI_API_BASE
.. "/openai/deployments/"
.. self.OPENAI_API_AZURE_ENGINE
.. "/chat/completions?api-version="
.. self.OPENAI_API_AZURE_VERSION
end
end

local function startsWith(str, start)
return string.sub(str, 1, string.len(start)) == start
end