Adding support for Google Gemini #19

Merged: 106 commits, merged Feb 16, 2024

Commits
2704179
Merge branch 'main' of https://github.com/huynle/ogpt.nvim
huynle Jan 3, 2024
1abb5ab
Merge branch 'main' of https://github.com/huynle/ogpt.nvim
huynle Jan 4, 2024
a47a732
adding textgenui
huynle Jan 8, 2024
1763a99
things are working
huynle Jan 9, 2024
497e307
update to take on textgenui API, and allow for custom functions to ru…
huynle Jan 9, 2024
28016b2
updating doc
huynle Jan 9, 2024
fc15161
more updates
huynle Jan 9, 2024
a040ea8
more updates
huynle Jan 9, 2024
995bc04
getting things updated
huynle Jan 9, 2024
dfd1f74
updating
huynle Jan 9, 2024
7925c14
Merge branch 'dev' of github.com:huynle/ogpt.nvim into dev
huynle Jan 11, 2024
19b3811
updating to have textgenui
huynle Jan 11, 2024
eb86381
updating readme
huynle Jan 11, 2024
b99311d
adding support for openai
huynle Jan 19, 2024
3b0e2c5
Adding support for OpenAI
huynle Jan 19, 2024
24ed496
updating readme
huynle Jan 19, 2024
2bd9c98
merged master
huynle Jan 19, 2024
0261260
Merge branch 'dev' of https://github.com/huynle/ogpt.nvim into dev
huynle Jan 22, 2024
0419dd1
Updating so error filetype can be used in edit
huynle Jan 25, 2024
b4b567e
Merge branch 'dev' of https://github.com/huynle/ogpt.nvim into dev
huynle Jan 25, 2024
0bd599b
updating to actions to use edit and popup
huynle Jan 25, 2024
515b357
saving
huynle Jan 25, 2024
f9a0585
overhaul, getting ready to allow to allow each action to have their o…
huynle Jan 26, 2024
b3f5704
chat now works with different providers
huynle Jan 27, 2024
cddfd29
chat is now working again
huynle Jan 27, 2024
c557d4a
cleaned up and ready for testing
huynle Jan 28, 2024
92bf185
cleaned up and ready for testing
huynle Jan 28, 2024
6193f34
overhaul complete
huynle Jan 28, 2024
23bb26f
more cleaning
huynle Jan 28, 2024
d8fe8b6
more updates
huynle Jan 29, 2024
295867e
Merge branch 'dev' of https://github.com/huynle/ogpt.nvim into dev
huynle Jan 29, 2024
1eef311
updating textgen
huynle Jan 29, 2024
8fa1e0a
fixing envs
huynle Jan 29, 2024
a3212cf
Merge branch 'dev' of github.com:huynle/ogpt.nvim into dev
huynle Jan 30, 2024
810c086
adding in simple_window, to allow for options to move away for NUI li…
huynle Jan 31, 2024
08553a1
Merge branch 'dev' of https://github.com/huynle/ogpt.nvim into dev
huynle Jan 31, 2024
fddc14d
should be able to load without actions
huynle Feb 1, 2024
fa6fa42
Merge branch 'dev' of github.com:huynle/ogpt.nvim into dev
huynle Feb 2, 2024
8062ea4
making changes to work with edgy
huynle Feb 3, 2024
4f13717
session was not being saved
huynle Feb 3, 2024
dd7baf1
fixed history issue with openai
huynle Feb 3, 2024
7c0da2f
forcing syntax on windows
huynle Feb 3, 2024
0492eb8
edgy is kinda working, need to fix parameters
huynle Feb 4, 2024
f7abfdf
toggling parameters panel is working again
huynle Feb 4, 2024
a4c0e71
all things should be working again
huynle Feb 4, 2024
4817027
update to make ogpt-input window have the correct filetype
huynle Feb 4, 2024
c6223de
Merge branch 'dev' of github.com:huynle/ogpt.nvim into dev
huynle Feb 4, 2024
c0c1577
fixing sessions
huynle Feb 4, 2024
9b2faa9
fixing sessions
huynle Feb 4, 2024
02ca357
Merge branch 'dev' of github.com:huynle/ogpt.nvim into dev
huynle Feb 4, 2024
5c8a186
updating docs
huynle Feb 6, 2024
7a9df3d
updating docs
huynle Feb 6, 2024
7ec31e2
disable filetype for edgy no true
huynle Feb 6, 2024
d4395cf
Merge branch 'dev' of https://github.com/huynle/ogpt.nvim into dev
huynle Feb 6, 2024
fda6061
updating for single window option so workspace doesnt get so cluttered
huynle Feb 7, 2024
a8b9ccc
Merge branch 'dev' of github.com:huynle/ogpt.nvim into dev
huynle Feb 7, 2024
c62822d
updating textgenui
huynle Feb 7, 2024
a1a71ed
Merge branch 'dev' of https://github.com/huynle/ogpt.nvim into dev
huynle Feb 7, 2024
8ba0fdc
pulled in dev
huynle Feb 7, 2024
9efd268
Merge branch 'dev' into textgenui
huynle Feb 7, 2024
e32f6ab
Merge branch 'textgenui' into dev
huynle Feb 7, 2024
813cd88
udpating params
huynle Feb 7, 2024
f879caf
fixing
huynle Feb 8, 2024
e66d853
make bufs unique
huynle Feb 8, 2024
17c81db
fixing up template
huynle Feb 9, 2024
f23a36e
getting parameters panel to update on session change
huynle Feb 9, 2024
35e3f8a
generalizing window
huynle Feb 9, 2024
55cd75b
more cleaning, removed selection panel
huynle Feb 9, 2024
3645fef
adding doc
huynle Feb 9, 2024
a681d60
updating documentation
huynle Feb 9, 2024
6ad8b2a
adding gemini
huynle Feb 9, 2024
58d906b
updating onchange
huynle Feb 9, 2024
8492414
Merge branch 'dev' into gemini
huynle Feb 9, 2024
35c44ae
updating provider to class
huynle Feb 10, 2024
25f21a4
saving
huynle Feb 10, 2024
92f7ff2
got structure of params content update to support gemini
huynle Feb 11, 2024
0638f8b
stable
huynle Feb 11, 2024
8a53132
gemini is working
huynle Feb 11, 2024
36b6003
trying to get response object to work?
huynle Feb 12, 2024
5f35be8
Merge branch 'gemini' of https://github.com/huynle/ogpt.nvim into gemini
huynle Feb 12, 2024
2af9109
Textgenui is working again?
huynle Feb 12, 2024
6bd426d
things are working agian?
huynle Feb 12, 2024
b69c7ad
things are almost working, streaming is not decoding correctly.
huynle Feb 12, 2024
5f852e5
ollama and textgenui are working
huynle Feb 12, 2024
cb4f389
things are working
huynle Feb 13, 2024
088a641
update to have gemini
huynle Feb 13, 2024
13394ba
things are generally working
huynle Feb 13, 2024
ecfbcb4
things are working, but ollama is not streaming
huynle Feb 13, 2024
eee4fdd
saving
huynle Feb 14, 2024
f36e9a6
Merge branch 'gemini' of github.com:huynle/ogpt.nvim into gemini
huynle Feb 14, 2024
9efcbfe
gemini is streaming!
huynle Feb 14, 2024
17fb65c
gemini chunking now working
huynle Feb 14, 2024
31f3342
fixing log levels
huynle Feb 14, 2024
c7724c1
pull in origin
huynle Feb 14, 2024
fe1932b
cleaned up partial
huynle Feb 14, 2024
e84c53d
cleaning, and trying to get stream = false
huynle Feb 14, 2024
3769ab1
cleaning
huynle Feb 15, 2024
74a0669
Merge branch 'gemini' of github.com:huynle/ogpt.nvim into gemini
huynle Feb 15, 2024
0dcff79
getting the response object to work correctly, for future code comple…
huynle Feb 15, 2024
79e0d8f
Merge branch 'gemini' of https://github.com/huynle/ogpt.nvim into gemini
huynle Feb 15, 2024
d335a0d
updating
huynle Feb 15, 2024
c0f93cd
fixing response
huynle Feb 16, 2024
e6432eb
simplying gemini parsing
huynle Feb 16, 2024
8188880
more docs
huynle Feb 16, 2024
f5ce006
updating doc
huynle Feb 16, 2024
7359ea4
merged main
huynle Feb 16, 2024
459 changes: 419 additions & 40 deletions README.md

Large diffs are not rendered by default.

Binary file added assets/images/edgy-example.png.png
174 changes: 82 additions & 92 deletions lua/ogpt/api.lua
@@ -3,104 +3,88 @@ local Config = require("ogpt.config")
local logger = require("ogpt.common.logger")
local Object = require("ogpt.common.object")
local utils = require("ogpt.utils")
local Response = require("ogpt.response")

local Api = Object("Api")

Api.STATE_COMPLETED = "COMPLETED"

function Api:init(provider, action, opts)
self.opts = opts
self.provider = provider
self.action = action
end

function Api:completions(custom_params, cb)
function Api:completions(custom_params, cb, opts)
-- TODO: not working atm
local params = vim.tbl_extend("keep", custom_params, Config.options.api_params)
params.stream = false
self:make_call(self.COMPLETIONS_URL, params, cb)
self:make_call(self.COMPLETIONS_URL, params, cb, opts)
end

function Api:chat_completions(custom_params, partial_result_fn, should_stop, opts)
local stream = custom_params.stream or false
local params, _completion_url = Config.expand_model(self, custom_params)
function Api:chat_completions(response, inputs)
local custom_params = inputs.custom_params
local partial_result_fn = inputs.partial_result_fn
local should_stop = inputs.should_stop or function() end

-- local stream = custom_params.stream or false
local params, _completion_url, ctx = self.provider:expand_model(custom_params)

local ctx = {}
ctx.params = params
ctx.provider = self.provider.name
ctx.model = custom_params.model
utils.log("Request to: " .. _completion_url)
utils.log(params)
response.ctx = ctx
response.rest_params = params
response.partial_result_cb = partial_result_fn
response:run_async()

if stream then
local raw_chunks = ""
local state = "START"

partial_result_fn = vim.schedule_wrap(partial_result_fn)

self:exec(
"curl",
{
"--silent",
"--show-error",
"--no-buffer",
_completion_url,
"-H",
"Content-Type: application/json",
"-H",
self.provider.envs.AUTHORIZATION_HEADER,
"-d",
vim.json.encode(params),
},
function(chunk)
local ok, json = pcall(vim.json.decode, chunk)
if ok then
if json.error ~= nil then
local error_msg = {
"OGPT ERROR:",
self.provider.name,
vim.inspect(json.error) or "",
"Something went wrong.",
}
table.insert(error_msg, vim.inspect(params))
-- local error_msg = "OGPT ERROR: " .. (json.error.message or "Something went wrong")
partial_result_fn(table.concat(error_msg, " "), "ERROR", ctx)
return
end
ctx, raw_chunks, state =
self.provider.process_line({ json = json, raw = chunk }, ctx, raw_chunks, state, partial_result_fn)
return
end
local on_complete = inputs.on_complete or function()
response:set_state(response.STATE_COMPLETED)
end
local on_start = inputs.on_start
or function()
-- utils.log("Start Exec of: Curl " .. vim.inspect(curl_args), vim.log.levels.DEBUG)
response:set_state(response.STATE_INPROGRESS)
end
local on_error = inputs.on_error
or function(msg)
-- utils.log("Error running curl: " .. msg or "", vim.log.levels.ERROR)
response:set_state(response.STATE_ERROR)
end
local on_stop = inputs.on_stop or function()
response:set_state(response.STATE_STOPPED)
end

for line in chunk:gmatch("[^\n]+") do
local raw_json = string.gsub(line, "^data:", "")
local _ok, _json = pcall(vim.json.decode, raw_json)
if _ok then
ctx, raw_chunks, state =
self.provider.process_line({ json = _json, raw = line }, ctx, raw_chunks, state, partial_result_fn)
else
ctx, raw_chunks, state =
self.provider.process_line({ json = _json, raw = line }, ctx, raw_chunks, state, partial_result_fn)
end
end
end,
function(err, _)
partial_result_fn(err, "ERROR", ctx)
end,
should_stop,
function()
partial_result_fn(raw_chunks, "END", ctx)
end
)
else
params.stream = false
self:make_call(self.provider.envs.CHAT_COMPLETIONS_URL, params, partial_result_fn)
-- if params.stream then
-- local accumulate = {}
local curl_args = {
"--silent",
"--show-error",
"--no-buffer",
_completion_url,
"-d",
vim.json.encode(params),
}
for _, header_item in ipairs(self.provider:request_headers()) do
table.insert(curl_args, header_item)
end
end

function Api:edits(custom_params, cb)
local params = self.action.params
params.stream = true
params = vim.tbl_extend("force", params, custom_params)
self:chat_completions(params, cb)
self:exec("curl", curl_args, on_start, function(chunk)
response:add_chunk(chunk)
end, on_complete, on_error, on_stop, should_stop)
end

function Api:make_call(url, params, cb, ctx, raw_chunks, state)
-- function Api:edits(custom_params, cb)
-- local params = self.action.params
-- params.stream = true
-- params = vim.tbl_extend("force", params, custom_params)
-- self:chat_completions(params, cb)
-- end

function Api:make_call(url, params, cb, ctx, raw_chunks, state, opts)
-- TODO: to be deprecated
ctx = ctx or {}
raw_chunks = raw_chunks or ""
state = state or "START"
@@ -116,12 +100,9 @@ function Api:make_call(url, params, cb, ctx, raw_chunks, state)

local curl_args = {
url,
"-H",
"Content-Type: application/json",
"-H",
self.provider.envs.AUTHORIZATION_HEADER,
"-d",
"@" .. TMP_MSG_FILENAME,
table.unpack(self.provider:request_headers()),
}

self.job = job
@@ -135,7 +116,7 @@ function Api:make_call(url, params, cb, ctx, raw_chunks, state)
"An Error Occurred, when calling `curl " .. table.concat(curl_args, " ") .. "`",
vim.log.levels.ERROR
)
cb("ERROR: API Error")
cb("ERROR: API Error", "ERROR")
end

local result = table.concat(response:result(), "\n")
@@ -154,7 +135,8 @@ function Api:make_call(url, params, cb, ctx, raw_chunks, state)
cb(table.concat(error_msg, " "), "ERROR", ctx)
return
end
ctx, raw_chunks, state = self.provider.process_line({ json = json, raw = result }, ctx, raw_chunks, state, cb)
ctx, raw_chunks, state =
self.provider:process_line({ json = json, raw = result }, ctx, raw_chunks, state, cb, opts)
return
end

@@ -163,10 +145,10 @@ function Api:make_call(url, params, cb, ctx, raw_chunks, state)
local _ok, _json = pcall(vim.json.decode, raw_json)
if _ok then
ctx, raw_chunks, state =
self.provider.process_line({ json = _json, raw = line }, ctx, raw_chunks, state, cb)
self.provider:process_line({ json = _json, raw = line }, ctx, raw_chunks, state, cb, opts)
else
ctx, raw_chunks, state =
self.provider.process_line({ json = _json, raw = line }, ctx, raw_chunks, state, cb)
self.provider:process_line({ json = _json, raw = line }, ctx, raw_chunks, state, cb, opts)
end
end
end),
@@ -261,24 +243,29 @@ local function ensureUrlProtocol(str)
return "https://" .. str
end

function Api:exec(cmd, args, on_stdout_chunk, on_complete, should_stop, on_stop)
local stdout = vim.loop.new_pipe()
function Api:exec(cmd, args, on_start, on_stdout_chunk, on_complete, on_error, on_stop, should_stop)
local stderr = vim.loop.new_pipe()
local stdout = vim.loop.new_pipe()
local stderr_chunks = {}

local handle, err
local function on_stdout_read(_, chunk)
if chunk then
vim.schedule(function()
if should_stop and should_stop() then
if should_stop() then
if handle ~= nil then
handle:kill(2) -- send SIGINT
stdout:close()
stderr:close()
handle:close()
pcall(function()
stdout:close()
end)
pcall(function()
stderr:close()
end)
pcall(function()
handle:close()
end)
on_stop()
end
return
end
on_stdout_chunk(chunk)
end)
@@ -291,6 +278,7 @@ function Api:exec(cmd, args, on_stdout_chunk, on_complete, should_stop, on_stop)
end
end

on_start()
handle, err = vim.loop.spawn(cmd, {
args = args,
stdio = { nil, stdout, stderr },
@@ -303,13 +291,15 @@

vim.schedule(function()
if code ~= 0 then
on_complete(vim.trim(table.concat(stderr_chunks, "")))
on_error()
else
on_complete()
end
end)
end)

if not handle then
on_complete(cmd .. " could not be started: " .. err)
on_error(cmd .. " could not be started: " .. err)
else
stdout:read_start(on_stdout_read)
stderr:read_start(on_stderr_read)
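The refactor in this diff replaces the old threaded `ctx`/`raw_chunks`/`state` callback plumbing with a `Response` object that owns streaming state, plus per-call lifecycle hooks. A minimal usage sketch of the new interface is below; the method names (`set_state`, `add_chunk`, `run_async`) and input keys follow what is visible in the diff, but the surrounding setup (how `provider`, `action`, and `messages` are obtained) is assumed, not shown in this PR.

```lua
-- Hypothetical caller of the refactored streaming API (names outside the
-- diff are assumptions for illustration).
local Api = require("ogpt.api")
local Response = require("ogpt.response")

local api = Api(provider, action, {}) -- provider/action come from plugin setup
local response = Response()          -- accumulates streamed chunks via add_chunk

api:chat_completions(response, {
  custom_params = { model = "gemini-pro", messages = messages },
  partial_result_fn = function(text, state, ctx)
    -- invoked as decoded chunks arrive; update the chat buffer here
  end,
  should_stop = function()
    return false -- return true to SIGINT the curl process
  end,
  -- on_start/on_complete/on_error/on_stop are optional; the defaults in
  -- chat_completions just move response through its state machine
  -- (STATE_INPROGRESS -> STATE_COMPLETED / STATE_ERROR / STATE_STOPPED).
})
```

Note the design shift: `Api:exec` no longer overloads `on_complete` for errors; a failing exit code or spawn failure now routes to `on_error`, which keeps the `Response` state machine unambiguous.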
2 changes: 2 additions & 0 deletions lua/ogpt/common/popup.lua
@@ -9,6 +9,8 @@ function Popup:init(options, edgy)
if options.edgy and options.border or edgy then
self.edgy = true
options.border = nil
else
options.buf_options.filetype = nil
end
Popup.super.init(self, options)
end