
Gemini Completion feature #12

Draft: wants to merge 6 commits into base: main
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,7 @@
+## [0.4.4] - 2024-10-31
+- Define the Ruby version for the gem
+- Add Completion feature for Gemini
+
 ## [0.2.0] - 2023-10-22
 - Add image generation abilities

3 changes: 2 additions & 1 deletion Gemfile.lock
@@ -1,7 +1,7 @@
PATH
remote: .
specs:
gen-ai (0.4.3)
gen-ai (0.4.4)
activesupport (~> 7.1)
faraday (~> 2.7)
faraday-multipart (~> 1.0)
@@ -143,6 +143,7 @@ GEM
 PLATFORMS
   arm64-darwin-22
   arm64-darwin-23
+  x86_64-darwin-22
   x86_64-linux

 DEPENDENCIES
19 changes: 15 additions & 4 deletions lib/gen_ai/language/gemini.rb
@@ -21,29 +21,40 @@ def initialize(token:, options: {})
         )
       end

-      def complete(prompt, options = {}); end
+      def complete(prompt, options = {})
+        response = @client.generate_content(generate_completion_options(prompt, options))
+
+        build_result(model: model(options), raw: response, parsed: extract_completions(response))
+      end

       def chat(messages, options = {}, &block)
         if block_given?
           response = @client.stream_generate_content(
-            generate_options(messages, options), server_sent_events: true, &chunk_process_block(block)
+            generate_chat_options(messages, options), server_sent_events: true, &chunk_process_block(block)
           )
           build_result(model: model(options), raw: response.first, parsed: extract_completions(response).flatten)
         else
-          response = @client.generate_content(generate_options(messages, options))
+          response = @client.generate_content(generate_chat_options(messages, options))
           build_result(model: model(options), raw: response, parsed: extract_completions(response))
         end
       end

       private

-      def generate_options(messages, options)
+      def generate_chat_options(messages, options)
         {
           contents: format_messages(messages),
           generationConfig: options.except(:model)
         }
       end

+      def generate_completion_options(prompt, options)
+        {
+          contents: [{ role: DEFAULT_ROLE, parts: [text: prompt] }],
+          generationConfig: options.except(:model)
+        }
+      end
+
       def model(options)
         options[:model] || COMPLETION_MODEL
       end
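The shape of the request payload the new `complete` path sends can be sketched as a small standalone snippet. This mirrors `generate_completion_options` from the diff above; `DEFAULT_ROLE = 'user'` is an assumption (the constant is defined elsewhere in the gem), and `Hash#except` is core Ruby 3.0+, so no ActiveSupport is needed for this sketch.

```ruby
# Standalone sketch of the payload built by the new completion path.
# DEFAULT_ROLE is assumed to be 'user'; in the gem it is defined elsewhere.
DEFAULT_ROLE = 'user'

def generate_completion_options(prompt, options)
  {
    # A single-turn conversation: one user message containing the prompt.
    contents: [{ role: DEFAULT_ROLE, parts: [text: prompt] }],
    # :model is used to pick the endpoint, not sent inside the payload.
    generationConfig: options.except(:model)
  }
end

payload = generate_completion_options('Hello', { model: 'gemini-pro', temperature: 0.5 })
# payload[:contents]         => [{ role: 'user', parts: [{ text: 'Hello' }] }]
# payload[:generationConfig] => { temperature: 0.5 }
```

Note that unlike `generate_chat_options`, which runs the message history through `format_messages`, the completion path wraps the single prompt string directly.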
2 changes: 1 addition & 1 deletion lib/gen_ai/version.rb
@@ -1,5 +1,5 @@
 # frozen_string_literal: true

 module GenAI
-  VERSION = '0.4.3'
+  VERSION = '0.4.4'
 end
105 changes: 105 additions & 0 deletions spec/fixtures/cassettes/gemini/language/complete_default_prompt.yml


30 changes: 30 additions & 0 deletions spec/language/gemini/completion_spec.rb
@@ -0,0 +1,30 @@
+# frozen_string_literal: true
+
+RSpec.describe GenAI::Language do
+  describe 'Gemini' do
+    describe '#completion' do
+      let(:provider) { :gemini }
+      let(:token) { ENV['API_ACCESS_TOKEN'] || 'FAKE_TOKEN' }
+      let(:instance) { described_class.new(provider, token) }
+      let(:cassette) { 'gemini/language/complete_default_prompt' }
+
+      subject { instance.complete('Hello') }
+
+      it 'returns completions' do
+        VCR.use_cassette(cassette) do
+          expect(subject).to be_a(GenAI::Result)
+
+          expect(subject.provider).to eq(:gemini)
+          expect(subject.model).to eq('gemini-pro')
+
+          expect(subject.value).to eq('Hi there! How can I assist you today?')
+          expect(subject.values).to eq(['Hi there! How can I assist you today?'])
+
+          expect(subject.prompt_tokens).to eq(nil)
+          expect(subject.completion_tokens).to eq(nil)
+          expect(subject.total_tokens).to eq(nil)
+        end
+      end
+    end
+  end
+end