✨ Feature(custom): add support for google gemini pro api
Kuingsmile committed Dec 14, 2023
1 parent 1b4b035 commit 591b78d
Showing 10 changed files with 504 additions and 219 deletions.
3 changes: 3 additions & 0 deletions README.md
@@ -20,6 +20,7 @@ Word GPT Plus is a word add-in which integrates the chatGPT model into Microsoft
- Built-in prompts for translation, summarization, polishing, and academic writing
- Support Azure OpenAI API
- Support Google PALM2 API
- Support Google Gemini Pro API
- Support for multiple languages
- Custom prompts can be set and saved for future use
- Ability for users to set temperature and max tokens
@@ -46,6 +47,8 @@ You need to apply for qualification first, please go to [Azure OpenAI API applic

You need to go to [Google AI](https://developers.generativeai.google/) to apply for qualification for Google PALM2 API.

The Google Gemini Pro API uses the same API key as the Google PALM2 API, and the free tier is currently limited to 60 requests per minute.
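
Given the 60-requests-per-minute cap on the free tier, a client may want to throttle calls before they reach the API. A minimal sliding-window limiter sketch in TypeScript (`RateLimiter` is a hypothetical helper for illustration, not part of this project):

```typescript
// Sliding-window rate limiter sized for the free tier's 60 requests/minute.
// Hypothetical illustration; not part of the Word GPT Plus codebase.
class RateLimiter {
  private timestamps: number[] = []

  constructor (private limit = 60, private windowMs = 60_000) {}

  // Record and allow a request if fewer than `limit` were made in the window.
  tryAcquire (now: number = Date.now()): boolean {
    // Drop timestamps that have fallen out of the window.
    this.timestamps = this.timestamps.filter(t => now - t < this.windowMs)
    if (this.timestamps.length >= this.limit) {
      return false
    }
    this.timestamps.push(now)
    return true
  }
}
```

A caller would check `tryAcquire()` before each request and back off when it returns `false`.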

## Getting Started

There are two ways to install Word GPT Plus: through my free hosting service, or by self-hosting it.
3 changes: 3 additions & 0 deletions README_cn.md
@@ -19,6 +19,7 @@ Word GPT Plus is a Word add-in that integrates the chatGPT model. It allows you to…
- Generate text using the GPT API with support for model selection
- Support the official OpenAI API and Azure OpenAI API
- Support Google PALM2 API
- Support Google Gemini Pro API
- Built-in prompts for translation, summarization, polishing, and academic writing
- Support for multiple languages
- Custom prompts can be set and saved for future use
@@ -46,6 +47,8 @@ Azure OpenAI requires applying for access first; please go to the [Azure OpenAI API application page](h

The Google PALM2 API requires an application at [Google AI](https://developers.generativeai.google/); after approval, use is currently free during the public beta.

The Google Gemini Pro API uses the same API key as the Google PALM2 API; the free tier is currently limited to 60 requests per minute.

## Getting Started

There are two ways to install Word GPT Plus: through my free web service, or by self-hosting it.
20 changes: 11 additions & 9 deletions package.json
@@ -3,6 +3,7 @@
"version": "0.2.8",
"private": true,
"scripts": {
"dev": "vue-cli-service serve --port 3000",
"serve": "vue-cli-service serve --port 3000",
"build": "vue-cli-service build",
"lint": "vue-cli-service lint --fix",
@@ -12,24 +13,25 @@
},
"dependencies": {
"@azure/openai": "1.0.0-beta.4",
"@element-plus/icons-vue": "^2.1.0",
"@element-plus/icons-vue": "^2.3.1",
"@google/generative-ai": "^0.1.1",
"axios": "^1.6.2",
"chatgpt": "^5.2.5",
"core-js": "^3.33.3",
"dexie": "^3.2.4",
"element-plus": "^2.4.2",
"element-plus": "^2.4.3",
"openai": "^4.10.0",
"unfetch": "^5.0.0",
"vue": "^3.3.8",
"vue": "^3.3.11",
"vue-class-component": "^8.0.0-rc.1",
"vue-i18n": "^9.6.5",
"vue-i18n": "^9.8.0",
"vue-router": "^4.2.5"
},
"devDependencies": {
"@picgo/bump-version": "^1.1.2",
"@types/office-js": "^1.0.361",
"@typescript-eslint/eslint-plugin": "^6.10.0",
"@typescript-eslint/parser": "^6.10.0",
"@types/office-js": "^1.0.362",
"@typescript-eslint/eslint-plugin": "^6.14.0",
"@typescript-eslint/parser": "^6.14.0",
"@vue/cli-plugin-babel": "^5.0.8",
"@vue/cli-plugin-eslint": "^5.0.8",
"@vue/cli-plugin-router": "^5.0.8",
@@ -38,8 +40,8 @@
"@vue/eslint-config-standard": "^8.0.1",
"@vue/eslint-config-typescript": "^12.0.0",
"dpdm": "^3.14.0",
"eslint": "^8.53.0",
"eslint-plugin-vue": "^9.18.1",
"eslint": "^8.55.0",
"eslint-plugin-vue": "^9.19.2",
"stylus": "^0.61.0",
"stylus-loader": "^7.1.3",
"typescript": "^5.2.2"
70 changes: 70 additions & 0 deletions src/api/gemini.ts
@@ -0,0 +1,70 @@
import { GoogleGenerativeAI } from '@google/generative-ai'
import { Ref } from 'vue'

interface ChatCompletionStreamOptions {
geminiAPIKey: string
messages: string
result: Ref<string>
historyDialog: Ref<any[]>
errorIssue: Ref<boolean>
loading: Ref<boolean>
maxTokens?: number
temperature?: number
geminiModel?: string
}

async function createChatCompletionStream (options: ChatCompletionStreamOptions): Promise<void> {
const apiKey = options.geminiAPIKey
const generationConfig = {
maxOutputTokens: options.maxTokens ?? 800,
temperature: options.temperature ?? 0.7
}
try {
const genAI = new GoogleGenerativeAI(apiKey)
const model = genAI.getGenerativeModel({
model: options.geminiModel ?? 'gemini-pro'
})
const chat = model.startChat({
history: options.historyDialog.value,
generationConfig
})
const result = await chat.sendMessage(options.messages)
const response = await result.response
const text = response.text()
updateResultAndHistory(text, options.messages, options.result, options.historyDialog)
} catch (error: any) {
handleError(error, options.result, options.errorIssue)
}
options.loading.value = false
}

function updateResultAndHistory (
text: string,
userText: string,
result: Ref<string>,
historyDialog: Ref<any[]>
): void {
result.value = text
historyDialog.value.push(
{
role: 'user',
parts: userText
},
{
role: 'model',
parts: text
})
}

function handleError (error: Error, result: Ref<string>, errorIssue: Ref<boolean>): void {
result.value = String(error)
errorIssue.value = true
console.error(error)
}

export default {
createChatCompletionStream
}
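
As the `updateResultAndHistory` helper above shows, Gemini's chat history uses the role `model` (where OpenAI uses `assistant`) and a `parts` field (where OpenAI uses `content`). A sketch of mapping OpenAI-style messages into that shape (a hypothetical converter, assuming the string-valued `parts` this SDK version accepts):

```typescript
// Map OpenAI-style chat messages to the Gemini history shape used above:
// role 'assistant' becomes 'model', and 'content' becomes 'parts'.
// Hypothetical converter for illustration; not part of this commit.
interface OpenAIMessage {
  role: 'user' | 'assistant'
  content: string
}

interface GeminiMessage {
  role: 'user' | 'model'
  parts: string
}

function toGeminiHistory (messages: OpenAIMessage[]): GeminiMessage[] {
  return messages.map(m => ({
    role: m.role === 'assistant' ? 'model' : 'user',
    parts: m.content
  }))
}
```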
2 changes: 2 additions & 0 deletions src/api/index.ts
@@ -3,11 +3,13 @@ import official from './official'
import azure from './azure'
import palm from './palm'
import common from './common'
import gemini from './gemini'

export default {
webapi,
official,
azure,
palm,
gemini,
common
}
95 changes: 83 additions & 12 deletions src/pages/HomePage.vue
@@ -201,7 +201,7 @@
{{ $t('start') }}
</el-button>
<el-button
v-if="['web-api', 'azure', 'official'].includes(api)"
v-if="['web-api', 'azure', 'official', 'gemini'].includes(api)"
class="api-button"
type="success"
size="default"
@@ -389,25 +389,26 @@
<script lang="ts" setup>
import { onBeforeMount, ref } from 'vue'
import { useRouter } from 'vue-router'
import { localStorageKey, languageMap, buildInPrompt, availableModels, availableModelsForPalm } from '@/utils/constant'
import { localStorageKey, languageMap, buildInPrompt, availableModels, availableModelsForPalm, availableModelsForGemini } from '@/utils/constant'
import { promptDbInstance } from '@/store/promtStore'
import { IStringKeyMap } from '@/types'
import { CirclePlus, Remove } from '@element-plus/icons-vue'
import { ElMessage } from 'element-plus'
import { ChatGPTUnofficialProxyAPI, ChatMessage } from 'chatgpt'
import { checkAuth } from '@/utils/common'
import { checkAuth, forceNumber } from '@/utils/common'
import API from '@/api'
const replyLanguageList = Object.values(languageMap).map((key) => ({
label: key,
value: key
}))
const api = ref<'web-api' | 'official' | 'azure' | 'palm'>('official')
const api = ref<'web-api' | 'official' | 'azure' | 'palm' | 'gemini'>('official')
const apiKey = ref('')
const accessToken = ref('')
const azureAPIKey = ref('')
const palmAPIKey = ref('')
const geminiAPIKey = ref('')
const localLanguage = ref('en')
const replyLanguage = ref('English')
@@ -429,6 +430,10 @@ const palmMaxTokens = ref(800)
const palmTemperature = ref(0.7)
const palmModel = ref('text-bison-001')
const geminiMaxTokens = ref(800)
const geminiTemperature = ref(0.7)
const geminiModel = ref('gemini-pro')
const systemPrompt = ref('')
const systemPromptSelected = ref('')
const systemPromptList = ref<IStringKeyMap[]>([])
@@ -540,16 +545,17 @@ function handelPromptChange (val: string) {
}
onBeforeMount(async () => {
api.value = localStorage.getItem(localStorageKey.api) as 'web-api' | 'official' | 'azure' | 'palm' || 'official'
api.value = localStorage.getItem(localStorageKey.api) as 'web-api' | 'official' | 'azure' | 'palm' | 'gemini' || 'official'
replyLanguage.value = localStorage.getItem(localStorageKey.replyLanguage) || 'English'
localLanguage.value = localStorage.getItem(localStorageKey.localLanguage) || 'en'
apiKey.value = localStorage.getItem(localStorageKey.apiKey) || ''
accessToken.value = localStorage.getItem(localStorageKey.accessToken) || ''
azureAPIKey.value = localStorage.getItem(localStorageKey.azureAPIKey) || ''
palmAPIKey.value = localStorage.getItem(localStorageKey.palmAPIKey) || ''
geminiAPIKey.value = localStorage.getItem(localStorageKey.geminiAPIKey) || ''
webModel.value = localStorage.getItem(localStorageKey.webModel) || 'default'
temperature.value = Number(localStorage.getItem(localStorageKey.temperature)) || 0.7
maxTokens.value = Number(localStorage.getItem(localStorageKey.maxTokens)) || 800
temperature.value = forceNumber(localStorage.getItem(localStorageKey.temperature)) || 0.7
maxTokens.value = forceNumber(localStorage.getItem(localStorageKey.maxTokens)) || 800
const modelTemp = localStorage.getItem(localStorageKey.model) || availableModels['gpt-3.5']
if (Object.keys(availableModels).includes(modelTemp)) {
model.value = availableModels[modelTemp]
@@ -561,11 +567,11 @@ onBeforeMount(async () => {
basePath.value = localStorage.getItem(localStorageKey.basePath) || ''
azureAPIEndpoint.value = localStorage.getItem(localStorageKey.azureAPIEndpoint) || ''
azureDeploymentName.value = localStorage.getItem(localStorageKey.azureDeploymentName) || ''
azureMaxTokens.value = Number(localStorage.getItem(localStorageKey.azureMaxTokens)) || 800
azureTemperature.value = Number(localStorage.getItem(localStorageKey.azureTemperature)) || 0.7
azureMaxTokens.value = forceNumber(localStorage.getItem(localStorageKey.azureMaxTokens)) || 800
azureTemperature.value = forceNumber(localStorage.getItem(localStorageKey.azureTemperature)) || 0.7
palmAPIEndpoint.value = localStorage.getItem(localStorageKey.palmAPIEndpoint) || 'https://generativelanguage.googleapis.com/v1beta2'
palmMaxTokens.value = Number(localStorage.getItem(localStorageKey.palmMaxTokens)) || 800
palmTemperature.value = Number(localStorage.getItem(localStorageKey.palmTemperature)) || 0.7
palmMaxTokens.value = forceNumber(localStorage.getItem(localStorageKey.palmMaxTokens)) || 800
palmTemperature.value = forceNumber(localStorage.getItem(localStorageKey.palmTemperature)) || 0.7
const palmModelTemp = localStorage.getItem(localStorageKey.palmModel) || availableModelsForPalm['text-bison-001']
if (Object.keys(availableModelsForPalm).includes(palmModelTemp)) {
palmModel.value = availableModelsForPalm[palmModelTemp]
@@ -574,6 +580,16 @@
} else {
palmModel.value = availableModelsForPalm['text-bison-001']
}
geminiMaxTokens.value = forceNumber(localStorage.getItem(localStorageKey.geminiMaxTokens)) || 800
geminiTemperature.value = forceNumber(localStorage.getItem(localStorageKey.geminiTemperature)) || 0.7
const geminiModelTemp = localStorage.getItem(localStorageKey.geminiModel) || availableModelsForGemini['gemini-pro']
if (Object.keys(availableModelsForGemini).includes(geminiModelTemp)) {
geminiModel.value = availableModelsForGemini[geminiModelTemp]
} else if (Object.values(availableModelsForGemini).includes(geminiModelTemp)) {
geminiModel.value = geminiModelTemp
} else {
geminiModel.value = availableModelsForGemini['gemini-pro']
}
insertType.value = localStorage.getItem(localStorageKey.insertType) || 'replace' as 'replace' | 'append' | 'newLine' | 'NoAction'
systemPrompt.value = localStorage.getItem(localStorageKey.defaultSystemPrompt) || 'Act like a personal assistant.'
await getSystemPromptList()
@@ -690,6 +706,30 @@ async function template (taskType: keyof typeof buildInPrompt | 'custom') {
palmMaxTokens.value,
palmTemperature.value
)
} else if (api.value === 'gemini' && geminiAPIKey.value) {
historyDialog.value = [
{
role: 'user',
parts: systemMessage + '\n' + userMessage
},
{
role: 'model',
parts: 'Hi, how can I help you?'
}
]
await API.gemini.createChatCompletionStream(
{
geminiAPIKey: geminiAPIKey.value,
messages: userMessage,
result,
historyDialog,
errorIssue,
loading,
maxTokens: geminiMaxTokens.value,
temperature: geminiTemperature.value,
geminiModel: geminiModel.value
}
)
} else {
ElMessage.error('Set API Key or Access Token first')
return
@@ -710,7 +750,8 @@ function checkApiKey () {
accessToken: accessToken.value,
apiKey: apiKey.value,
azureAPIKey: azureAPIKey.value,
palmAPIKey: palmAPIKey.value
palmAPIKey: palmAPIKey.value,
geminiAPIKey: geminiAPIKey.value
}
if (!checkAuth(auth)) {
ElMessage.error('Set API Key or Access Token first')
@@ -798,6 +839,36 @@ async function continueChat () {
errorIssue.value = true
console.error(error)
}
} else if (api.value === 'gemini') {
try {
historyDialog.value.push(
{
role: 'user',
parts: 'continue'
},
{
role: 'model',
parts: 'OK, I will continue to help you.'
})
await API.gemini.createChatCompletionStream(
{
geminiAPIKey: geminiAPIKey.value,
messages: 'continue',
result,
historyDialog,
errorIssue,
loading,
maxTokens: geminiMaxTokens.value,
temperature: geminiTemperature.value,
geminiModel: geminiModel.value
}
)
} catch (error) {
result.value = String(error)
errorIssue.value = true
console.error(error)
}
} else if (api.value === 'web-api') {
try {
const config = API.webapi.setUnofficalConfig(accessToken.value)
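
The HomePage changes above swap `Number(...)` for a `forceNumber` helper imported from `@/utils/common`, whose body is not shown in this commit. A plausible sketch (an assumption about the real implementation): coerce the nullable localStorage string to a number and map `NaN` to `0`, so the `|| 800`-style defaults kick in reliably.

```typescript
// Plausible sketch of the forceNumber helper imported from '@/utils/common';
// its body is not shown in this commit, so treat this as an assumption.
// Number(null) is 0 and Number('abc') is NaN; mapping NaN to 0 lets callers
// rely on `forceNumber(...) || defaultValue` for fallback defaults.
function forceNumber (value: string | null): number {
  const n = Number(value)
  return Number.isNaN(n) ? 0 : n
}
```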