Improve readme
johnd0e committed Nov 26, 2024
commit 5134c13 (1 parent: 1c0a0f3). Changed file: readme.MD (16 additions, 3 deletions).

but there are many tools that work exclusively with the OpenAI API.

This project provides a personal OpenAI-compatible endpoint for free.


## Serverless?

Although it runs in the cloud, it does not require server maintenance.
It can be easily deployed to various providers for free.

> Running the proxy endpoint locally is also an option,
> though it's more appropriate for development use.

## How to start

You will need a personal Google [API key](https://makersuite.google.com/app/apikey).
You will need to set up an account there.
If you opt for “button-deploy”, you'll be guided through the process of forking the repository first,
which is necessary for continuous integration (CI).


### Deploy with Vercel

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https://github.com/PublicAffairs/openai-gemini&repository-name=my-openai-gemini)
- Serve locally: `vercel dev`
- Vercel _Functions_ [limitations](https://vercel.com/docs/functions/limitations) (with _Edge_ runtime)
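
For reference, a CLI-based flow might look like the sketch below (an illustration only, assuming the Vercel CLI is installed and you have cloned your fork of the repository; the exact prompts can differ):

```sh
# One-time setup: install the Vercel CLI and authenticate
npm install -g vercel
vercel login

# From the repository root: deploy to Vercel
vercel deploy

# Or run the endpoint locally for testing
vercel dev
```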


### Deploy to Netlify

[![Deploy to Netlify](https://www.netlify.com/img/deploy/button.svg)](https://app.netlify.com/start/deploy?repository=https://github.com/PublicAffairs/openai-gemini&integrationName=integrationName&integrationSlug=integrationSlug&integrationDescription=integrationDescription)
- `/edge/v1`
_Edge functions_ [limits](https://docs.netlify.com/edge-functions/limits/)
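
As with Vercel, the deploy button can be replaced by a CLI flow. A minimal sketch, assuming the `netlify-cli` package and an existing Netlify account (site name and prompts will differ):

```sh
# One-time setup: install the Netlify CLI and authenticate
npm install -g netlify-cli
netlify login

# From the repository root: create/link a site, then deploy
netlify init
netlify deploy --prod

# Or serve the functions locally
netlify dev
```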


### Deploy to Cloudflare

[![Deploy to Cloudflare Workers](https://deploy.workers.cloudflare.com/button)](https://deploy.workers.cloudflare.com/?url=https://github.com/PublicAffairs/openai-gemini)
- Serve locally: `wrangler dev`
- _Worker_ [limits](https://developers.cloudflare.com/workers/platform/limits/#worker-limits)
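
A possible `wrangler` flow is sketched below, assuming the repository includes a Wrangler configuration; the commands are standard Wrangler usage rather than project-specific instructions:

```sh
# One-time setup: install Wrangler and authenticate with Cloudflare
npm install -g wrangler
wrangler login

# From the repository root: publish the Worker
wrangler deploy

# Or run it locally
wrangler dev
```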


### Deploy to Deno

See details [here](https://github.com/PublicAffairs/openai-gemini/discussions/19).


### Serve locally - with Node, Deno, Bun

Only for Node: `npm install`.

Then `npm run start` / `npm run start:deno` / `npm run start:bun`.


#### Dev mode (watch source changes)

Only for Node: `npm install --include=dev`

Alternatively, it could be in some config file (check the relevant documentation).

For some command-line tools, you may need to set an environment variable, _e.g._:
```sh
OPENAI_BASE_URL="https://my-super-proxy.vercel.app/v1"
```
_...or_:
```sh
OPENAI_API_BASE="https://my-super-proxy.vercel.app/v1"
```
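
A rough shell sketch for such tools is shown below. The exact variable names depend on the tool, `my-super-proxy.vercel.app` is just the placeholder host from the examples above, and passing your Gemini API key where the tool expects an OpenAI key is an assumption, not something stated in this excerpt:

```sh
# Point the tool at your deployed proxy (variable name depends on the tool)
export OPENAI_BASE_URL="https://my-super-proxy.vercel.app/v1"

# Assumption: the proxy forwards whatever key the client sends,
# so the Gemini API key goes where an OpenAI key would normally go
export OPENAI_API_KEY="YOUR_GEMINI_API_KEY"
```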


## Models

Requests use the specified [model] if its name starts with "gemini-", "learnlm-",

Implemented via [`inlineData`](https://ai.google.dev/api/caching#Part).

---

## Supported API endpoints and applicable parameters

- [x] `chat/completions`

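As an illustration, a plain `curl` request in the standard OpenAI format could look like this (a sketch only: the host is the placeholder used earlier, `gemini-1.5-flash` is just an example of a model name starting with "gemini-", and the key handling follows the assumption above):

```sh
curl "https://my-super-proxy.vercel.app/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_GEMINI_API_KEY" \
  -d '{
    "model": "gemini-1.5-flash",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```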
