Commit `4a38d5e` (parent `5094835`)

- copied code from petersalomonsen/near-openai#11, which adds a [spin](https://github.com/fermyon/spin) app with a proxy to the OpenAI API, using the Fungible Token contract to cover usage costs
- hash `conversation_id` when calling the smart contract from the proxy, so that the smart contract does not reveal the conversation id that the AI proxy accepts
- added a test for the [aiconversation.js](https://github.com/petersalomonsen/quickjs-rust-near/blob/master/examples/fungibletoken/e2e/aiconversation.js) script that also shows the full process of starting a conversation and getting the refund when done
Showing 16 changed files with 2,911 additions and 5 deletions.
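The `conversation_id` hashing described in the commit message can be sketched roughly as follows. This is a minimal illustration using Node's built-in `crypto` module, assuming SHA-256; the function name is hypothetical and this is not the proxy's actual code:

```javascript
import { createHash } from 'node:crypto';

// Hash the conversation id before it is passed to the smart contract,
// so on-chain state never reveals the raw id that the AI proxy accepts.
// hashConversationId is a hypothetical name, for illustration only.
function hashConversationId(conversationId) {
  return createHash('sha256').update(conversationId, 'utf8').digest('hex');
}

const hashed = hashConversationId('example-conversation-id');
console.log(hashed.length); // 64 (hex-encoded SHA-256 digest)
```

Because the hash is one-way, anyone reading the contract's state sees only the digest, while the proxy can still recompute it from the id it accepts.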
@@ -0,0 +1,34 @@
name: "AI proxy tests"

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  test:
    name: "AI proxy"
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install dependencies
        run: |
          curl -fsSL https://developer.fermyon.com/downloads/install.sh | bash
          sudo mv spin /usr/local/bin/
          rustup target add wasm32-wasi
          cargo install cargo-component --locked
          spin plugin install -y -u https://github.com/fermyon/spin-test/releases/download/canary/spin-test.json
      - name: Check formatting
        working-directory: examples/aiproxy
        run: |
          (cd openai-proxy && cargo fmt --check)
          (cd tests && cargo fmt --check)
      - name: Build project
        working-directory: examples/aiproxy
        run: |
          spin build
      - name: Run tests
        working-directory: examples/aiproxy
        run: |
          spin test
@@ -0,0 +1 @@
target
@@ -0,0 +1,26 @@
# AI Proxy

This folder contains a [Spin](https://www.fermyon.com/spin) application, based on WASI Preview 2 and the [WebAssembly Component Model](https://component-model.bytecodealliance.org/). It is implemented in Rust as a serverless proxy for the OpenAI API.

There is a simple example of a web client in the [web](./web/) folder.

The application keeps track of token usage per conversation in Spin's built-in key-value storage. The initial balance for a conversation is retrieved from the Fungible Token smart contract.
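The per-conversation accounting can be illustrated with a small sketch. All names here are hypothetical, and the `Map` stands in for Spin's key-value store; in the real application the initial balance comes from the Fungible Token contract:

```javascript
// In-memory model of per-conversation token accounting. In the real app,
// the Map below is Spin's built-in key-value storage, and start() would be
// seeded with the balance fetched from the Fungible Token smart contract.
class ConversationBalances {
  constructor() {
    this.store = new Map();
  }

  // Seed a conversation with its initial balance.
  start(conversationId, initialBalance) {
    this.store.set(conversationId, initialBalance);
  }

  // Deduct the token cost of one completion; reject when funds run out.
  charge(conversationId, tokensUsed) {
    const balance = this.store.get(conversationId) ?? 0;
    if (tokensUsed > balance) {
      throw new Error('insufficient balance for conversation');
    }
    const remaining = balance - tokensUsed;
    this.store.set(conversationId, remaining);
    return remaining;
  }

  // Remaining balance, refundable when the conversation is done.
  refundable(conversationId) {
    return this.store.get(conversationId) ?? 0;
  }
}

const balances = new ConversationBalances();
balances.start('conv-1', 1000);
balances.charge('conv-1', 350);
console.log(balances.refundable('conv-1')); // 650
```

The remaining balance is what the e2e test's refund step reclaims once the conversation is finished.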
To launch the application, make sure you have the Spin SDK installed, and set the environment variable `SPIN_VARIABLE_OPENAI_API_KEY` to your OpenAI API key.
Then run the following commands:

```
spin build
spin up
```

This will start the OpenAI proxy server at http://localhost:3000.
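A client call against the locally running proxy might look like the sketch below. Note that the request path and the JSON body shape are hypothetical placeholders, not the proxy's documented API; see the bundled web client for the actual request format:

```javascript
// Build a request for the locally running proxy.
// The '/proxy-openai' path and the body shape are HYPOTHETICAL placeholders;
// consult the web client in ./web/ for the real request format.
function buildProxyRequest(conversationId, message) {
  return {
    url: 'http://localhost:3000/proxy-openai',
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        conversation_id: conversationId,
        messages: [{ role: 'user', content: message }],
      }),
    },
  };
}

const req = buildProxyRequest('conv-1', 'Hello!');
console.log(req.options.method); // POST
```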
You can also launch the web client using, for example, [http-server](https://www.npmjs.com/package/http-server):

```
http-server web
```

You will then find the web client at http://localhost:8080, where you can have a conversation with the AI model.