Replies: 1 comment
-
No, you can't use your ChatGPT Pro key to prompt the Gorilla LLM. UC Berkeley's Sky Lab hosts the Gorilla LLM for free as a research prototype. You can prompt the hosted Gorilla LLM as explained in this Google Colab:

# Import chat completion template and set up variables
!pip install openai==0.28.1 &> /dev/null
import openai
import urllib.parse
openai.api_key = "EMPTY" # Key is ignored and does not matter
openai.api_base = "http://zanino.millennium.berkeley.edu:8000/v1"
# Alternate mirrors
# openai.api_base = "http://34.132.127.197:8000/v1"
# Report issues
def raise_issue(e, model, prompt):
    issue_title = urllib.parse.quote("[bug] Hosted Gorilla: <Issue>")
    issue_body = urllib.parse.quote(f"Exception: {e}\nFailed model: {model}, for prompt: {prompt}")
    issue_url = f"https://github.com/ShishirPatil/gorilla/issues/new?assignees=&labels=hosted-gorilla&projects=&template=hosted-gorilla-.md&title={issue_title}&body={issue_body}"
    print(f"An exception has occurred: {e}\nPlease raise an issue here: {issue_url}")
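As an aside, urllib.parse.quote is what keeps the pre-filled issue URL valid: it percent-encodes characters that are not safe in a URL query string (brackets, spaces, colons, angle brackets, newlines). A quick standalone check of what it produces for the issue title above:

```python
import urllib.parse

# quote() percent-encodes URL-unsafe characters; by default only
# letters, digits, "_.-~" and "/" pass through unchanged.
title = urllib.parse.quote("[bug] Hosted Gorilla: <Issue>")
print(title)  # %5Bbug%5D%20Hosted%20Gorilla%3A%20%3CIssue%3E
```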
# Query Gorilla server
def get_gorilla_response(prompt="I would like to translate from English to French.", model="gorilla-7b-hf-v1"):
    try:
        completion = openai.ChatCompletion.create(
            model=model,
            messages=[{"role": "user", "content": prompt}]
        )
        return completion.choices[0].message.content
    except Exception as e:
        raise_issue(e, model, prompt)

Alternatively, you can download any of the Gorilla LLMs from Hugging Face and run them locally.
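For the local route, a minimal sketch using the Hugging Face transformers library is below. Note the model ID ("gorilla-llm/gorilla-7b-hf-v1") and the "USER:/ASSISTANT:" prompt format are assumptions; check the Gorilla repo and the gorilla-llm Hugging Face org for the exact checkpoint names and prompt template.

```python
def build_prompt(user_query: str) -> str:
    # Assumed instruction format for the Gorilla HF checkpoints;
    # verify against the Gorilla repo before relying on it.
    return f"USER: {user_query}\nASSISTANT: "

def generate_locally(user_query: str, model_id: str = "gorilla-llm/gorilla-7b-hf-v1") -> str:
    # Heavy: downloads several GB of weights on first run, so the
    # import is kept lazy inside the function.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_prompt(user_query), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

# Usage (downloads weights; not run here):
# print(generate_locally("I would like to translate from English to French."))
```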
-
Can I add my ChatGPT Pro key to the config and responses? If so, where is that located and how do I do it?