Anyway, I'd like to know the correct way to set up a key. I'm happy to help write the instructions for this, but I haven't been able to figure it out myself...
1) Setting up an OpenAI API key

An example of the secrets file can be found at ./app/.streamlit/secrets.toml.example:
OPENAI_API_KEY = "<YOUR API KEY HERE>"
Copy or rename this file to secrets.toml and put your own OpenAI API key in it.
Keep in mind that even though it looks like it's running locally, you're still using OpenAI's API, so requests are sent over the internet and there's actually no LLM running locally.
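For reference, here is a minimal sketch of how a Streamlit app typically reads that key and talks to OpenAI. This is not the app's actual code; it assumes the official `openai` Python package (v1+) and that secrets.toml sits in the app's .streamlit directory, which Streamlit loads into `st.secrets` automatically:

```python
import streamlit as st
from openai import OpenAI  # assumes the official openai package, v1 or later

# Streamlit loads .streamlit/secrets.toml into st.secrets at startup
api_key = st.secrets["OPENAI_API_KEY"]

# Every request made with this client still goes to OpenAI's servers
client = OpenAI(api_key=api_key)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name, not necessarily what this app uses
    messages=[{"role": "user", "content": "Hello!"}],
)
st.write(response.choices[0].message.content)
```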
2) Running an LLM locally
If you want to run an LLM locally, check out this repo I created just a few days ago: https://github.com/ricardobalk/streamlit-ollama. It's not as fancy as this Streamlit app, but it provides a basic setup that lets you run Llama 2 or Llama 3 locally. A rough sketch of what talking to a local Ollama server looks like is included below.
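As a rough illustration of what "locally" means here, the sketch below talks to an Ollama server on the same machine using Ollama's documented REST API. It is not necessarily the exact code from that repo, and it assumes `ollama serve` is running and the llama3 model has already been pulled:

```python
import requests

# Send a prompt to a local Ollama instance; nothing leaves the machine.
response = requests.post(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    json={
        "model": "llama3",          # assumes `ollama pull llama3` has been run
        "prompt": "Why is the sky blue?",
        "stream": False,            # return one JSON object instead of a stream
    },
    timeout=120,
)
print(response.json()["response"])
```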
Hi! I just wanted to test it out, and I got this error:
This runs locally, right? Or do I need some sort of key?