Merge pull request #707 from PrefectHQ/better-api-key-discovery
allow use of `OPENAI_API_KEY` env var
zzstoatzz authored Jan 1, 2024
2 parents 18f814e + c067e4c commit 905e710
Showing 5 changed files with 73 additions and 45 deletions.
15 changes: 6 additions & 9 deletions docs/configuration/settings.md
@@ -5,12 +5,13 @@ Marvin makes use of Pydantic's `BaseSettings` to configure, load, and change beh
## Environment Variables
All settings are configurable via environment variables like `MARVIN_<setting name>`.

Please set Marvin-specific settings in `~/.marvin/.env`. One exception is `OPENAI_API_KEY`, which may be set as a global environment variable on your system and will be picked up by Marvin.

!!! example "Setting Environment Variables"
For example, in an `.env` file or in your shell config file you might have:
For example, in your `~/.marvin/.env` file you could have:
```shell
MARVIN_LOG_LEVEL=DEBUG
MARVIN_LLM_MODEL=gpt-4
MARVIN_LLM_TEMPERATURE=0
MARVIN_LOG_LEVEL=INFO
MARVIN_OPENAI_CHAT_COMPLETIONS_MODEL=gpt-4
MARVIN_OPENAI_API_KEY='sk-my-api-key'
```
Setting these values will let you avoid setting an API key every time.
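
Because `OPENAI_API_KEY` is honored globally, the key does not have to live in `~/.marvin/.env` at all. The following is a minimal sketch of that behavior, assuming the variable is exported before `marvin` is imported and that no `MARVIN_OPENAI_API_KEY` is configured:
```python
import os

# Assumption for this sketch: only the global variable is set, e.g. via your
# shell profile; there is no MARVIN_OPENAI_API_KEY in ~/.marvin/.env.
os.environ["OPENAI_API_KEY"] = "sk-my-api-key"

import marvin

# The discovered key is stored as a SecretStr, so it stays masked when printed.
marvin.settings.openai.api_key
```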
@@ -23,10 +24,6 @@ A runtime settings object is accessible via `marvin.settings` and can be used to
```python
import marvin

marvin.settings.llm_model # 'gpt-4'

marvin.settings.llm_model = 'gpt-3.5-turbo'

marvin.settings.llm_model # 'gpt-3.5-turbo'
marvin.settings.openai_chat_completions_model = 'gpt-4'
```

10 changes: 7 additions & 3 deletions docs/static/css/tailwind.css
@@ -1,5 +1,5 @@
/*
! tailwindcss v3.3.6 | MIT License | https://tailwindcss.com
! tailwindcss v3.4.0 | MIT License | https://tailwindcss.com
*/

/*
@@ -32,9 +32,11 @@
4. Use the user's configured `sans` font-family by default.
5. Use the user's configured `sans` font-feature-settings by default.
6. Use the user's configured `sans` font-variation-settings by default.
7. Disable tap highlights on iOS
*/

html {
html,
:host {
line-height: 1.5;
/* 1 */
-webkit-text-size-adjust: 100%;
@@ -44,12 +46,14 @@ html {
-o-tab-size: 4;
tab-size: 4;
/* 3 */
font-family: ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji";
font-family: ui-sans-serif, system-ui, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji";
/* 4 */
font-feature-settings: normal;
/* 5 */
font-variation-settings: normal;
/* 6 */
-webkit-tap-highlight-color: transparent;
/* 7 */
}

/*
43 changes: 25 additions & 18 deletions docs/welcome/quickstart.md
@@ -5,10 +5,29 @@ After [installing Marvin](../installation), the fastest way to get started is by
!!! info "Initializing a Client"
To use Marvin you must have an API Key configured for an external model provider, like OpenAI.

You can pass your API Key to Marvin in one of two ways:

- Set the environment variable `MARVIN_OPENAI_API_KEY` in `~/.marvin/.env`, or set `OPENAI_API_KEY` in your shell config file (see the sketch after this list).

```shell
» cat ~/.marvin/.env | rg OPENAI
MARVIN_OPENAI_API_KEY=sk-xxx
MARVIN_OPENAI_ORGANIZATION=org-xxx
```

- Pass your API key to the `OpenAI` client constructor, then pass that client to Marvin's `ai_fn`, `ai_classifier`, or `ai_model` decorators.

```python
from marvin import ai_fn
from openai import OpenAI

client = OpenAI(api_key = 'YOUR_API_KEY')

@ai_fn(client = client)
def list_fruits(n: int, color: str = 'red') -> list[str]:
"""
Generates a list of {{n}} {{color}} fruits.
"""
```
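
With the environment-variable route in place, no client object is needed at all. The following is a minimal, hypothetical sketch that assumes the key is exported before Marvin is imported:

```python
import os

# Assumed for this sketch: the key is exported (e.g. in your shell profile)
# before marvin is imported, so Marvin can discover it on its own.
os.environ["OPENAI_API_KEY"] = "sk-my-api-key"

from marvin import ai_fn

@ai_fn  # no client argument needed when the key is discoverable
def list_fruits(n: int, color: str = 'red') -> list[str]:
    """
    Generates a list of {{n}} {{color}} fruits.
    """
```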

## Components
@@ -23,11 +42,8 @@ Marvin's most basic component is the AI Model, built on Pydantic's `BaseModel`.
```python
from marvin import ai_model
from pydantic import BaseModel, Field
from openai import OpenAI

client = OpenAI(api_key = 'YOUR_API_KEY')

@ai_model(client = client)
@ai_model
class Location(BaseModel):
    city: str
    state_abbreviation: str = Field(
@@ -120,9 +136,6 @@ Marvin's most basic component is the AI Model, built on Pydantic's `BaseModel`.
```python
from marvin import ai_model
from pydantic import BaseModel, Field
from openai import OpenAI

client = OpenAI(api_key = 'YOUR_API_KEY')

class Location(BaseModel):
    city: str
@@ -131,12 +144,12 @@ Marvin's most basic component is the AI Model, built on Pydantic's `BaseModel`.
        description="The two-letter state abbreviation"
    )

ai_model(Location, client = client)("The Big Apple")
ai_model(Location)("The Big Apple")
```
??? info "Generated Prompt"
You can view and/or eject the generated prompt by simply calling
```python
ai_model(Location, client = client)("The Big Apple").as_prompt().serialize()
ai_model(Location)("The Big Apple").as_prompt().serialize()
```
When you do, you'll see the raw payload that's sent to the LLM. All of the parameters
below like `FormatResponse` and the prompt you send are fully customizable.
@@ -358,11 +371,8 @@ AI Functions look like regular functions, but have no source code. Instead, an A
`ai_fn` can decorate Python functions to evaluate them using a Large Language Model.
```python
from marvin import ai_fn
from openai import OpenAI

client = OpenAI(api_key = 'YOUR_API_KEY')

@ai_fn(client=client)
@ai_fn
def sentiment_list(texts: list[str]) -> list[float]:
"""
Given a list of `texts`, returns a list of numbers between 1 (positive) and
@@ -437,9 +447,6 @@ AI Functions look like regular functions, but have no source code. Instead, an A
`ai_fn` can be used as a utility function to evaluate python functions using a Large Language Model.
```python
from marvin import ai_fn
from openai import OpenAI

client = OpenAI(api_key = 'YOUR_API_KEY')

def sentiment_list(texts: list[str]) -> list[float]:
"""
@@ -448,7 +455,7 @@ AI Functions look like regular functions, but have no source code. Instead, an A
"""


ai_fn(sentiment_list, client=client)(
ai_fn(sentiment_list)(
    [
        "That was surprisingly easy!",
        "Oh no, not again.",
@@ -458,7 +465,7 @@ AI Functions look like regular functions, but have no source code. Instead, an A
??? info "Generated Prompt"
You can view and/or eject the generated prompt by simply calling
```python
ai_fn(sentiment_list, client = client)([
ai_fn(sentiment_list)([
"That was surprisingly easy!",
"Oh no, not again.",
]).as_prompt().serialize()
37 changes: 23 additions & 14 deletions src/marvin/client/openai.py
@@ -31,6 +31,27 @@
T = TypeVar("T", bound=pydantic.BaseModel)


def _get_default_client(client_type: str) -> Union[Client, AsyncClient]:
    api_key = (
        settings.openai.api_key.get_secret_value() if settings.openai.api_key else None
    )
    if not api_key:
        raise ValueError(
            "OpenAI API key not found. Please either set `MARVIN_OPENAI_API_KEY` in `~/.marvin/.env`"
            " or otherwise set `OPENAI_API_KEY` in your environment."
        )
    if client_type not in ["sync", "async"]:
        raise ValueError(f"Invalid client type {client_type!r}")

    client_class = Client if client_type == "sync" else AsyncClient
    return client_class(
        **settings.openai.model_dump(
            exclude={"chat", "images", "audio", "assistants", "api_key"}
        )
        | {"api_key": api_key}
    )


def with_response_model(
create: Callable[P, "ChatCompletion"],
) -> Callable[
@@ -82,14 +103,7 @@ class MarvinClient(pydantic.BaseModel):
        arbitrary_types_allowed=True, protected_namespaces=()
    )

    client: Client = pydantic.Field(
        default_factory=lambda: Client(
            **settings.openai.model_dump(
                exclude={"chat", "images", "audio", "assistants", "api_key"}
            )
            | dict(api_key=settings.openai.api_key.get_secret_value())
        )
    )
    client: Client = pydantic.Field(default_factory=lambda: _get_default_client("sync"))

    @classmethod
    def wrap(cls, client: Client) -> "Client":
@@ -170,12 +184,7 @@ class AsyncMarvinClient(pydantic.BaseModel):
    )

    client: AsyncClient = pydantic.Field(
        default_factory=lambda: AsyncClient(
            **settings.openai.model_dump(
                exclude={"chat", "images", "audio", "assistants", "api_key"}
            )
            | dict(api_key=settings.openai.api_key.get_secret_value())
        )
        default_factory=lambda: _get_default_client("async")
    )

    @classmethod
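
To make the effect of `_get_default_client` concrete, here is a hypothetical usage sketch (not part of the diff) that assumes `OPENAI_API_KEY` is exported before Marvin is imported:

```python
import os

# Only the global key is set; MARVIN_OPENAI_API_KEY is not configured.
os.environ["OPENAI_API_KEY"] = "sk-my-api-key"

from marvin.client.openai import MarvinClient

# No explicit client is passed: the field's default_factory calls
# _get_default_client("sync"), which reads the discovered key from
# settings.openai.api_key and builds a plain openai.Client with it.
client = MarvinClient()
```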
13 changes: 12 additions & 1 deletion src/marvin/settings.py
@@ -15,7 +15,7 @@
from copy import deepcopy
from typing import Any, Optional, Union

from pydantic import Field, SecretStr
from pydantic import Field, SecretStr, field_validator
from pydantic_settings import BaseSettings, SettingsConfigDict
from typing_extensions import Literal

@@ -187,6 +187,17 @@ class OpenAISettings(MarvinSettings):
    audio: AudioSettings = Field(default_factory=AudioSettings)
    assistants: AssistantSettings = Field(default_factory=AssistantSettings)

    @field_validator("api_key")
    def discover_api_key(cls, v):
        if v is None:
            v = SecretStr(os.environ.get("OPENAI_API_KEY"))
            if v.get_secret_value() is None:
                raise ValueError(
                    "OpenAI API key not found. Please either set `MARVIN_OPENAI_API_KEY` in `~/.marvin/.env`"
                    " or otherwise set `OPENAI_API_KEY` in your environment."
                )
        return v


class Settings(MarvinSettings):
"""Settings for `marvin`.
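
A hypothetical illustration of the validator's fallback (not part of the diff), assuming `OpenAISettings` is importable from `marvin.settings` and that the validator runs when no `MARVIN_OPENAI_API_KEY` value is supplied:

```python
import os

from marvin.settings import OpenAISettings

# Only the global variable is present.
os.environ.pop("MARVIN_OPENAI_API_KEY", None)
os.environ["OPENAI_API_KEY"] = "sk-my-api-key"

openai_settings = OpenAISettings()

# discover_api_key should have filled the field from OPENAI_API_KEY;
# if neither variable were set, it would raise the ValueError above instead.
openai_settings.api_key.get_secret_value()  # 'sk-my-api-key'
```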
