docs: Update AI documentation (box/box-openapi#432) #192

Merged 4 commits on Jun 11, 2024
2 changes: 1 addition & 1 deletion .codegen.json
@@ -1 +1 @@
-{ "engineHash": "3ed6b56", "specHash": "5e023b9", "version": "1.0.0" }
+{ "engineHash": "240a5b0", "specHash": "ee83bc7", "version": "1.0.0" }
48 changes: 28 additions & 20 deletions box_sdk_gen/managers/ai.py
@@ -62,9 +62,9 @@ def __init__(
**kwargs
):
"""
-:param id: The id of the item
+:param id: The id of the item.
:type id: str
-:param type: The type of the item, defaults to CreateAiAskItemsTypeField.FILE.value
+:param type: The type of the item., defaults to CreateAiAskItemsTypeField.FILE.value
:type type: CreateAiAskItemsTypeField, optional
:param content: The content of the item, often the text representation., defaults to None
:type content: Optional[str], optional
@@ -148,15 +148,19 @@ def create_ai_ask(
extra_headers: Optional[Dict[str, Optional[str]]] = None
) -> AiResponse:
"""
-Sends an AI request to supported LLMs and returns an answer specifically focused on the user's question given the provided context.
-:param mode: The mode specifies if this request is for a single or multiple items.
-:type mode: CreateAiAskMode
-:param prompt: The prompt provided by the client to be answered by the LLM.
-:type prompt: str
-:param items: The items to be processed by the LLM, often files.
-:type items: List[CreateAiAskItems]
-:param extra_headers: Extra headers that will be included in the HTTP request., defaults to None
-:type extra_headers: Optional[Dict[str, Optional[str]]], optional
+Sends an AI request to supported LLMs and returns an answer specifically focused on the user's question given the provided context.
+:param mode: The mode specifies if this request is for a single or multiple items. If you select `single_item_qa` the `items` array can have one element only. Selecting `multiple_item_qa` allows you to provide up to 25 items.
+:type mode: CreateAiAskMode
+:param prompt: The prompt provided by the client to be answered by the LLM. The prompt's length is limited to 10000 characters.
+:type prompt: str
+:param items: The items to be processed by the LLM, often files.
+
+**Note**: Box AI handles documents with text representations up to 1MB in size, or a maximum of 25 files, whichever comes first.
+If the file size exceeds 1MB, the first 1MB of text representation will be processed.
+If you set `mode` parameter to `single_item_qa`, the `items` array can have one element only.
+:type items: List[CreateAiAskItems]
+:param extra_headers: Extra headers that will be included in the HTTP request., defaults to None
+:type extra_headers: Optional[Dict[str, Optional[str]]], optional
"""
if extra_headers is None:
extra_headers = {}
@@ -185,15 +189,19 @@ def create_ai_text_gen(
extra_headers: Optional[Dict[str, Optional[str]]] = None
) -> AiResponse:
"""
-Sends an AI request to supported LLMs and returns an answer specifically focused on the creation of new text.
-:param prompt: The prompt provided by the client to be answered by the LLM.
-:type prompt: str
-:param items: The items to be processed by the LLM, often files.
-:type items: List[CreateAiTextGenItems]
-:param dialogue_history: The history of prompts and answers previously passed to the LLM. This provides additional context to the LLM in generating the response., defaults to None
-:type dialogue_history: Optional[List[CreateAiTextGenDialogueHistory]], optional
-:param extra_headers: Extra headers that will be included in the HTTP request., defaults to None
-:type extra_headers: Optional[Dict[str, Optional[str]]], optional
+Sends an AI request to supported LLMs and returns an answer specifically focused on the creation of new text.
+:param prompt: The prompt provided by the client to be answered by the LLM. The prompt's length is limited to 10000 characters.
+:type prompt: str
+:param items: The items to be processed by the LLM, often files.
+The array can include **exactly one** element.
+
+**Note**: Box AI handles documents with text representations up to 1MB in size.
+If the file size exceeds 1MB, the first 1MB of text representation will be processed.
+:type items: List[CreateAiTextGenItems]
+:param dialogue_history: The history of prompts and answers previously passed to the LLM. This provides additional context to the LLM in generating the response., defaults to None
+:type dialogue_history: Optional[List[CreateAiTextGenDialogueHistory]], optional
+:param extra_headers: Extra headers that will be included in the HTTP request., defaults to None
+:type extra_headers: Optional[Dict[str, Optional[str]]], optional
"""
if extra_headers is None:
extra_headers = {}
20 changes: 12 additions & 8 deletions box_sdk_gen/schemas/ai_ask.py
@@ -28,9 +28,9 @@ def __init__(
**kwargs
):
"""
-:param id: The id of the item
+:param id: The id of the item.
:type id: str
-:param type: The type of the item, defaults to AiAskItemsTypeField.FILE.value
+:param type: The type of the item., defaults to AiAskItemsTypeField.FILE.value
:type type: AiAskItemsTypeField, optional
:param content: The content of the item, often the text representation., defaults to None
:type content: Optional[str], optional
@@ -46,12 +46,16 @@ def __init__(
self, mode: AiAskModeField, prompt: str, items: List[AiAskItemsField], **kwargs
):
"""
-:param mode: The mode specifies if this request is for a single or multiple items.
-:type mode: AiAskModeField
-:param prompt: The prompt provided by the client to be answered by the LLM.
-:type prompt: str
-:param items: The items to be processed by the LLM, often files.
-:type items: List[AiAskItemsField]
+:param mode: The mode specifies if this request is for a single or multiple items. If you select `single_item_qa` the `items` array can have one element only. Selecting `multiple_item_qa` allows you to provide up to 25 items.
+:type mode: AiAskModeField
+:param prompt: The prompt provided by the client to be answered by the LLM. The prompt's length is limited to 10000 characters.
+:type prompt: str
+:param items: The items to be processed by the LLM, often files.
+
+**Note**: Box AI handles documents with text representations up to 1MB in size, or a maximum of 25 files, whichever comes first.
+If the file size exceeds 1MB, the first 1MB of text representation will be processed.
+If you set `mode` parameter to `single_item_qa`, the `items` array can have one element only.
+:type items: List[AiAskItemsField]
"""
super().__init__(**kwargs)
self.mode = mode
16 changes: 10 additions & 6 deletions box_sdk_gen/schemas/ai_text_gen.py
@@ -71,12 +71,16 @@ def __init__(
**kwargs
):
"""
-:param prompt: The prompt provided by the client to be answered by the LLM.
-:type prompt: str
-:param items: The items to be processed by the LLM, often files.
-:type items: List[AiTextGenItemsField]
-:param dialogue_history: The history of prompts and answers previously passed to the LLM. This provides additional context to the LLM in generating the response., defaults to None
-:type dialogue_history: Optional[List[AiTextGenDialogueHistoryField]], optional
+:param prompt: The prompt provided by the client to be answered by the LLM. The prompt's length is limited to 10000 characters.
+:type prompt: str
+:param items: The items to be processed by the LLM, often files.
+The array can include **exactly one** element.
+
+**Note**: Box AI handles documents with text representations up to 1MB in size.
+If the file size exceeds 1MB, the first 1MB of text representation will be processed.
+:type items: List[AiTextGenItemsField]
+:param dialogue_history: The history of prompts and answers previously passed to the LLM. This provides additional context to the LLM in generating the response., defaults to None
+:type dialogue_history: Optional[List[AiTextGenDialogueHistoryField]], optional
"""
super().__init__(**kwargs)
self.prompt = prompt
18 changes: 9 additions & 9 deletions docs/ai.md
@@ -1,9 +1,9 @@
# AiManager

-- [Send AI Ask request](#send-ai-ask-request)
-- [Send AI Text Gen request](#send-ai-text-gen-request)
+- [Send AI question request](#send-ai-question-request)
+- [Send AI request to generate text](#send-ai-request-to-generate-text)

-## Send AI Ask request
+## Send AI question request

Sends an AI request to supported LLMs and returns an answer specifically focused on the user's question given the provided context.

@@ -36,11 +36,11 @@ client.ai.create_ai_ask(
### Arguments

- mode `CreateAiAskMode`
-  - The mode specifies if this request is for a single or multiple items.
+  - The mode specifies if this request is for a single or multiple items. If you select `single_item_qa` the `items` array can have one element only. Selecting `multiple_item_qa` allows you to provide up to 25 items.
- prompt `str`
-  - The prompt provided by the client to be answered by the LLM.
+  - The prompt provided by the client to be answered by the LLM. The prompt's length is limited to 10000 characters.
- items `List[CreateAiAskItems]`
-  - The items to be processed by the LLM, often files.
+  - The items to be processed by the LLM, often files. **Note**: Box AI handles documents with text representations up to 1MB in size, or a maximum of 25 files, whichever comes first. If the file size exceeds 1MB, the first 1MB of text representation will be processed. If you set `mode` parameter to `single_item_qa`, the `items` array can have one element only.
- extra_headers `Optional[Dict[str, Optional[str]]]`
- Extra headers that will be included in the HTTP request.

@@ -50,7 +50,7 @@ This function returns a value of type `AiResponse`.

A successful response including the answer from the LLM.

-## Send AI Text Gen request
+## Send AI request to generate text

Sends an AI request to supported LLMs and returns an answer specifically focused on the creation of new text.

@@ -89,9 +89,9 @@ client.ai.create_ai_text_gen(
### Arguments

- prompt `str`
-  - The prompt provided by the client to be answered by the LLM.
+  - The prompt provided by the client to be answered by the LLM. The prompt's length is limited to 10000 characters.
- items `List[CreateAiTextGenItems]`
-  - The items to be processed by the LLM, often files.
+  - The items to be processed by the LLM, often files. The array can include **exactly one** element. **Note**: Box AI handles documents with text representations up to 1MB in size. If the file size exceeds 1MB, the first 1MB of text representation will be processed.
- dialogue_history `Optional[List[CreateAiTextGenDialogueHistory]]`
- The history of prompts and answers previously passed to the LLM. This provides additional context to the LLM in generating the response.
- extra_headers `Optional[Dict[str, Optional[str]]]`
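Since Box AI processes only the first 1MB of a text representation, oversized `content` can be trimmed client-side so no ignored bytes are uploaded. A sketch under that assumption; `truncate_text_representation` is a hypothetical helper, not an SDK function:

```python
def truncate_text_representation(content: str, max_bytes: int = 1_000_000) -> str:
    """Keep only the leading bytes Box AI will actually process (UTF-8 safe)."""
    encoded = content.encode("utf-8")
    if len(encoded) <= max_bytes:
        return content
    # errors="ignore" drops any partial multi-byte sequence left at the cut point.
    return encoded[:max_bytes].decode("utf-8", errors="ignore")
```

Truncating by encoded bytes rather than characters matters because a 1MB limit on the wire is a byte limit, and non-ASCII characters occupy multiple bytes in UTF-8.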
7 changes: 6 additions & 1 deletion docs/events.md
@@ -23,7 +23,12 @@ See the endpoint docs at
<!-- sample get_events -->

```python
-client.events.get_events(stream_type=GetEventsStreamType.CHANGES.value)
+client.events.get_events(
+    stream_type=GetEventsStreamType.ADMIN_LOGS.value,
+    limit=1,
+    created_after=created_after_date,
+    created_before=created_before_date,
+)
```
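The `created_after_date` and `created_before_date` values above come from the SDK's `date_time_from_string` utility. With only the standard library, equivalent RFC 3339 parsing can be sketched like this (the `parse_rfc3339` name is hypothetical):

```python
from datetime import datetime

def parse_rfc3339(value: str) -> datetime:
    # datetime.fromisoformat (before Python 3.11) rejects a trailing "Z",
    # so normalize it to an explicit UTC offset first.
    return datetime.fromisoformat(value.replace("Z", "+00:00"))

created_after_date = parse_rfc3339("2024-06-09T00:00:00Z")
created_before_date = parse_rfc3339("2025-06-09T00:00:00Z")
```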

### Arguments
16 changes: 16 additions & 0 deletions test/events.py
@@ -14,6 +14,8 @@

from box_sdk_gen.schemas.realtime_server import RealtimeServer

+from box_sdk_gen.internal.utils import DateTime

from test.commons import get_default_client

from box_sdk_gen.schemas.event_source import EventSource
@@ -24,6 +26,8 @@

from box_sdk_gen.schemas.user import User

+from box_sdk_gen.internal.utils import date_time_from_string

client: BoxClient = get_default_client()


@@ -81,3 +85,15 @@ def testGetEventsWithLongPolling():
server: RealtimeServer = servers.entries[0]
assert to_string(server.type) == 'realtime_server'
assert not server.url == ''


+def testGetEventsWithDateFilters():
+    created_after_date: DateTime = date_time_from_string('2024-06-09T00:00:00Z')
+    created_before_date: DateTime = date_time_from_string('2025-06-09T00:00:00Z')
+    servers: Events = client.events.get_events(
+        stream_type=GetEventsStreamType.ADMIN_LOGS.value,
+        limit=1,
+        created_after=created_after_date,
+        created_before=created_before_date,
+    )
+    assert len(servers.entries) == 1