
release: 0.2.0-alpha.42 #407

Merged (9 commits) on Dec 19, 2024
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "0.2.0-alpha.41"
".": "0.2.0-alpha.42"
}
23 changes: 23 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,29 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## 0.2.0-alpha.42 (2024-12-18)

Full Changelog: [v0.2.0-alpha.41...v0.2.0-alpha.42](https://github.com/openlayer-ai/openlayer-python/compare/v0.2.0-alpha.41...v0.2.0-alpha.42)

### Features

* **api:** api update ([#412](https://github.com/openlayer-ai/openlayer-python/issues/412)) ([f6ca1fc](https://github.com/openlayer-ai/openlayer-python/commit/f6ca1fcbc7ed85d6e3bdc635b8f7a4796c943e2a))


### Chores

* **internal:** codegen related update ([#406](https://github.com/openlayer-ai/openlayer-python/issues/406)) ([3360b9e](https://github.com/openlayer-ai/openlayer-python/commit/3360b9e6f6037c7bc9ce877f7ae430ca249e9b95))
* **internal:** codegen related update ([#408](https://github.com/openlayer-ai/openlayer-python/issues/408)) ([9bab516](https://github.com/openlayer-ai/openlayer-python/commit/9bab5168085e325ac7b8b4f07643f39ef564d78d))
* **internal:** codegen related update ([#409](https://github.com/openlayer-ai/openlayer-python/issues/409)) ([f59c50e](https://github.com/openlayer-ai/openlayer-python/commit/f59c50ebd7b298536f0a6a92437630551074e172))
* **internal:** codegen related update ([#410](https://github.com/openlayer-ai/openlayer-python/issues/410)) ([7e4304a](https://github.com/openlayer-ai/openlayer-python/commit/7e4304a87d8330fc15b099a078412f0dbab78842))
* **internal:** fix some typos ([#414](https://github.com/openlayer-ai/openlayer-python/issues/414)) ([1009b11](https://github.com/openlayer-ai/openlayer-python/commit/1009b11b627a4236137c76543e2a09cc4fc78557))
* **internal:** updated imports ([#411](https://github.com/openlayer-ai/openlayer-python/issues/411)) ([90c6218](https://github.com/openlayer-ai/openlayer-python/commit/90c6218e0a9929f8672da20f1871f20aab9bb500))


### Documentation

* **readme:** example snippet for client context manager ([#413](https://github.com/openlayer-ai/openlayer-python/issues/413)) ([4ef9f75](https://github.com/openlayer-ai/openlayer-python/commit/4ef9f75dfea53f198af9768414b51027ec9bd553))

## 0.2.0-alpha.41 (2024-12-13)

Full Changelog: [v0.2.0-alpha.40...v0.2.0-alpha.41](https://github.com/openlayer-ai/openlayer-python/compare/v0.2.0-alpha.40...v0.2.0-alpha.41)
10 changes: 10 additions & 0 deletions README.md
@@ -414,6 +414,16 @@ client.with_options(http_client=DefaultHttpxClient(...))

By default, the library closes underlying HTTP connections whenever the client is [garbage collected](https://docs.python.org/3/reference/datamodel.html#object.__del__). You can close the client manually with the `.close()` method if desired, or use it as a context manager so that it closes on exit.

```py
from openlayer import Openlayer

with Openlayer() as client:
# make requests here
...

# HTTP client is now closed
```
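The close-on-exit behavior that the snippet above relies on can be sketched with a minimal stand-in class. This is illustrative only: `DemoClient` is a hypothetical example, not the real `Openlayer` client, which closes an underlying `httpx` connection pool.

```python
# Minimal stand-in for the context-manager pattern the README snippet uses.
# DemoClient is a hypothetical illustration, not the real Openlayer client.
class DemoClient:
    def __init__(self) -> None:
        self.closed = False

    def close(self) -> None:
        # the real client releases its HTTP connection pool here
        self.closed = True

    def __enter__(self) -> "DemoClient":
        return self

    def __exit__(self, exc_type, exc, tb) -> None:
        # runs on normal exit and when an exception propagates
        self.close()

with DemoClient() as client:
    assert not client.closed  # still usable inside the block
assert client.closed  # resources released on exit
```

Because `__exit__` runs even when the block raises, the context-manager form guarantees cleanup in a way that relying on garbage collection does not.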

## Versioning

This package generally follows [SemVer](https://semver.org/spec/v2.0.0.html) conventions, though certain backwards-incompatible changes may be released as minor versions:
2 changes: 1 addition & 1 deletion api.md
@@ -61,7 +61,7 @@ from openlayer.types import InferencePipelineRetrieveResponse, InferencePipeline

Methods:

- <code title="get /inference-pipelines/{inferencePipelineId}">client.inference_pipelines.<a href="./src/openlayer/resources/inference_pipelines/inference_pipelines.py">retrieve</a>(inference_pipeline_id) -> <a href="./src/openlayer/types/inference_pipeline_retrieve_response.py">InferencePipelineRetrieveResponse</a></code>
- <code title="get /inference-pipelines/{inferencePipelineId}">client.inference_pipelines.<a href="./src/openlayer/resources/inference_pipelines/inference_pipelines.py">retrieve</a>(inference_pipeline_id, \*\*<a href="src/openlayer/types/inference_pipeline_retrieve_params.py">params</a>) -> <a href="./src/openlayer/types/inference_pipeline_retrieve_response.py">InferencePipelineRetrieveResponse</a></code>
- <code title="put /inference-pipelines/{inferencePipelineId}">client.inference_pipelines.<a href="./src/openlayer/resources/inference_pipelines/inference_pipelines.py">update</a>(inference_pipeline_id, \*\*<a href="src/openlayer/types/inference_pipeline_update_params.py">params</a>) -> <a href="./src/openlayer/types/inference_pipeline_update_response.py">InferencePipelineUpdateResponse</a></code>
- <code title="delete /inference-pipelines/{inferencePipelineId}">client.inference_pipelines.<a href="./src/openlayer/resources/inference_pipelines/inference_pipelines.py">delete</a>(inference_pipeline_id) -> None</code>

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "openlayer"
version = "0.2.0-alpha.41"
version = "0.2.0-alpha.42"
description = "The official Python library for the openlayer API"
dynamic = ["readme"]
license = "Apache-2.0"
77 changes: 43 additions & 34 deletions src/openlayer/_client.py
@@ -8,7 +8,7 @@

import httpx

from . import resources, _exceptions
from . import _exceptions
from ._qs import Querystring
from ._types import (
NOT_GIVEN,
@@ -32,13 +32,16 @@
SyncAPIClient,
AsyncAPIClient,
)
from .resources.commits import commits
from .resources.storage import storage
from .resources.projects import projects
from .resources.inference_pipelines import inference_pipelines

__all__ = [
"Timeout",
"Transport",
"ProxiesTypes",
"RequestOptions",
"resources",
"Openlayer",
"AsyncOpenlayer",
"Client",
@@ -47,10 +50,10 @@


class Openlayer(SyncAPIClient):
projects: resources.ProjectsResource
commits: resources.CommitsResource
inference_pipelines: resources.InferencePipelinesResource
storage: resources.StorageResource
projects: projects.ProjectsResource
commits: commits.CommitsResource
inference_pipelines: inference_pipelines.InferencePipelinesResource
storage: storage.StorageResource
with_raw_response: OpenlayerWithRawResponse
with_streaming_response: OpenlayerWithStreamedResponse

@@ -104,10 +107,10 @@ def __init__(
_strict_response_validation=_strict_response_validation,
)

self.projects = resources.ProjectsResource(self)
self.commits = resources.CommitsResource(self)
self.inference_pipelines = resources.InferencePipelinesResource(self)
self.storage = resources.StorageResource(self)
self.projects = projects.ProjectsResource(self)
self.commits = commits.CommitsResource(self)
self.inference_pipelines = inference_pipelines.InferencePipelinesResource(self)
self.storage = storage.StorageResource(self)
self.with_raw_response = OpenlayerWithRawResponse(self)
self.with_streaming_response = OpenlayerWithStreamedResponse(self)

@@ -230,10 +233,10 @@ def _make_status_error(


class AsyncOpenlayer(AsyncAPIClient):
projects: resources.AsyncProjectsResource
commits: resources.AsyncCommitsResource
inference_pipelines: resources.AsyncInferencePipelinesResource
storage: resources.AsyncStorageResource
projects: projects.AsyncProjectsResource
commits: commits.AsyncCommitsResource
inference_pipelines: inference_pipelines.AsyncInferencePipelinesResource
storage: storage.AsyncStorageResource
with_raw_response: AsyncOpenlayerWithRawResponse
with_streaming_response: AsyncOpenlayerWithStreamedResponse

@@ -287,10 +290,10 @@ def __init__(
_strict_response_validation=_strict_response_validation,
)

self.projects = resources.AsyncProjectsResource(self)
self.commits = resources.AsyncCommitsResource(self)
self.inference_pipelines = resources.AsyncInferencePipelinesResource(self)
self.storage = resources.AsyncStorageResource(self)
self.projects = projects.AsyncProjectsResource(self)
self.commits = commits.AsyncCommitsResource(self)
self.inference_pipelines = inference_pipelines.AsyncInferencePipelinesResource(self)
self.storage = storage.AsyncStorageResource(self)
self.with_raw_response = AsyncOpenlayerWithRawResponse(self)
self.with_streaming_response = AsyncOpenlayerWithStreamedResponse(self)

@@ -414,36 +417,42 @@ def _make_status_error(

class OpenlayerWithRawResponse:
def __init__(self, client: Openlayer) -> None:
self.projects = resources.ProjectsResourceWithRawResponse(client.projects)
self.commits = resources.CommitsResourceWithRawResponse(client.commits)
self.inference_pipelines = resources.InferencePipelinesResourceWithRawResponse(client.inference_pipelines)
self.storage = resources.StorageResourceWithRawResponse(client.storage)
self.projects = projects.ProjectsResourceWithRawResponse(client.projects)
self.commits = commits.CommitsResourceWithRawResponse(client.commits)
self.inference_pipelines = inference_pipelines.InferencePipelinesResourceWithRawResponse(
client.inference_pipelines
)
self.storage = storage.StorageResourceWithRawResponse(client.storage)


class AsyncOpenlayerWithRawResponse:
def __init__(self, client: AsyncOpenlayer) -> None:
self.projects = resources.AsyncProjectsResourceWithRawResponse(client.projects)
self.commits = resources.AsyncCommitsResourceWithRawResponse(client.commits)
self.inference_pipelines = resources.AsyncInferencePipelinesResourceWithRawResponse(client.inference_pipelines)
self.storage = resources.AsyncStorageResourceWithRawResponse(client.storage)
self.projects = projects.AsyncProjectsResourceWithRawResponse(client.projects)
self.commits = commits.AsyncCommitsResourceWithRawResponse(client.commits)
self.inference_pipelines = inference_pipelines.AsyncInferencePipelinesResourceWithRawResponse(
client.inference_pipelines
)
self.storage = storage.AsyncStorageResourceWithRawResponse(client.storage)


class OpenlayerWithStreamedResponse:
def __init__(self, client: Openlayer) -> None:
self.projects = resources.ProjectsResourceWithStreamingResponse(client.projects)
self.commits = resources.CommitsResourceWithStreamingResponse(client.commits)
self.inference_pipelines = resources.InferencePipelinesResourceWithStreamingResponse(client.inference_pipelines)
self.storage = resources.StorageResourceWithStreamingResponse(client.storage)
self.projects = projects.ProjectsResourceWithStreamingResponse(client.projects)
self.commits = commits.CommitsResourceWithStreamingResponse(client.commits)
self.inference_pipelines = inference_pipelines.InferencePipelinesResourceWithStreamingResponse(
client.inference_pipelines
)
self.storage = storage.StorageResourceWithStreamingResponse(client.storage)


class AsyncOpenlayerWithStreamedResponse:
def __init__(self, client: AsyncOpenlayer) -> None:
self.projects = resources.AsyncProjectsResourceWithStreamingResponse(client.projects)
self.commits = resources.AsyncCommitsResourceWithStreamingResponse(client.commits)
self.inference_pipelines = resources.AsyncInferencePipelinesResourceWithStreamingResponse(
self.projects = projects.AsyncProjectsResourceWithStreamingResponse(client.projects)
self.commits = commits.AsyncCommitsResourceWithStreamingResponse(client.commits)
self.inference_pipelines = inference_pipelines.AsyncInferencePipelinesResourceWithStreamingResponse(
client.inference_pipelines
)
self.storage = resources.AsyncStorageResourceWithStreamingResponse(client.storage)
self.storage = storage.AsyncStorageResourceWithStreamingResponse(client.storage)


Client = Openlayer
2 changes: 1 addition & 1 deletion src/openlayer/_version.py
@@ -1,4 +1,4 @@
# File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

__title__ = "openlayer"
__version__ = "0.2.0-alpha.41" # x-release-please-version
__version__ = "0.2.0-alpha.42" # x-release-please-version
27 changes: 23 additions & 4 deletions src/openlayer/resources/inference_pipelines/inference_pipelines.py
@@ -2,7 +2,8 @@

from __future__ import annotations

from typing import Optional
from typing import List, Optional
from typing_extensions import Literal

import httpx

@@ -22,7 +23,7 @@
RowsResourceWithStreamingResponse,
AsyncRowsResourceWithStreamingResponse,
)
from ...types import inference_pipeline_update_params
from ...types import inference_pipeline_update_params, inference_pipeline_retrieve_params
from ..._types import NOT_GIVEN, Body, Query, Headers, NoneType, NotGiven
from ..._utils import (
maybe_transform,
@@ -87,6 +88,7 @@ def retrieve(
self,
inference_pipeline_id: str,
*,
expand: List[Literal["project", "workspace"]] | NotGiven = NOT_GIVEN,
# Use the following arguments if you need to pass additional parameters to the API that aren't available via kwargs.
# The extra values given here take precedence over values defined on the client or passed to this method.
extra_headers: Headers | None = None,
@@ -98,6 +100,8 @@ def retrieve(
Retrieve inference pipeline.

Args:
expand: Expand specific nested objects.

extra_headers: Send extra headers

extra_query: Add additional query parameters to the request
@@ -113,7 +117,13 @@
return self._get(
f"/inference-pipelines/{inference_pipeline_id}",
options=make_request_options(
extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout
extra_headers=extra_headers,
extra_query=extra_query,
extra_body=extra_body,
timeout=timeout,
query=maybe_transform(
{"expand": expand}, inference_pipeline_retrieve_params.InferencePipelineRetrieveParams
),
),
cast_to=InferencePipelineRetrieveResponse,
)
@@ -244,6 +254,7 @@ async def retrieve(
self,
inference_pipeline_id: str,
*,
expand: List[Literal["project", "workspace"]] | NotGiven = NOT_GIVEN,
# Use the following arguments if you need to pass additional parameters to the API that aren't available via kwargs.
# The extra values given here take precedence over values defined on the client or passed to this method.
extra_headers: Headers | None = None,
@@ -255,6 +266,8 @@ async def retrieve(
Retrieve inference pipeline.

Args:
expand: Expand specific nested objects.

extra_headers: Send extra headers

extra_query: Add additional query parameters to the request
@@ -270,7 +283,13 @@
return await self._get(
f"/inference-pipelines/{inference_pipeline_id}",
options=make_request_options(
extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout
extra_headers=extra_headers,
extra_query=extra_query,
extra_body=extra_body,
timeout=timeout,
query=await async_maybe_transform(
{"expand": expand}, inference_pipeline_retrieve_params.InferencePipelineRetrieveParams
),
),
cast_to=InferencePipelineRetrieveResponse,
)
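The `maybe_transform` calls above filter out parameters the caller never supplied before the query string is built, so an omitted `expand` never reaches the wire. A rough stdlib-only sketch of that idea follows; the `NOT_GIVEN` sentinel and `build_query` helper are illustrative stand-ins for the library's internals, and the comma-joined list serialization is an assumption, not necessarily the actual wire format.

```python
from urllib.parse import urlencode

NOT_GIVEN = object()  # stand-in for openlayer's NotGiven sentinel

def build_query(params: dict) -> str:
    # drop parameters the caller never supplied, mirroring what
    # maybe_transform-style helpers do before the request is built
    given = {k: v for k, v in params.items() if v is not NOT_GIVEN}
    # assume list values serialize comma-joined; the real format may differ
    flat = {k: ",".join(v) if isinstance(v, list) else v for k, v in given.items()}
    return urlencode(flat)

print(build_query({"expand": ["project", "workspace"]}))  # expand=project%2Cworkspace
print(build_query({"expand": NOT_GIVEN}))  # empty string: parameter omitted entirely
```

Using a dedicated sentinel rather than `None` lets the client distinguish "not passed" from "explicitly passed as null", which is why the signatures above default to `NOT_GIVEN` instead of `None`.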
8 changes: 8 additions & 0 deletions src/openlayer/resources/projects/inference_pipelines.py
@@ -53,6 +53,8 @@ def create(
*,
description: Optional[str],
name: str,
project: Optional[inference_pipeline_create_params.Project] | NotGiven = NOT_GIVEN,
workspace: Optional[inference_pipeline_create_params.Workspace] | NotGiven = NOT_GIVEN,
# Use the following arguments if you need to pass additional parameters to the API that aren't available via kwargs.
# The extra values given here take precedence over values defined on the client or passed to this method.
extra_headers: Headers | None = None,
@@ -84,6 +86,8 @@ def create(
{
"description": description,
"name": name,
"project": project,
"workspace": workspace,
},
inference_pipeline_create_params.InferencePipelineCreateParams,
),
@@ -173,6 +177,8 @@ async def create(
*,
description: Optional[str],
name: str,
project: Optional[inference_pipeline_create_params.Project] | NotGiven = NOT_GIVEN,
workspace: Optional[inference_pipeline_create_params.Workspace] | NotGiven = NOT_GIVEN,
# Use the following arguments if you need to pass additional parameters to the API that aren't available via kwargs.
# The extra values given here take precedence over values defined on the client or passed to this method.
extra_headers: Headers | None = None,
@@ -204,6 +210,8 @@ async def create(
{
"description": description,
"name": name,
"project": project,
"workspace": workspace,
},
inference_pipeline_create_params.InferencePipelineCreateParams,
),
1 change: 1 addition & 0 deletions src/openlayer/types/__init__.py
@@ -7,5 +7,6 @@
from .project_list_response import ProjectListResponse as ProjectListResponse
from .project_create_response import ProjectCreateResponse as ProjectCreateResponse
from .inference_pipeline_update_params import InferencePipelineUpdateParams as InferencePipelineUpdateParams
from .inference_pipeline_retrieve_params import InferencePipelineRetrieveParams as InferencePipelineRetrieveParams
from .inference_pipeline_update_response import InferencePipelineUpdateResponse as InferencePipelineUpdateResponse
from .inference_pipeline_retrieve_response import InferencePipelineRetrieveResponse as InferencePipelineRetrieveResponse
13 changes: 13 additions & 0 deletions src/openlayer/types/inference_pipeline_retrieve_params.py
@@ -0,0 +1,13 @@
# File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

from __future__ import annotations

from typing import List
from typing_extensions import Literal, TypedDict

__all__ = ["InferencePipelineRetrieveParams"]


class InferencePipelineRetrieveParams(TypedDict, total=False):
expand: List[Literal["project", "workspace"]]
"""Expand specific nested objects."""