
Commit

Merge branch 'kubeflow:master' into s390x
R3hankhan committed May 1, 2024
2 parents 3dc8b11 + e9d6876 commit 0b5e3a3
Showing 34 changed files with 166 additions and 44 deletions.
53 changes: 53 additions & 0 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -1,5 +1,58 @@
# Changelog

## [2.2.0](https://github.com/kubeflow/pipelines/compare/2.1.0...2.2.0) (2024-04-30)


### Features

* **backend:** add namespace & prefix scoped credentials to kfp-launcher config for object store paths ([\#10625](https://github.com/kubeflow/pipelines/issues/10625)) ([5e0f9b1](https://github.com/kubeflow/pipelines/commit/5e0f9b188e2ff0b312a9a77cb07b792f8ddc6a82))
* **backend:** Merge kfp-tekton backend code ([\#10678](https://github.com/kubeflow/pipelines/issues/10678)) ([60a443e](https://github.com/kubeflow/pipelines/commit/60a443e93b565cc5b1283f291c9b84db201e438f))
* **backend:** Upgrade argo to v3.4.16 ([\#10568](https://github.com/kubeflow/pipelines/issues/10568)) ([809d576](https://github.com/kubeflow/pipelines/commit/809d5766fc9ec436ff05c083e9a2ae65ad2667b7))
* **components:** Add model name preprocess component; Use publisher model if user uploaded model is non-tuned ([084f2c2](https://github.com/kubeflow/pipelines/commit/084f2c22295f92e407c283c0d524ffb693a11a4e))
* **components:** add resolve_machine_spec and resolve_refined_image_uri to rlhf_preprocessor component ([2a8d39e](https://github.com/kubeflow/pipelines/commit/2a8d39ec68affe508008eb2e3c91abe52a198c18))
* **components:** add resolve_reference_model_metadata to rlhf_preprocessor component ([92a7969](https://github.com/kubeflow/pipelines/commit/92a7969318c7439b7f60188837e8a76e012a1945))
* **components:** add task_type as a parameter to rlaif ([64d288a](https://github.com/kubeflow/pipelines/commit/64d288a2f531b1ea0450328304c80d79f0508e14))
* **components:** Added support for text-bison@002 to preview.llm.rlhf_pipeline ([2f27751](https://github.com/kubeflow/pipelines/commit/2f27751d0fd0e4db6eda372605380a2b9225072a))
* **components:** AutoSxS GA pending release ([aee464c](https://github.com/kubeflow/pipelines/commit/aee464c92da2dddadef5c9f7c29e5e58154a9898))
* **components:** Expand regions supported by `preview.llm.rlhf_pipeline` ([22a98d9](https://github.com/kubeflow/pipelines/commit/22a98d9f8de728a18c071bf7fa560bd141b03cbb))
* **components:** internal ([a4f01b7](https://github.com/kubeflow/pipelines/commit/a4f01b70f27bcb1a4318bd1c86282e1957e7324a))
* **components:** Introduce placeholders: SERVICE_ACCOUNT_PLACEHOLDER, NETWORK_PLACEHOLDER, PERSISTENT_RESOURCE_ID_PLACEHOLDER and ENCYRPTION_SPEC_KMS_KEY_NAME_PLACEHOLDER. In addition, use PERSISTENT_RESOURCE_ID_PLACEHOLDER as the default value of persistent_resource_id for CustomTrainingJobOp and create_custom_training_job_op_from_component. With this change, custom job created without explicitly setting persistent_resource_id will inherit job level persistent_resource_id, if Persistent Resource is set as job level runtime ([67d3cd6](https://github.com/kubeflow/pipelines/commit/67d3cd6dbc0569d0050ee11bbcca9bcd80e457fb))
* **components:** migrate function_based convert_to_delimited_string to rlhf_preprocessor component ([efefe34](https://github.com/kubeflow/pipelines/commit/efefe346f0a97004e5bd000c0e68d06e7d8f0b4b))
* **components:** migrate function_based resolve_num_microbatches to rlhf_preprocessor component ([ee28c72](https://github.com/kubeflow/pipelines/commit/ee28c72893a0bbe1963d6b6f158937e1f4a0651d))
* **components:** migrate function_based resolve_regional_endpoint to rlhf_preprocessor component ([f175c71](https://github.com/kubeflow/pipelines/commit/f175c71aea461455451f9de22780be922ae706d3))
* **components:** Move AutoSxS pipeline to v1 directory ([d919ae7](https://github.com/kubeflow/pipelines/commit/d919ae7216b60efdd08441eee64bc18ad8f30e70))
* **components:** Move ModelImportEvaluationOp component to preview namespace ([33db128](https://github.com/kubeflow/pipelines/commit/33db1284f57b5b277c95d4a44b35b1fdd830bd18))
* **components:** Report TensorBoard metrics for `preview.llm.rlhf_pipeline` in real time ([3d8069b](https://github.com/kubeflow/pipelines/commit/3d8069bf2c9c4eecca3df2e45da4d4fa2ed43af5))
* **components:** Use larger base reward model when tuning `t5-xxl` with the `preview.llm.rlhf_pipeline` ([ff7f660](https://github.com/kubeflow/pipelines/commit/ff7f660c3c13e8e9f5f047ae4ee0dfbcebf6bfb8))
* **components:** Use larger base reward model when tuning `text` and `chat` variants of `bison@001` with the `preview.llm.rlhf_pipeline` ([ac39931](https://github.com/kubeflow/pipelines/commit/ac399315e66d6ed2666dc9dbaecbce4938f87356))
* **components:** use rlhf_preprocessor to replace the current value_exists call in rlhf ([c967d9f](https://github.com/kubeflow/pipelines/commit/c967d9f7df0bec5827cdf45ea02d3463d8b17aff))
* **kubernetes_platform:** Update kubernetes_platform go package to include generic ephemeral volume ([\#10602](https://github.com/kubeflow/pipelines/issues/10602)) ([2fc1492](https://github.com/kubeflow/pipelines/commit/2fc1492a0602be7f5aab94d246d4e0bc483de47a))
* **kubernetes_platform:** Update kubernetes_platform go package to include node affinities and pod (anti)affinities ([\#10583](https://github.com/kubeflow/pipelines/issues/10583)) ([4f8cae2](https://github.com/kubeflow/pipelines/commit/4f8cae2a633552d0a6fcc11a24e81fa5077a9fd2))
* **sdk+backend:** Add support for generic ephemeral volume ([\#10605](https://github.com/kubeflow/pipelines/issues/10605)) ([3fb76a8](https://github.com/kubeflow/pipelines/commit/3fb76a8e1590238abd1226ae961c5871bf41f5ef))
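The namespace- and prefix-scoped credentials feature (\#10625) above extends the per-namespace `kfp-launcher` ConfigMap so launchers can resolve object-store credentials by pipeline-root prefix. The following is a rough, hypothetical sketch of such a ConfigMap; the key names (`providers`, `secretRef`, etc.) are approximations for illustration only — consult the merged kfp-launcher configuration schema for the exact structure:

```yaml
# Hypothetical sketch: kfp-launcher ConfigMap with namespace-scoped,
# prefix-scoped object store credentials (field names approximate).
apiVersion: v1
kind: ConfigMap
metadata:
  name: kfp-launcher
  namespace: team-a        # scoped to this profile/namespace
data:
  defaultPipelineRoot: s3://team-a-bucket/pipelines
  providers: |
    s3:
      default:
        endpoint: s3.amazonaws.com
        region: us-east-1
        credentials:
          fromEnv: false
          secretRef:
            secretName: team-a-s3-creds   # Secret in the same namespace
            accessKeyKey: AWS_ACCESS_KEY_ID
            secretKeyKey: AWS_SECRET_ACCESS_KEY
```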


### Bug Fixes

* **backend:** Update backend common code and integration tests with updated API Service Params ([\#10640](https://github.com/kubeflow/pipelines/issues/10640)) ([8b2a099](https://github.com/kubeflow/pipelines/commit/8b2a099e8c9f216a139602be3d349f5b1aab9d2c))
* **Backend + SDK:** Add missing optional field to SecretAsVolume and … ([\#10550](https://github.com/kubeflow/pipelines/issues/10550)) ([a78dc77](https://github.com/kubeflow/pipelines/commit/a78dc77a301c9432f3e2791083b5d99266ae4e55))
* **components:** Ensure `preview.llm.rlhf_pipeline` runs if no `tensorboard_id` is provided ([ff0d0a7](https://github.com/kubeflow/pipelines/commit/ff0d0a7706123d427458e65d98b38d23975204c8))
* **components:** Fix image version parameter in rl pipelines ([cef6e51](https://github.com/kubeflow/pipelines/commit/cef6e510121e9956b9b78126a4f7565cf69b960a))
* **components:** Fix model eval import error in text generation/classification eval pipeline ([7630f85](https://github.com/kubeflow/pipelines/commit/7630f85031269abd8921eb6daed7cf65c19eeac4))
* **components:** Make AutoSxS autorater_prompt_parameters required ([df20088](https://github.com/kubeflow/pipelines/commit/df20088328353fd60e77f20dfc082b577381e5a0))
* **components:** remove default prediction column names in evaluation classification component to fix incorrect column names for bigquery data source ([54f2e45](https://github.com/kubeflow/pipelines/commit/54f2e45375999b2a57b3f7988a61b503dfd70834))
* **components:** Remove the unused functions from function_based ([e052dc8](https://github.com/kubeflow/pipelines/commit/e052dc8daf7c30f362a95ab6eec6a618ae7a9f70))
* **components:** Remove the unused generate_default_instruction and resolve_upload_location from function_based ([e9d8764](https://github.com/kubeflow/pipelines/commit/e9d8764f2066892027528e6bca8ced547f3457e0))
* **components:** Remove the unused resolve_data_paths from function_based ([c386913](https://github.com/kubeflow/pipelines/commit/c3869137d0e55f69f447d5d684a4a85bc7078166))
* **components:** Update service account comment ([bf444ac](https://github.com/kubeflow/pipelines/commit/bf444ac84b5cbee0ab364ae14c3174ee1d74723b))
* **metadata envoy:** upgrade envoy and config from 1.12 to 1.27 ([\#10589](https://github.com/kubeflow/pipelines/issues/10589)) ([96aaad9](https://github.com/kubeflow/pipelines/commit/96aaad9421a0449fa7634959f522964394fc26e9))


### Other Pull Requests

* No public description ([cab99f7](https://github.com/kubeflow/pipelines/commit/cab99f7443bc57abb296ee13ae9c79b4adad1ef5))
* No public description ([79d0a5c](https://github.com/kubeflow/pipelines/commit/79d0a5c4a8d45274d5d7753183cda8864176cdd4))
* Update loop_output.py example for the new parallel loop type requirement ([\#10637](https://github.com/kubeflow/pipelines/issues/10637)) ([afddae9](https://github.com/kubeflow/pipelines/commit/afddae993bb367815f51de45c4dd8e5516e9ac1b))

## [2.1.0](https://github.com/kubeflow/pipelines/compare/2.0.5...2.1.0) (2024-03-25)


4 changes: 2 additions & 2 deletions RELEASE.md
@@ -237,15 +237,15 @@ and then "Retry", because after waiting for the previous step, artifacts are now ready
cd backend/api/v2beta1/python_http_client
rm -r dist
python3 setup.py --quiet sdist
-python3 -m twine upload --username kubeflow-pipelines dist/*
+python3 -m twine upload dist/*
```
1. Release `kfp` python packages to PyPI. (Note: Please skip this step for backend release, this step will be handled by SDK release.)
```bash
pip3 install twine --user
gsutil cp gs://ml-pipeline/release/$VERSION/kfp.tar.gz kfp-$VERSION.tar.gz
-python3 -m twine upload --username kubeflow-pipelines kfp-$VERSION.tar.gz
+python3 -m twine upload kfp-$VERSION.tar.gz
```
!!! The file name must contain the version. See <https://github.com/kubeflow/pipelines/issues/1292>
2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
-2.1.0
+2.2.0
4 changes: 2 additions & 2 deletions backend/api/v1beta1/python_http_client/README.md
@@ -3,8 +3,8 @@ This file contains REST API specification for Kubeflow Pipelines. The file is autogenerated from the swagger definition.

This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

-- API version: 2.1.0
-- Package version: 2.1.0
+- API version: 2.2.0
+- Package version: 2.2.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen
For more information, please visit [https://www.google.com](https://www.google.com)

@@ -14,7 +14,7 @@

from __future__ import absolute_import

-__version__ = "2.1.0"
+__version__ = "2.2.0"

# import apis into sdk package
from kfp_server_api.api.experiment_service_api import ExperimentServiceApi
@@ -78,7 +78,7 @@ def __init__(self, configuration=None, header_name=None, header_value=None,
self.default_headers[header_name] = header_value
self.cookie = cookie
# Set default User-Agent.
-self.user_agent = 'OpenAPI-Generator/2.1.0/python'
+self.user_agent = 'OpenAPI-Generator/2.2.0/python'
self.client_side_validation = configuration.client_side_validation

def __enter__(self):
@@ -351,8 +351,8 @@ def to_debug_report(self):
return "Python SDK Debug Report:\n"\
"OS: {env}\n"\
"Python Version: {pyversion}\n"\
-"Version of the API: 2.1.0\n"\
-"SDK Package Version: 2.1.0".\
+"Version of the API: 2.2.0\n"\
+"SDK Package Version: 2.2.0".\
format(env=sys.platform, pyversion=sys.version)

def get_host_settings(self):
2 changes: 1 addition & 1 deletion backend/api/v1beta1/python_http_client/setup.py
@@ -13,7 +13,7 @@
from setuptools import setup, find_packages # noqa: H301

NAME = "kfp-server-api"
-VERSION = "2.1.0"
+VERSION = "2.2.0"
# To install the library, run the following
#
# python setup.py install
@@ -2,7 +2,7 @@
"swagger": "2.0",
"info": {
"title": "Kubeflow Pipelines API",
-"version": "2.1.0",
+"version": "2.2.0",
"description": "This file contains REST API specification for Kubeflow Pipelines. The file is autogenerated from the swagger definition.",
"contact": {
"name": "google",
4 changes: 2 additions & 2 deletions backend/api/v2beta1/python_http_client/README.md
@@ -3,8 +3,8 @@ This file contains REST API specification for Kubeflow Pipelines. The file is autogenerated from the swagger definition.

This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

-- API version: 2.1.0
-- Package version: 2.1.0
+- API version: 2.2.0
+- Package version: 2.2.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen
For more information, please visit [https://www.google.com](https://www.google.com)

@@ -14,7 +14,7 @@

from __future__ import absolute_import

-__version__ = "2.1.0"
+__version__ = "2.2.0"

# import apis into sdk package
from kfp_server_api.api.auth_service_api import AuthServiceApi
@@ -78,7 +78,7 @@ def __init__(self, configuration=None, header_name=None, header_value=None,
self.default_headers[header_name] = header_value
self.cookie = cookie
# Set default User-Agent.
-self.user_agent = 'OpenAPI-Generator/2.1.0/python'
+self.user_agent = 'OpenAPI-Generator/2.2.0/python'
self.client_side_validation = configuration.client_side_validation

def __enter__(self):
@@ -351,8 +351,8 @@ def to_debug_report(self):
return "Python SDK Debug Report:\n"\
"OS: {env}\n"\
"Python Version: {pyversion}\n"\
-"Version of the API: 2.1.0\n"\
-"SDK Package Version: 2.1.0".\
+"Version of the API: 2.2.0\n"\
+"SDK Package Version: 2.2.0".\
format(env=sys.platform, pyversion=sys.version)

def get_host_settings(self):
2 changes: 1 addition & 1 deletion backend/api/v2beta1/python_http_client/setup.py
@@ -13,7 +13,7 @@
from setuptools import setup, find_packages # noqa: H301

NAME = "kfp-server-api"
-VERSION = "2.1.0"
+VERSION = "2.2.0"
# To install the library, run the following
#
# python setup.py install
@@ -2,7 +2,7 @@
"swagger": "2.0",
"info": {
"title": "Kubeflow Pipelines API",
-"version": "2.1.0",
+"version": "2.2.0",
"description": "This file contains REST API specification for Kubeflow Pipelines. The file is autogenerated from the swagger definition.",
"contact": {
"name": "google",
4 changes: 2 additions & 2 deletions backend/src/v2/compiler/argocompiler/container.go
@@ -27,9 +27,9 @@ import (

const (
volumeNameKFPLauncher = "kfp-launcher"
-DefaultLauncherImage = "gcr.io/ml-pipeline/kfp-launcher@sha256:c639c51cf19749922fe3f750968e7e32c2a418c73e30ddfd7162ba1a16bad0d0"
+DefaultLauncherImage = "gcr.io/ml-pipeline/kfp-launcher@sha256:8fe5e6e4718f20b021736022ad3741ddf2abd82aa58c86ae13e89736fdc3f08f"
LauncherImageEnvVar = "V2_LAUNCHER_IMAGE"
-DefaultDriverImage = "gcr.io/ml-pipeline/kfp-driver@sha256:f308b24f51df1165592563b1892fad50f9faaaf314b4ac0638e37aeee3aa8f2c"
+DefaultDriverImage = "gcr.io/ml-pipeline/kfp-driver@sha256:3c0665cd36aa87e4359a4c8b6271dcba5bdd817815cd0496ed12eb5dde5fd2ec"
DriverImageEnvVar = "V2_DRIVER_IMAGE"
)

3 changes: 3 additions & 0 deletions components/google-cloud/RELEASE.md
@@ -1,5 +1,8 @@
## Upcoming release

## Release 2.15.0
* Add input parameter `autorater_prompt_parameters` to `_implementation.llm.online_evaluation_pairwise` component.

## Release 2.14.0
* Use larger base reward model when tuning `text-bison@001`, `chat-bison@001` and `t5-xxl` with the `preview.llm.rlhf_pipeline`.
* Move `preview.model_evaluation.autosxs_pipeline` to `v1.model_evaluation.autosxs_pipeline`.
@@ -39,6 +39,7 @@ def pipeline(
deploy_model: bool = True,
encryption_spec_key_name: str = '',
upload_location: str = _placeholders.LOCATION_PLACEHOLDER,
+regional_endpoint: str = '',
) -> PipelineOutput:
# fmt: off
"""Uploads a tuned language model and (optionally) deploys it to an endpoint.
@@ -51,16 +52,13 @@
deploy_model: Whether to deploy the model to an endpoint in `us-central1`. Default is True.
encryption_spec_key_name: Customer-managed encryption key. If this is set, then all resources created by the CustomJob will be encrypted with the provided encryption key. Note that this is not supported for TPU at the moment.
upload_location: Region to upload and deploy the model to. Default is the location used to run the pipeline components.
+regional_endpoint: Regional endpoint to upload the model.
Returns:
model_resource_name: Path to the model uploaded to the Model Registry. This will be an empty string if the model was not deployed.
endpoint_resource_name: Path the Online Prediction Endpoint. This will be an empty string if the model was not deployed.
"""
# fmt: on
-regional_endpoint = function_based.resolve_regional_endpoint(
-    upload_location=upload_location
-).set_display_name('Resolve Regional Endpoint')

display_name = (
function_based.resolve_model_display_name(
large_model_reference=large_model_reference,
@@ -76,7 +74,7 @@
upload_task = upload_llm_model.refined_upload_llm_model(
project=_placeholders.PROJECT_ID_PLACEHOLDER,
location=upload_location,
-regional_endpoint=regional_endpoint.output,
+regional_endpoint=regional_endpoint,
artifact_uri=output_adapter_path,
model_display_name=display_name.output,
model_reference_name=large_model_reference,
@@ -93,7 +91,7 @@
location=upload_location,
model_resource_name=upload_task.outputs['model_resource_name'],
display_name=display_name.output,
-regional_endpoint=regional_endpoint.output,
+regional_endpoint=regional_endpoint,
deploy_model=deploy_model.output,
encryption_spec_key_name=encryption_spec_key_name,
).set_display_name('Deploy Model')
@@ -17,4 +17,4 @@
DO NOT EDIT - This file is generated, manual changes will be overridden.
"""

-IMAGE_TAG = '20240425_1734_RC00'
+IMAGE_TAG = '20240430_1158_RC00'
@@ -0,0 +1,49 @@
# Copyright 2024 The Kubeflow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Component that preprocesses inputs for infer pipeline."""

from google_cloud_pipeline_components import _placeholders
from google_cloud_pipeline_components import utils as gcpc_utils
from google_cloud_pipeline_components._implementation.llm import utils
from kfp import dsl


@dsl.container_component
def infer_preprocessor(
gcp_resources: dsl.OutputPath(str), # pytype: disable=invalid-annotation
image_uri: str = utils.get_default_image_uri('refined_cpu', ''),
) -> dsl.ContainerSpec: # pylint: disable=g-doc-args
# fmt: off
"""Preprocess infer pipeline inputs.
Args:
app_name: The preprocessor app name.
Returns:
gcp_resources: GCP resources that can be used to track the custom job.
"""
# fmt: on
return gcpc_utils.build_serverless_customjob_container_spec(
project=_placeholders.PROJECT_ID_PLACEHOLDER,
location=_placeholders.LOCATION_PLACEHOLDER,
custom_job_payload=utils.build_payload(
display_name='infer_preprocessor',
machine_type='n1-standard-4',
image_uri=image_uri,
args=[
'--app_name=infer_preprocessor',
],
),
gcp_resources=gcp_resources,
)
@@ -52,6 +52,7 @@ def online_evaluation_pairwise(
project: str = _placeholders.PROJECT_ID_PLACEHOLDER,
location: str = _placeholders.LOCATION_PLACEHOLDER,
encryption_spec_key_name: str = '',
+autorater_prompt_parameters: Dict[str, Dict[str, str]] = {},
) -> dsl.ContainerSpec: # pylint: disable=g-doc-args
"""Evaluate two models using an autorater.
@@ -73,6 +74,8 @@
encryption_spec_key_name: Customer-managed encryption key options. If this
is set, then all resources created by the component will be encrypted with
the provided encryption key.
+autorater_prompt_parameters: Map of autorater prompt template parameters to
+    columns or templates.
Returns:
judgments: Individual judgments used to calculate the win rates.
@@ -112,6 +115,11 @@ def online_evaluation_pairwise(
'--executor_input={{$.json_escape[1]}}',
f'--kms_key_name={encryption_spec_key_name}',
f'--metadata_path={metadata}',
+(
+    '--autorater_prompt_parameters='
+    "{{$.inputs.parameters['autorater_prompt_parameters']"
+    '.json_escape[0]}}'
+),
],
encryption_spec_key_name=encryption_spec_key_name,
),