# Releases · Open-EO/openeo-python-client
## openEO Python Client v0.37.0

### Added

- Added `show_error_logs` argument to `cube.execute_batch()`/`job.start_and_wait()`/... to toggle the automatic printing of error logs on failure (#505)
- Added `Connection.web_editor()` to build a link to the openEO backend in the openEO Web Editor
- Add support for `log_level` in `create_job()` and `execute_job()` (#704)
- Add initial support for "geometry" dimension type in `CubeMetadata` (#705)
- Add support for parameterized `bands` argument in `load_stac()`
- Argument `spatial_extent` in `load_collection()`/`load_stac()`: add support for Shapely objects, loading GeoJSON from a local path and loading geometry from a GeoJSON/GeoParquet URL (#678), as sketched below
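For illustration, a minimal sketch of the new `spatial_extent` flexibility and the `show_error_logs` toggle; the backend URL, collection id and band names are placeholders:

```python
import shapely.geometry

import openeo

connection = openeo.connect("openeo.example.com").authenticate_oidc()

# `spatial_extent` now also accepts a Shapely geometry,
# next to a bounding box dict, a local GeoJSON path, or a GeoJSON/GeoParquet URL.
aoi = shapely.geometry.box(5.0, 51.0, 5.1, 51.1)
cube = connection.load_collection(
    "SENTINEL2_L2A",
    spatial_extent=aoi,
    temporal_extent=["2024-01-01", "2024-03-01"],
    bands=["B04", "B08"],
)

# `show_error_logs=False` suppresses the automatic printing of error logs on failure.
cube.execute_batch("result.nc", show_error_logs=False)
```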
### Changed

- Raise an exception when providing an empty bands array to `load_collection`/`load_stac` (#424, Open-EO/openeo-processes#372)
- Start showing deprecation warnings on usage of GeoJSON "GeometryCollection" (in `filter_spatial`, `aggregate_spatial`, `chunk_polygon`, `mask_polygon`). Use a GeoJSON FeatureCollection instead, as illustrated below. (#706, Open-EO/openeo-processes#389)
- The `context` parameter is now used in `execute_local_udf` (#556)
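A sketch of the FeatureCollection alternative to the now-deprecated GeometryCollection, reusing the `cube` from the sketch above (the polygon coordinates are made up):

```python
polygon_1 = {"type": "Polygon", "coordinates": [[[5.0, 51.0], [5.1, 51.0], [5.1, 51.1], [5.0, 51.0]]]}
polygon_2 = {"type": "Polygon", "coordinates": [[[5.2, 51.0], [5.3, 51.0], [5.3, 51.1], [5.2, 51.0]]]}

# Deprecated: {"type": "GeometryCollection", "geometries": [polygon_1, polygon_2]}
# Preferred: wrap the geometries in a GeoJSON FeatureCollection.
geometries = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature", "properties": {}, "geometry": geom}
        for geom in [polygon_1, polygon_2]
    ],
}
aggregated = cube.aggregate_spatial(geometries=geometries, reducer="mean")
```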
### Fixed

- Clear capabilities cache on login (#254)
## openEO Python Client v0.36.0

### Added

- Automatically use `load_url` when providing a URL as geometries to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc. (#104, #457)
- Allow specifying `limit` when listing batch jobs with `Connection.list_jobs()` (#677), as sketched below
- Add `additional` and `job_options` arguments to `Connection.download()`, `DataCube.download()` and related (#681)
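A minimal sketch of these additions, reusing the `connection` and `cube` from earlier sketches (the geometry URL is a placeholder):

```python
# Only fetch the 10 most recent batch jobs instead of the full listing.
jobs = connection.list_jobs(limit=10)

# A URL as geometries is now handled through the standard `load_url` process.
masked = cube.mask_polygon("https://example.com/areas.geojson")
```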
### Changed

- `MultiBackendJobManager`: added `costs` as a column in tracking databases (#588)
- When passing a path/string as `geometry` to `DataCube.aggregate_spatial()`, `DataCube.mask_polygon()`, etc., it is no longer automatically translated to deprecated, non-standard `read_vector` usage. Instead, if it is a local GeoJSON file, the GeoJSON data will be loaded directly client-side. (#104, #457)
- Move `read()` method from the general `JobDatabaseInterface` to the more specific `FullDataFrameJobDatabase` (#680)
- Align `additional` and `job_options` arguments in `Connection.create_job()`, `DataCube.create_job()` and related, and follow the official spec more closely (#683, Open-EO/openeo-api#276), as sketched below
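A hedged sketch of the aligned `job_options` handling; the option name used here is a backend-specific example, not part of the official openEO API:

```python
job = cube.create_job(
    title="NDVI extraction",
    # Passed verbatim to the backend in the job creation request:
    job_options={"driver-memory": "2g"},
)
```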
## openEO Python Client v0.35.0

### Added

- Added `MultiResult` helper class to build process graphs with multiple result nodes (#391), as sketched below
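A sketch of the `MultiResult` helper; the exact import path and `create_job` usage are assumptions based on the note above, and the backend/collection names are placeholders:

```python
import openeo
from openeo import MultiResult  # import path assumed

connection = openeo.connect("openeo.example.com").authenticate_oidc()
cube = connection.load_collection("SENTINEL2_L2A", bands=["B04"])

# Two result nodes in a single process graph, e.g. to save in two formats:
save_tiff = cube.save_result(format="GTiff")
save_netcdf = cube.save_result(format="netCDF")
job = connection.create_job(MultiResult([save_tiff, save_netcdf]))
```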
### Fixed

- `MultiBackendJobManager`: fix issue with duplicate job starting across multiple backends (#654)
- `MultiBackendJobManager`: fix encoding issue of job metadata in `on_job_done` (#657)
- `MultiBackendJobManager`: avoid `SettingWithCopyWarning` (#641)
- Avoid creating an empty file if an asset download request failed
- `MultiBackendJobManager`: avoid dtype loading mistakes in `CsvJobDatabase` on empty columns (#656)
- `MultiBackendJobManager`: restore logging of job status histogram during `run_jobs` (#655)
## openEO Python Client v0.34.0
## openEO Python Client v0.33.0

### Added

- Added `DataCube.load_stac()` to also support creating a `load_stac`-based cube without a connection (#638)
- `MultiBackendJobManager`: added `initialize_from_df(df)` (to `CsvJobDatabase` and `ParquetJobDatabase`) to initialize (and persist) the job database from a given DataFrame. Also added `create_job_db()` factory to easily create a job database from a given dataframe, with its type guessed from the filename extension. (#635) See the sketch below.
- `MultiBackendJobManager.run_jobs()` now returns a dictionary with counters/stats about various events during the full run of the job manager (#645)
- Added (experimental) `ProcessBasedJobCreator` to be used as `start_job` callable with `MultiBackendJobManager` to create multiple jobs from a single parameterized process (e.g. a UDP or remote process definition) (#604)
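A sketch of driving `MultiBackendJobManager` from a DataFrame-backed job database; the backend URL, collection id, `start_job` signature and the exact `create_job_db()` signature are assumptions based on the notes above:

```python
import pandas as pd

import openeo
from openeo.extra.job_management import MultiBackendJobManager, create_job_db

# One row per job to create; extra columns are up to the user.
df = pd.DataFrame({"year": [2021, 2022, 2023]})

# Job database type (CSV vs Parquet) is guessed from the filename extension.
job_db = create_job_db("jobs.csv", df=df)


def start_job(row, connection, **kwargs):
    # Placeholder job factory: build a cube per DataFrame row.
    cube = connection.load_collection(
        "SENTINEL2_L2A",
        temporal_extent=[f"{row.year}-01-01", f"{row.year + 1}-01-01"],
    )
    return cube.create_job(title=f"job {row.year}")


manager = MultiBackendJobManager()
manager.add_backend("example", connection=openeo.connect("openeo.example.com"))
# Since this release, `run_jobs()` returns a dict with counters/stats.
stats = manager.run_jobs(job_db=job_db, start_job=start_job)
print(stats)
```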
### Fixed

- When using `DataCube.load_collection()` without a connection, it is no longer necessary to also explicitly set `fetch_metadata=False` (#638), see the sketch below
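A sketch of connection-less cube construction (the STAC URL is a placeholder):

```python
from openeo.rest.datacube import DataCube

# Build a process graph without any backend connection:
cube = DataCube.load_stac("https://example.com/stac/collection.json")
print(cube.to_json())  # inspect the client-side process graph
```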
## openEO Python Client v0.32.0

### Added

- `load_stac`/`metadata_from_stac`: add support for extracting actual temporal dimension metadata (#567)
- `MultiBackendJobManager`: add `cancel_running_job_after` option to automatically cancel jobs that are running for too long (#590)
- Added `openeo.api.process.Parameter` helper to easily create a "spatial_extent" UDP parameter
- Wrap OIDC token request failure in a more descriptive `OidcException` (related to #624)
- Added `auto_add_save_result` option (on by default) to disable automatic addition of a `save_result` node on `download`/`create_job`/`execute_batch` (#513), as sketched below
- Add support for `apply_vectorcube` UDF signature in `run_udf_code` (Open-EO/openeo-geopyspark-driver#881, Open-EO/openeo-geopyspark-driver#811)
- `MultiBackendJobManager`: add API to run the update loop in a separate thread, allowing controlled interruption
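A sketch of opting out of the automatic `save_result` addition, reusing the `cube` from earlier sketches:

```python
# Add the `save_result` node explicitly...
saved = cube.save_result(format="netCDF")
# ...and tell `download` not to add another one implicitly:
saved.download("result.nc", auto_add_save_result=False)
```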
Changed
MultiBackendJobManager
: changed job metadata storage API, to enable working with large databasesDataCube.apply_polygon()
: renamepolygons
argument togeometries
, but keep support for legacypolygons
for now (#592, #511)- Disallow ambiguous single string argument in
DataCube.filter_temporal()
(#628) - Automatic adding of
save_result
fromdownload()
orcreate_job()
: inspect whole process graph for pre-existingsave_result
nodes (related to #623, #401, #583) - Disallow ambiguity of combining explicit
save_result
nodes and implicitsave_result
addition fromdownload()
/create_job()
calls withformat
(related to #623, #401, #583)
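To illustrate the `filter_temporal` change: a bare single string is now rejected as ambiguous, so be explicit instead (reusing the `cube` from earlier sketches):

```python
# Ambiguous, now disallowed:
#   cube.filter_temporal("2024-03-01")

# Unambiguous alternatives:
summer = cube.filter_temporal("2024-03-01", "2024-09-01")
open_ended = cube.filter_temporal(start_date="2024-03-01")
```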
### Fixed

- `apply_dimension` with a `target_dimension` argument was not correctly adjusting datacube metadata on the client side, causing a mismatch
- Preserve non-spatial dimension metadata in `aggregate_spatial` (#612)
## openEO Python Client v0.31.0

### Added

- Add experimental `openeo.testing.results` subpackage with reusable test utilities for comparing batch job results with reference data
- `MultiBackendJobManager`: add initial support for storing job metadata in a Parquet file (instead of CSV) (#571)
- Add `Connection.authenticate_oidc_access_token()` to set up authorization headers with an access token that is obtained "out-of-band" (#598), as sketched below
- Add `JobDatabaseInterface` to allow custom job metadata storage with `MultiBackendJobManager` (#571)
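A minimal sketch of the "out-of-band" access token flow (the URL and token value are placeholders):

```python
import openeo

connection = openeo.connect("openeo.example.com")
# Token obtained outside of the client, e.g. from a CI secret
# or an external OIDC flow:
connection.authenticate_oidc_access_token("eyJhbGciOi...")
```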
## openEO Python Client v0.30.0

### Added

- Add `openeo.udf.run_code.extract_udf_dependencies()` to extract UDF dependency declarations from UDF code (related to Open-EO/openeo-geopyspark-driver#237)
- Document PEP 723 based Python UDF dependency declarations (Open-EO/openeo-geopyspark-driver#237), as sketched below
- Added more `openeo.api.process.Parameter` helpers to easily create "bounding_box", "date", "datetime", "geojson" and "temporal_interval" parameters for UDP construction
- Added convenience method `Connection.load_stac_from_job(job)` to easily load the results of a batch job with the `load_stac` process (#566)
- `load_stac`/`metadata_from_stac`: add support for extracting band info from "item_assets" in collection metadata (#573)
- Added initial `openeo.testing` submodule for reusable test utilities
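A sketch of a PEP 723 dependency block in UDF code; the exact return value of `extract_udf_dependencies()` is an assumption based on the notes above:

```python
from openeo.udf.run_code import extract_udf_dependencies

# A UDF with a PEP 723 "inline script metadata" block declaring dependencies:
udf_code = '''
# /// script
# dependencies = ["geopandas", "rasterio"]
# ///
import xarray


def apply_datacube(cube: xarray.DataArray, context: dict) -> xarray.DataArray:
    return cube
'''

print(extract_udf_dependencies(udf_code))  # e.g. ["geopandas", "rasterio"]
```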
### Fixed

- Initial fix for broken `DataCube.reduce_temporal()` after `load_stac` (#568)
## openEO Python Client v0.29.0
## openEO Python Client v0.28.0

### Added

- Introduced superclass `CubeMetadata` for `CollectionMetadata`, for essential metadata handling (just dimensions for now) without collection-specific STAC metadata parsing (#464)
- Added `VectorCube.vector_to_raster()` (#550)
Changed
- Changed default
chunk_size
of variousdownload
functions from None to 10MB. This improves the handling of large downloads and reduces memory usage. (#528) Connection.execute()
andDataCube.execute()
now have aauto_decode
argument. If set to True (default) the response will be decoded as a JSON and throw an exception if this fails, if set to False the rawrequests.Response
object will be returned. (#499)
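A minimal sketch of the `auto_decode` option, reusing the `cube` from earlier sketches:

```python
# Keep the raw `requests.Response` instead of decoded JSON,
# e.g. to inspect headers or handle non-JSON payloads yourself:
response = cube.execute(auto_decode=False)
print(response.status_code, response.headers.get("Content-Type"))
```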
### Fixed

- Preserve geo-referenced `x` and `y` coordinates in `execute_local_udf` (#549)