Commit: Bump version

dolfinus committed Aug 29, 2024
1 parent a7d4f40 commit cf1dca4
Showing 12 changed files with 55 additions and 26 deletions.
53 changes: 53 additions & 0 deletions docs/changelog/0.12.0.rst
@@ -0,0 +1,53 @@
0.12.0 (2024-08-29)
===================

Breaking Changes
----------------

- Change connection URL used for generating HWM names of S3 and Samba sources:
* ``smb://host:port`` -> ``smb://host:port/share``
* ``s3://host:port`` -> ``s3://host:port/bucket`` (:github:pull:`304`)

- Update ``Excel`` package from ``0.20.3`` to ``0.20.4``, to include Spark 3.5.1 support. (:github:pull:`306`)

Features
--------

- Add support for specifying file formats (``ORC``, ``Parquet``, ``CSV``, etc.) in ``HiveWriteOptions.format`` (:github:pull:`292`):

  .. code:: python

      Hive.WriteOptions(format=ORC(compression="snappy"))

- Collect Spark execution metrics in the following methods, and log them in DEBUG mode:
* ``DBWriter.run()``
* ``FileDFWriter.run()``
* ``Hive.sql()``
* ``Hive.execute()``

  This is implemented using a custom ``SparkListener`` which wraps the entire method call and
  then reports the collected metrics. However, these metrics may sometimes be missing due to Spark's architecture,
  so they are not a reliable source of information. That's why the logs are printed only in DEBUG mode,
  and the metrics are not returned as part of the method call result. (:github:pull:`303`)
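
  The wrap-and-report flow described above can be sketched in plain Python. This is an illustrative stand-in, not onETL's actual ``SparkListener`` implementation; all names below (``RecordingListener``, ``collect_metrics``, ``bus``) are hypothetical:

  ```python
  from contextlib import contextmanager


  class RecordingListener:
      """Toy stand-in for a SparkListener: counts "task end" events."""

      def __init__(self):
          self.tasks_completed = 0

      def on_task_end(self):
          self.tasks_completed += 1


  @contextmanager
  def collect_metrics(bus):
      """Register a listener for the duration of the wrapped call, then unregister."""
      listener = RecordingListener()
      bus.append(listener)
      try:
          yield listener
      finally:
          bus.remove(listener)  # always unregister, even if the wrapped call fails


  bus = []  # stands in for the Spark listener bus
  with collect_metrics(bus) as metrics:
      bus[0].on_task_end()  # simulate Spark delivering a task-end event
  ```

  The context manager mirrors the "wraps the entire method call" idea: registration and cleanup bracket the call, and whatever events happened to arrive in between are all the metrics you get.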

- Generate a default ``jobDescription`` based on the currently executed method. Examples:
* ``DBWriter() -> Postgres[host:5432/database]``
* ``MongoDB[localhost:27017/admin] -> DBReader.run()``
* ``Hive[cluster].execute()``

  If the user has already set a custom ``jobDescription``, it is left intact. (:github:pull:`304`)
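
  The "left intact" behaviour can be illustrated with a minimal sketch. The helper is hypothetical; ``props`` merely stands in for Spark's local properties, and only the key name mirrors the one Spark itself uses:

  ```python
  def apply_default_job_description(props: dict, default: str) -> None:
      """Fill in a default job description only when the user has not set one."""
      key = "spark.job.description"  # the property name Spark itself uses
      if props.get(key) is None:
          props[key] = default


  user_props = {"spark.job.description": "my custom description"}
  apply_default_job_description(user_props, "Hive[cluster].execute()")
  # user_props still holds "my custom description" - the user-set value wins
  ```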

- Add a ``log.info`` message about JDBC dialect usage (:github:pull:`305`):

  .. code:: text

      |MySQL| Detected dialect: 'org.apache.spark.sql.jdbc.MySQLDialect'
- Log estimated size of in-memory dataframe created by ``JDBC.fetch`` and ``JDBC.execute`` methods. (:github:pull:`303`)
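
  As a toy illustration of the idea of estimating the in-memory footprint of fetched rows on the driver (the helper below is hypothetical; onETL relies on Spark internals for the real estimate):

  ```python
  import sys


  def estimate_rows_size(rows) -> int:
      """Very rough byte estimate: container, per-row, and per-cell overheads."""
      total = sys.getsizeof(rows)
      for row in rows:
          total += sys.getsizeof(row)
          total += sum(sys.getsizeof(cell) for cell in row)
      return total


  rows = [(1, "alice"), (2, "bob")]
  estimated = estimate_rows_size(rows)
  ```

  Such estimates are inherently approximate (shared objects are double-counted, interning is ignored), which is consistent with logging them for diagnostics rather than relying on them.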


Bug Fixes
---------

- Fix passing ``Greenplum(extra={"options": ...})`` during read/write operations. (:github:pull:`308`)
- Do not raise an exception if a yield-based hook has something past its first (and only) ``yield``.
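
  The intended yield-based hook behaviour can be sketched with a plain generator. The names here are hypothetical, not onETL's actual hook API:

  ```python
  def call_with_hook(hook_gen_func, func, *args, **kwargs):
      """Run a generator-based hook around `func`: code before `yield` runs first,
      code after `yield` runs once the wrapped call returns."""
      gen = hook_gen_func(*args, **kwargs)
      next(gen)  # advance to the yield: the "before" part
      result = func(*args, **kwargs)
      try:
          gen.send(result)  # resume past the yield: the "after" part
      except StopIteration:
          pass  # the generator simply finishing is expected, not an error
      return result


  events = []


  def my_hook(x):
      events.append("before")
      yield
      events.append("after")  # code past the single yield must not raise


  result = call_with_hook(my_hook, lambda x: x * 2, 5)
  ```

  The ``except StopIteration`` branch is the crux of the fix being described: a hook whose generator ends after its single ``yield`` is a normal completion, not an error.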
1 change: 1 addition & 0 deletions docs/changelog/index.rst
@@ -3,6 +3,7 @@
:caption: Changelog

DRAFT
0.12.0
0.11.1
0.11.0
0.10.2
1 change: 0 additions & 1 deletion docs/changelog/next_release/+yield.feature.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/changelog/next_release/292.feature.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/changelog/next_release/303.feature.1.rst

This file was deleted.

10 changes: 0 additions & 10 deletions docs/changelog/next_release/303.feature.2.rst

This file was deleted.

3 changes: 0 additions & 3 deletions docs/changelog/next_release/304.breaking.rst

This file was deleted.

6 changes: 0 additions & 6 deletions docs/changelog/next_release/304.feature.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/changelog/next_release/305.feature.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/changelog/next_release/306.feature.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/changelog/next_release/308.bugfix.rst

This file was deleted.

2 changes: 1 addition & 1 deletion onetl/VERSION
@@ -1 +1 @@
0.11.2
0.12.0
