Releasing 2.7.0
EnricoMi committed May 5, 2023
1 parent 6204c03 commit 8a59a8a
Showing 5 changed files with 17 additions and 17 deletions.
CHANGELOG.md (2 changes: 1 addition & 1 deletion)
@@ -2,7 +2,7 @@
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
-## [UNRELEASED] - YYYY-MM-DD
+## [2.7.0] - 2023-05-05

### Added

README.md (18 changes: 9 additions & 9 deletions)
@@ -64,7 +64,7 @@ The package version has the following semantics: `spark-extension_{SCALA_COMPAT_
Add this line to your `build.sbt` file:

```sbt
-libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.6.0-3.4"
+libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.7.0-3.4"
```

### Maven
@@ -75,7 +75,7 @@ Add this dependency to your `pom.xml` file:
<dependency>
<groupId>uk.co.gresearch.spark</groupId>
<artifactId>spark-extension_2.12</artifactId>
-<version>2.6.0-3.4</version>
+<version>2.7.0-3.4</version>
</dependency>
```

@@ -84,7 +84,7 @@ Add this dependency to your `pom.xml` file:
Submit your Spark app with the Spark Extension dependency (version ≥1.1.0) as follows:

```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.3 [jar]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3 [jar]
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.
@@ -94,7 +94,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
Launch a Spark Shell with the Spark Extension dependency (version ≥1.1.0) as follows:

```shell script
-spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.4
+spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your Spark Shell version.
@@ -110,7 +110,7 @@ from pyspark.sql import SparkSession

spark = SparkSession \
.builder \
.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.4") \
.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4") \
.getOrCreate()
```

@@ -121,7 +121,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depe
Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:

```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.4
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your PySpark version.
@@ -131,7 +131,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depe
Run your Python scripts that use PySpark via `spark-submit`:

```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.4 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4 [script.py]
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your Spark version.
@@ -145,7 +145,7 @@ Running your Python application on a Spark cluster will still require one of the
to add the Scala package to the Spark environment.

```shell script
-pip install pyspark-extension==2.6.0.3.4
+pip install pyspark-extension==2.7.0.3.4
```

Note: Pick the right Spark version (here 3.4) depending on your PySpark version.
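
If you are not sure which Spark version your environment ships, a quick check (a minimal sketch, assuming PySpark is already installed in the current Python environment) is:

```python
# Print the installed PySpark version; e.g. '3.4.0' means the ...-3.4 package is the right pick
import pyspark

print(pyspark.__version__)
```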
@@ -155,7 +155,7 @@ Note: Pick the right Spark version (here 3.4) depending on your PySpark version.
There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
add **a jar dependency** to your notebook using these **Maven coordinates**:

-uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.4
+uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4

Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
on a filesystem where it is accessible by the notebook, and reference that jar file directly.
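
For a notebook that creates its own SparkSession, one way to wire the jar in is via the standard Spark configuration properties (a minimal sketch; the local path and jar file name are hypothetical placeholders):

```python
from pyspark.sql import SparkSession

# Either let Spark resolve the Maven coordinates at startup ...
spark = SparkSession \
    .builder \
    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.4") \
    .getOrCreate()

# ... or reference a jar downloaded beforehand (path is a placeholder):
# .config("spark.jars", "/path/to/spark-extension_2.12-2.7.0-3.4.jar")
```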
pom.xml (2 changes: 1 addition & 1 deletion)
@@ -2,7 +2,7 @@
<modelVersion>4.0.0</modelVersion>
<groupId>uk.co.gresearch.spark</groupId>
<artifactId>spark-extension_2.13</artifactId>
-<version>2.7.0-3.4-SNAPSHOT</version>
+<version>2.7.0-3.4</version>
<name>Spark Extension</name>
<description>A library that provides useful extensions to Apache Spark.</description>
<inceptionYear>2020</inceptionYear>
python/README.md (10 changes: 5 additions & 5 deletions)
@@ -26,7 +26,7 @@ Running your Python application on a Spark cluster will still require one of the
to add the Scala package to the Spark environment.

```shell script
-pip install pyspark-extension==2.6.0.3.3
+pip install pyspark-extension==2.7.0.3.3
```

Note: Pick the right Spark version (here 3.3) depending on your PySpark version.
@@ -40,7 +40,7 @@ from pyspark.sql import SparkSession

spark = SparkSession \
.builder \
.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.3") \
.config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3") \
.getOrCreate()
```

@@ -51,7 +51,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:

```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.3
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your PySpark version.
@@ -61,7 +61,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
Run your Python scripts that use PySpark via `spark-submit`:

```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.3 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3 [script.py]
```

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.
@@ -71,7 +71,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depe
There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
add **a jar dependency** to your notebook using these **Maven coordinates**:

-uk.co.gresearch.spark:spark-extension_2.12:2.6.0-3.3
+uk.co.gresearch.spark:spark-extension_2.12:2.7.0-3.3

Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
on a filesystem where it is accessible by the notebook, and reference that jar file directly.
python/setup.py (2 changes: 1 addition & 1 deletion)
@@ -17,7 +17,7 @@
from pathlib import Path
from setuptools import setup

-jar_version = '2.7.0-3.4-SNAPSHOT'
+jar_version = '2.7.0-3.4'
scala_version = '2.13.8'
scala_compat_version = '.'.join(scala_version.split('.')[:2])
spark_compat_version = jar_version.split('-')[1]
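
For reference, a small sketch of what those expressions evaluate to with the released version string (same logic as the snippet above):

```python
# Reproduces the version derivations from setup.py for the 2.7.0 release
jar_version = '2.7.0-3.4'
scala_version = '2.13.8'

scala_compat_version = '.'.join(scala_version.split('.')[:2])  # '2.13'
spark_compat_version = jar_version.split('-')[1]               # '3.4'
```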