Prepare 3.4.1 release
jtgrabowski committed Aug 21, 2023
1 parent 5a25f7f commit 1d51528
Showing 5 changed files with 16 additions and 13 deletions.
3 changes: 3 additions & 0 deletions CHANGES.txt
@@ -1,3 +1,6 @@
+3.4.1
+* Scala 2.13 support (SPARKC-686)
+
3.4.0
* Spark 3.4.x support (SPARKC-702)
* Fix complex field extractor after join on CassandraDirectJoinStrategy (SPARKC-700)
14 changes: 7 additions & 7 deletions README.md
@@ -9,8 +9,8 @@
| What | Where |
| ---------- |---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Community | Chat with us at [Apache Cassandra](https://cassandra.apache.org/_/community.html#discussions) |
-| Scala Docs | Most Recent Release (3.4.0): [Connector API docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.0/connector/com/datastax/spark/connector/index.html), [Connector Driver docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.0/driver/com/datastax/spark/connector/index.html) |
-| Latest Production Release | [3.4.0](https://search.maven.org/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.4.0/jar) |
+| Scala Docs | Most Recent Release (3.4.1): [Connector API docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.1/connector/com/datastax/spark/connector/index.html), [Connector Driver docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.1/driver/com/datastax/spark/connector/index.html) |
+| Latest Production Release | [3.4.1](https://search.maven.org/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.4.1/jar) |

## Features

@@ -53,8 +53,8 @@ Currently, the following branches are actively supported:
2.5.x ([b2.5](https://github.com/datastax/spark-cassandra-connector/tree/b2.5)).

| Connector | Spark | Cassandra | Cassandra Java Driver | Minimum Java Version | Supported Scala Versions |
-|-----------|---------------|-----------------------| --------------------- | -------------------- | ----------------------- |
-| 3.4 | 3.4 | 2.1.5*, 2.2, 3.x, 4.x | 4.13 | 8 | 2.12 |
+|-----------|---------------|-----------------------| --------------------- | -------------------- |--------------------------|
+| 3.4 | 3.4 | 2.1.5*, 2.2, 3.x, 4.x | 4.13 | 8 | 2.12, 2.13 |
| 3.3 | 3.3 | 2.1.5*, 2.2, 3.x, 4.x | 4.13 | 8 | 2.12 |
| 3.2 | 3.2 | 2.1.5*, 2.2, 3.x, 4.0 | 4.13 | 8 | 2.12 |
| 3.1 | 3.1 | 2.1.5*, 2.2, 3.x, 4.0 | 4.12 | 8 | 2.12 |
@@ -77,8 +77,8 @@ Currently, the following branches are actively supported:
## Hosted API Docs
API documentation for the Scala and Java interfaces are available online:

-### 3.4.0
-* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.0/connector/com/datastax/spark/connector/index.html)
+### 3.4.1
+* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.1/connector/com/datastax/spark/connector/index.html)

### 3.3.0
* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.3.0/connector/com/datastax/spark/connector/index.html)
@@ -105,7 +105,7 @@ This project is available on the Maven Central Repository.
For SBT to download the connector binaries, sources and javadoc, put this in your project
SBT config:

-libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.4.0"
+libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.4.1"

* The default Scala version for Spark 3.0+ is 2.12 please choose the appropriate build. See the
[FAQ](doc/FAQ.md) for more information.
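
To illustrate where that coordinate sits in a project, here is a minimal `build.sbt` sketch; apart from the connector line taken from this release, the Scala version and Spark artifact/version are illustrative assumptions, not part of this commit.

```scala
// Minimal build.sbt sketch. Only the spark-cassandra-connector coordinate comes
// from this release; the Scala and Spark versions below are assumptions.
ThisBuild / scalaVersion := "2.12.17"

libraryDependencies ++= Seq(
  // Spark is typically "provided" when the job is launched with spark-submit
  "org.apache.spark" %% "spark-sql" % "3.4.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.4.1"
)
```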
8 changes: 4 additions & 4 deletions doc/0_quick_start.md
@@ -15,14 +15,14 @@ Configure a new Scala project with the Apache Spark and dependency.

The dependencies are easily retrieved via Maven Central

-libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.4.0"
+libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.4.1"

The spark-packages libraries can also be used with spark-submit and spark shell, these
commands will place the connector and all of its dependencies on the path of the
Spark Driver and all Spark Executors.

-$SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0
-$SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0
+$SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1
+$SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1

For the list of available versions, see:
- https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.12/
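
As a quick sanity check once a shell from the commands above is running, a read through the connector's `org.apache.spark.sql.cassandra` format could look like the sketch below; the keyspace and table names are hypothetical, and a Cassandra node reachable at the configured (or default) connection host is assumed.

```scala
// Sketch for a spark-shell session started with --packages as above.
// `spark` is the SparkSession the shell provides; "test"/"kv" are hypothetical names.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "kv"))
  .load()

df.show()
```
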
@@ -42,7 +42,7 @@ and *all* of its dependencies on the Spark Class Path
To configure the default Spark Configuration pass key value pairs with `--conf`

$SPARK_HOME/bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 \
---packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0
+--packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1
--conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions

This command would set the Spark Cassandra Connector parameter
2 changes: 1 addition & 1 deletion doc/13_spark_shell.md
@@ -18,7 +18,7 @@ Find additional versions at [Spark Packages](https://repo1.maven.org/maven2/com/
```bash
cd spark/install/dir
#Include the --master if you want to run against a spark cluster and not local mode
-./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0 --conf spark.cassandra.connection.host=yourCassandraClusterIp
+./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1 --conf spark.cassandra.connection.host=yourCassandraClusterIp
```
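
Inside a shell started as above, the connector's RDD API can also be exercised; the sketch below uses hypothetical keyspace and table names.

```scala
// Sketch for the spark-shell session above: `sc` is the SparkContext the shell
// provides, and "test"/"kv" are hypothetical keyspace/table names.
import com.datastax.spark.connector._

val rdd = sc.cassandraTable("test", "kv")
println(rdd.count())
```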

By default spark will log everything to the console and this may be a bit of an overload. To change this copy and modify the `log4j.properties` template file
2 changes: 1 addition & 1 deletion doc/15_python.md
@@ -14,7 +14,7 @@ shell similarly to how the spark shell is started. The preferred method is now t

```bash
./bin/pyspark \
---packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0 \
+--packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1 \
--conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions
```
