diff --git a/doc/0_quick_start.md b/doc/0_quick_start.md
index 8172568b6..97b9a5940 100644
--- a/doc/0_quick_start.md
+++ b/doc/0_quick_start.md
@@ -15,14 +15,14 @@ Configure a new Scala project with the Apache Spark and dependency.
 
 The dependencies are easily retrieved via Maven Central
 
-    libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.5.0"
+    libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.5.1"
 
 The spark-packages libraries can also be used with spark-submit and spark shell, these
 commands will place the connector and all of its dependencies on the path of the
 Spark Driver and all Spark Executors.
 
-    $SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.0
-    $SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.0
+    $SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.1
+    $SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.1
 
 For the list of available versions, see:
 - https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.12/
@@ -42,7 +42,7 @@ and *all* of its dependencies on the Spark Class Path
 To configure the default Spark Configuration pass key value pairs with `--conf`
 
     $SPARK_HOME/bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 \
-                --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.0
+                --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.1
                 --conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions
 
 This command would set the Spark Cassandra Connector parameter
diff --git a/doc/13_spark_shell.md b/doc/13_spark_shell.md
index e7962306c..f4b23df69 100644
--- a/doc/13_spark_shell.md
+++ b/doc/13_spark_shell.md
@@ -18,7 +18,7 @@ Find additional versions at [Spark Packages](https://repo1.maven.org/maven2/com/
 
 ```bash
 cd spark/install/dir #Include the --master if you want to run against a spark cluster and not local mode
-./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.0 --conf spark.cassandra.connection.host=yourCassandraClusterIp
+./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.1 --conf spark.cassandra.connection.host=yourCassandraClusterIp
 ```
 
 By default spark will log everything to the console and this may be a bit of an overload. To change this copy and modify the `log4j.properties` template file
diff --git a/doc/15_python.md b/doc/15_python.md
index 7abf97e51..d6d8c5f26 100644
--- a/doc/15_python.md
+++ b/doc/15_python.md
@@ -14,7 +14,7 @@ shell similarly to how the spark shell is started. The preferred method is now t
 
 ```bash
 ./bin/pyspark \
-  --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.0 \
+  --packages com.datastax.spark:spark-cassandra-connector_2.12:3.5.1 \
   --conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions
 ```
 
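Every hunk in this change touches the same `--packages` Maven coordinate, so a small sketch of how that coordinate is assembled may help reviewers spot a mismatched piece (wrong Scala binary suffix, stale version). This is illustrative only and not part of the patched files; the group, artifact, and version values are taken directly from the diff above.

```python
# Illustrative sketch: the Maven coordinate bumped throughout this change.
GROUP = "com.datastax.spark"
ARTIFACT = "spark-cassandra-connector_2.12"  # "_2.12" is the Scala binary version
VERSION = "3.5.1"                            # version the docs are bumped to

# --packages takes group:artifact:version, colon-separated
coordinate = f"{GROUP}:{ARTIFACT}:{VERSION}"

# The same coordinate is passed to spark-shell, spark-submit, and pyspark
args = ["--packages", coordinate]
print(coordinate)
```

Each occurrence in the diff should match this string exactly; a leftover `3.5.0` anywhere in the docs would indicate a missed spot.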