feat(spark): update to 3.2.1 (#474)
amirhalatzi authored Feb 25, 2022
1 parent 5522f3c commit e54a891
Showing 5 changed files with 8 additions and 8 deletions.
2 changes: 1 addition & 1 deletion .travis.yml
@@ -28,7 +28,7 @@ env:
- SPARK2_HADOOP_VERSION=2.9.2
- SPARK2_HIVE_VERSION=2.3.3
- SPARK2_VERSION=2.4.6
- - SPARK_VERSION=3.2.0
+ - SPARK_VERSION=3.2.1
- HIVE_VERSION=2.3.7
- HUDI_VERSION=0.10.0
- TARGET_CACHE=$HOME/target-cache/${TRAVIS_COMMIT}
4 changes: 2 additions & 2 deletions README.md
@@ -225,7 +225,7 @@ Kafka output allows writing batch operations to kafka

We use spark-sql-kafka-0-10 as a provided jar - spark-submit command should look like so:

- ```spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.0 --class com.yotpo.metorikku.Metorikku metorikku.jar```
+ ```spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.1 --class com.yotpo.metorikku.Metorikku metorikku.jar```

##### Mandatory parameters:
* **topic** - defines the topic in kafka which the data will be written to.
@@ -288,7 +288,7 @@ This will commit the offsets to kafka, as a new dummy consumer group.
* we use ABRiS as a provided jar In order to deserialize your kafka stream messages (https://github.com/AbsaOSS/ABRiS), add the ```schemaRegistryUrl``` option to the kafka input config
spark-submit command should look like so:

- ```spark-submit --repositories http://packages.confluent.io/maven/ --jars https://repo1.maven.org/maven2/za/co/absa/abris_2.12/3.2.0/abris_2.12-3.2.0.jar --packages org.apache.spark:spark-avro_2.12:3.2.0,org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.0,io.confluent:kafka-schema-registry-client:5.3.0,io.confluent:kafka-avro-serializer:5.3.0 --class com.yotpo.metorikku.Metorikku metorikku.jar```
+ ```spark-submit --repositories http://packages.confluent.io/maven/ --jars https://repo1.maven.org/maven2/za/co/absa/abris_2.12/3.2.1/abris_2.12-3.2.1.jar --packages org.apache.spark:spark-avro_2.12:3.2.0,org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.0,io.confluent:kafka-schema-registry-client:5.3.0,io.confluent:kafka-avro-serializer:5.3.0 --class com.yotpo.metorikku.Metorikku metorikku.jar```

* If your subject schema name is not ```<TOPIC NAME>-value``` (e.g. if the topic is a regex pattern) you can specify the schema subject in the ```schemaSubject``` section

4 changes: 2 additions & 2 deletions build.sbt
@@ -16,7 +16,7 @@ scalaVersion := Option(System.getenv("SCALA_VERSION")).getOrElse("2.12.15")

val sparkVersion: Def.Initialize[String] = Def.setting {
CrossVersion.partialVersion(scalaVersion.value) match {
- case Some((2, scalaMajor)) if scalaMajor >= 12 => Option(System.getenv("SPARK_VERSION")).getOrElse("3.2.0")
+ case Some((2, scalaMajor)) if scalaMajor >= 12 => Option(System.getenv("SPARK_VERSION")).getOrElse("3.2.1")
case _ => Option(System.getenv("SPARK2_VERSION")).getOrElse("2.4.6")
}
}
@@ -44,7 +44,7 @@ val parquetVersion: Def.Initialize[String] = Def.setting {

val deequVersion: Def.Initialize[String] = Def.setting {
CrossVersion.partialVersion(scalaVersion.value) match {
- case Some((2, scalaMajor)) if scalaMajor >= 12 => "2.0.0-spark-3.1"
+ case Some((2, scalaMajor)) if scalaMajor >= 12 => "2.0.1-spark-3.2"
case _ => "1.1.0_spark-2.4-scala-2.11"
}
}
4 changes: 2 additions & 2 deletions docker/spark/k8s/Dockerfile
@@ -1,11 +1,11 @@
- ARG SPARK_VERSION=3.2.0
+ ARG SPARK_VERSION=3.2.1
FROM metorikku/spark:base-${SPARK_VERSION}

ARG AWS_SDK_VERSION=1.11.901
ARG HADOOP_VERSION=3.3.1
ARG HTTPCLIENT_VERSION=4.5.11
ARG SCALA_MAJOR_VERSION=2.12
- ARG SPARK_VERSION=3.2.0
+ ARG SPARK_VERSION=3.2.1
ARG METRICS_FILE=metrics_spark3.properties

USER root
2 changes: 1 addition & 1 deletion examples/udf/build.sbt
@@ -7,7 +7,7 @@ scalaVersion := Option(System.getProperty("scalaVersion")).getOrElse("2.12.15")

val sparkVersion: Def.Initialize[String] = Def.setting {
CrossVersion.partialVersion(scalaVersion.value) match {
- case Some((2, scalaMajor)) if scalaMajor >= 12 => Option(System.getProperty("sparkVersion")).getOrElse("3.2.0")
+ case Some((2, scalaMajor)) if scalaMajor >= 12 => Option(System.getProperty("sparkVersion")).getOrElse("3.2.1")
case _ => "2.4.6"
}
}
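The version-selection pattern repeated across the build files in this commit (pick Spark 3.x for Scala 2.12+, fall back to Spark 2 otherwise, with an environment override) can be sketched as plain Scala. This is an illustrative sketch, not part of Metorikku's build; `selectSparkVersion` is a hypothetical helper, and the defaults mirror the values introduced by this diff:

```scala
// Illustrative sketch of the Spark-version selection used in the build files above.
// `selectSparkVersion` is a hypothetical helper, not part of Metorikku's build.
object SparkVersionSketch {
  def selectSparkVersion(scalaVersion: String, env: String => Option[String]): String =
    scalaVersion.split('.').toList match {
      // Scala 2.12+ builds target Spark 3.x (default bumped to 3.2.1 in this commit)
      case "2" :: minor :: _ if minor.toInt >= 12 =>
        env("SPARK_VERSION").getOrElse("3.2.1")
      // Older Scala lines fall back to the Spark 2 default
      case _ =>
        env("SPARK2_VERSION").getOrElse("2.4.6")
    }

  def main(args: Array[String]): Unit = {
    println(selectSparkVersion("2.12.15", _ => None)) // 3.2.1
    println(selectSparkVersion("2.11.12", _ => None)) // 2.4.6
  }
}
```

In the real build files the same decision is made with sbt's `CrossVersion.partialVersion`, which is why the bump appears in three places: `build.sbt`, `examples/udf/build.sbt`, and the Docker `ARG` defaults must all agree on the default version.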
