
Commit
Use --driver-memory instead of --conf spark.driver.memory
According to Spark documentation, the configuration property `spark.driver.memory` has no effect in our case (we use the “client” deploy-mode):

>  In client mode, this config must not be set through the SparkConf directly in your application, because the driver JVM has already started at that point. Instead, please set this through the --driver-memory command line option or in your default properties file.

https://spark.apache.org/docs/latest/configuration.html
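Per the linked documentation, in client deploy mode the driver heap must be sized before the driver JVM starts, which leaves two working options: the `--driver-memory` command-line flag or the default properties file. A minimal sketch (the 4G value matches this commit; the jar path is taken from the scripts below, and the `spark-defaults.conf` location assumes a standard Spark install):

```shell
# Option 1: set driver memory on the spark-submit command line.
# This works in client mode because spark-submit reads the flag
# before launching the driver JVM.
spark-submit \
  --class com.scylladb.migrator.Migrator \
  --driver-memory 4G \
  /home/ubuntu/scylla-migrator/scylla-migrator-assembly.jar

# Option 2: set it in the default properties file instead
# (typically $SPARK_HOME/conf/spark-defaults.conf):
#   spark.driver.memory    4G
```

By contrast, `--conf spark.driver.memory=4G` is silently ignored in client mode, since the driver JVM is already running with its default heap by the time that configuration is applied.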
julienrf committed Aug 27, 2024
1 parent e01c75b commit 84f8e9b
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion ansible/templates/submit-alternator-job.sh
@@ -12,5 +12,5 @@ time spark-submit --class com.scylladb.migrator.Migrator \
   --conf spark.scylla.config=/home/ubuntu/scylla-migrator/config.dynamodb.yml \
   --executor-memory $EXECUTOR_MEMORY \
   --executor-cores $EXECUTOR_CORES \
-  --conf spark.driver.memory=64G \
+  --driver-memory 4G \
   /home/ubuntu/scylla-migrator/scylla-migrator-assembly.jar
2 changes: 1 addition & 1 deletion ansible/templates/submit-cql-job.sh
@@ -19,7 +19,7 @@ time spark-submit --class com.scylladb.migrator.Migrator \
 
 #sometimes you will need a tuning for driver memory size
 #add this config to above to tune it:
-# --conf spark.driver.memory=4G \
+# --driver-memory 4G \
 
 # debug example
 #$SPARK_HOME/spark-submit --class com.scylladb.migrator.Migrator \
