Use --driver-memory instead of --conf spark.driver.memory
According to the Spark documentation, the configuration property `spark.driver.memory` has no effect in our case (we use the "client" deploy mode):

> In client mode, this config must not be set through the SparkConf directly in your application, because the driver JVM has already started at that point. Instead, please set this through the --driver-memory command line option or in your default properties file.

https://spark.apache.org/docs/latest/configuration.html
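As a rough sketch of the change (the memory value and the job script name are only illustrative):

```sh
# Before: ignored in client mode, since the driver JVM has already started
spark-submit --conf spark.driver.memory=4g my_job.py

# After: spark-submit reads the option before launching the driver, so it takes effect
spark-submit --driver-memory 4g my_job.py
```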