This repository was archived by the owner on Jan 9, 2020. It is now read-only.
Add quotes around $SPARK_CLASSPATH in Dockerfile java commands #541
Adding quotes around $SPARK_CLASSPATH in the java invocations in the Dockerfiles prevents the shell from expanding wildcard paths when the classpath is a single glob value such as /opt/spark/jars/*.
What changes were proposed in this pull request?
Each place where $SPARK_CLASSPATH is used as the value of the -cp flag to ${JAVA_HOME}/bin/java in the Dockerfiles under resource-managers/kubernetes/docker-minimal-bundle/src/main/docker has been changed to wrap $SPARK_CLASSPATH in double quotes.
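The quoting difference can be sketched in a few lines of shell. This is a minimal illustration, not the Dockerfile itself: the directory and jar names below are hypothetical stand-ins for /opt/spark/jars.

```shell
# Set up a hypothetical jars directory (stand-in for /opt/spark/jars)
mkdir -p /tmp/demo-jars
touch /tmp/demo-jars/a.jar /tmp/demo-jars/b.jar
SPARK_CLASSPATH='/tmp/demo-jars/*'

# Unquoted: the shell expands the glob into multiple words before java ever runs
unquoted=$(echo $SPARK_CLASSPATH)

# Quoted: the wildcard is passed through literally; java's own classpath
# wildcard handling then matches the jars itself
quoted=$(echo "$SPARK_CLASSPATH")

echo "$unquoted"   # /tmp/demo-jars/a.jar /tmp/demo-jars/b.jar
echo "$quoted"     # /tmp/demo-jars/*
```

With the unquoted form, everything after the first expanded word is no longer part of the -cp argument, which is exactly the failure described below.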
How was this patch tested?
Manually. Without the change, the standard instructions for running a Python example fail when the unnecessary --jars flag is omitted. With the change, they work correctly.
```
bin/spark-submit \
  --deploy-mode cluster \
  --master k8s://https://: \
  --kubernetes-namespace \
  --conf spark.executor.instances=5 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver-py:v2.2.0-kubernetes-0.5.0 \
  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor-py:v2.2.0-kubernetes-0.5.0 \
  local:///opt/spark/examples/src/main/python/pi.py 10
```
Error from the Spark driver (the jar named could be any jar; it depends on the shell's expansion order). The shell expands the wildcard, only the first jar remains the -cp value, and java treats the next expanded path as the main class name, rendering its slashes as dots:

Error: Could not find or load main class .opt.spark.jars.RoaringBitmap-0.5.11.jar
The other images were not tested explicitly, but the failure mode is the same.