
Commit f77c10d

[SPARK-29923][SQL][TESTS] Set io.netty.tryReflectionSetAccessible for Arrow on JDK9+
### What changes were proposed in this pull request?

This PR aims to add `io.netty.tryReflectionSetAccessible=true` to the testing configuration for JDK11, because this is an officially documented requirement of Apache Arrow. The Apache Arrow community documented this requirement at `0.15.0` ([ARROW-6206](apache/arrow#5078)).

> #### For java 9 or later, should set "-Dio.netty.tryReflectionSetAccessible=true".
>
> This fixes `java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available` thrown by netty.

### Why are the changes needed?

After ARROW-3191, the Arrow Java library requires the property `io.netty.tryReflectionSetAccessible` to be set to true for JDK >= 9. After apache#26133, the JDK11 Jenkins jobs seem to fail.

- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/676/
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/677/
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/678/

```scala
Previous exception in task: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
    io.netty.util.internal.PlatformDependent.directBuffer(PlatformDependent.java:473)
    io.netty.buffer.NettyArrowBuf.getDirectBuffer(NettyArrowBuf.java:243)
    io.netty.buffer.NettyArrowBuf.nioBuffer(NettyArrowBuf.java:233)
    io.netty.buffer.ArrowBuf.nioBuffer(ArrowBuf.java:245)
    org.apache.arrow.vector.ipc.message.ArrowRecordBatch.computeBodyLength(ArrowRecordBatch.java:222)
```

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

Pass the Jenkins with JDK11.

Closes apache#26552 from dongjoon-hyun/SPARK-ARROW-JDK11.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
1 parent 1112fc6 commit f77c10d
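For context (not part of this commit), a minimal sketch of how the same property can be forwarded to an application's driver and executor JVMs via `spark-submit`, using the `spark.driver.extraJavaOptions` and `spark.executor.extraJavaOptions` settings this commit adds to the test harnesses; the script name and distribution layout below are placeholders:

```bash
# Sketch only: forward the Netty property to both driver and executor JVMs on JDK 9+,
# mirroring what the test configurations in this commit do.
# "your_arrow_app.py" is a hypothetical placeholder, not part of this commit.
./bin/spark-submit \
  --conf spark.driver.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true \
  --conf spark.executor.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true \
  your_arrow_app.py
```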

File tree: 7 files changed, +10 / -8 lines


R/run-tests.sh (+1, -1)

@@ -23,7 +23,7 @@ FAILED=0
 LOGFILE=$FWDIR/unit-tests.out
 rm -f $LOGFILE
 
-SPARK_TESTING=1 NOT_CRAN=true $FWDIR/../bin/spark-submit --driver-java-options "-Dlog4j.configuration=file:$FWDIR/log4j.properties" --conf spark.hadoop.fs.defaultFS="file:///" $FWDIR/pkg/tests/run-all.R 2>&1 | tee -a $LOGFILE
+SPARK_TESTING=1 NOT_CRAN=true $FWDIR/../bin/spark-submit --driver-java-options "-Dlog4j.configuration=file:$FWDIR/log4j.properties" --conf spark.hadoop.fs.defaultFS="file:///" --conf spark.driver.extraJavaOptions="-Dio.netty.tryReflectionSetAccessible=true" --conf spark.executor.extraJavaOptions="-Dio.netty.tryReflectionSetAccessible=true" $FWDIR/pkg/tests/run-all.R 2>&1 | tee -a $LOGFILE
 FAILED=$((PIPESTATUS[0]||$FAILED))
 
 NUM_TEST_WARNING="$(grep -c -e 'Warnings ----------------' $LOGFILE)"

pom.xml (+2, -2)

@@ -2326,7 +2326,7 @@
           <include>**/*Suite.java</include>
         </includes>
         <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
-        <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize}</argLine>
+        <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize} -Dio.netty.tryReflectionSetAccessible=true</argLine>
         <environmentVariables>
           <!--
             Setting SPARK_DIST_CLASSPATH is a simple way to make sure any child processes
@@ -2376,7 +2376,7 @@
         <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
         <junitxml>.</junitxml>
         <filereports>SparkTestSuite.txt</filereports>
-        <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize}</argLine>
+        <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize} -Dio.netty.tryReflectionSetAccessible=true</argLine>
         <stderr/>
         <environmentVariables>
           <!--

project/SparkBuild.scala (+1)

@@ -978,6 +978,7 @@ object TestSettings {
     javaOptions in Test += "-Dspark.unsafe.exceptionOnMemoryLeak=true",
     javaOptions in Test += "-Dsun.io.serialization.extendedDebugInfo=false",
     javaOptions in Test += "-Dderby.system.durability=test",
+    javaOptions in Test += "-Dio.netty.tryReflectionSetAccessible=true",
     javaOptions in Test ++= System.getProperties.asScala.filter(_._1.startsWith("spark"))
       .map { case (k,v) => s"-D$k=$v" }.toSeq,
     javaOptions in Test += "-ea",

python/run-tests.py (+3, -2)

@@ -86,9 +86,10 @@ def run_individual_python_test(target_dir, test_name, pyspark_python):
     env["TMPDIR"] = tmp_dir
 
     # Also override the JVM's temp directory by setting driver and executor options.
+    java_options = "-Djava.io.tmpdir={0} -Dio.netty.tryReflectionSetAccessible=true".format(tmp_dir)
     spark_args = [
-        "--conf", "spark.driver.extraJavaOptions=-Djava.io.tmpdir={0}".format(tmp_dir),
-        "--conf", "spark.executor.extraJavaOptions=-Djava.io.tmpdir={0}".format(tmp_dir),
+        "--conf", "spark.driver.extraJavaOptions='{0}'".format(java_options),
+        "--conf", "spark.executor.extraJavaOptions='{0}'".format(java_options),
         "pyspark-shell"
     ]
     env["PYSPARK_SUBMIT_ARGS"] = " ".join(spark_args)

sql/catalyst/pom.xml (+1, -1)

@@ -148,7 +148,7 @@
         <groupId>org.scalatest</groupId>
         <artifactId>scalatest-maven-plugin</artifactId>
         <configuration>
-          <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize}</argLine>
+          <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize} -Dio.netty.tryReflectionSetAccessible=true</argLine>
         </configuration>
       </plugin>
       <plugin>

sql/core/pom.xml (+1, -1)

@@ -177,7 +177,7 @@
         <groupId>org.scalatest</groupId>
         <artifactId>scalatest-maven-plugin</artifactId>
         <configuration>
-          <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize}</argLine>
+          <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize} -Dio.netty.tryReflectionSetAccessible=true</argLine>
         </configuration>
       </plugin>
       <plugin>

sql/hive/pom.xml (+1, -1)

@@ -244,7 +244,7 @@
         <artifactId>scalatest-maven-plugin</artifactId>
         <configuration>
           <!-- Specially disable assertions since some Hive tests fail them -->
-          <argLine>-da -Xmx4g -XX:ReservedCodeCacheSize=${CodeCacheSize}</argLine>
+          <argLine>-da -Xmx4g -XX:ReservedCodeCacheSize=${CodeCacheSize} -Dio.netty.tryReflectionSetAccessible=true</argLine>
         </configuration>
       </plugin>
       <plugin>
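For completeness, a hedged sketch of how the updated test JVM options can be exercised locally (assumes a JDK 11 toolchain; the module and suite pattern below are illustrative choices, not prescribed by this commit):

```bash
# sbt route: TestSettings in project/SparkBuild.scala appends the property to
# "javaOptions in Test", so JVMs forked by sbt for tests receive it automatically.
build/sbt "sql/testOnly org.apache.spark.sql.execution.arrow.*"

# Maven route: the scalatest-maven-plugin <argLine> entries above carry the flag
# into the forked test JVMs for the corresponding modules.
build/mvn -pl sql/core -am test
```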
