[SPARK-14280][BUILD][WIP] Update change-version.sh and pom.xml to add Scala 2.12 profiles and enable 2.12 compilation build; fix some things that will be warnings or errors in 2.12; restore Scala 2.12 profile infrastructure

## What changes were proposed in this pull request?

This change adds back the infrastructure for a Scala 2.12 build, but does not enable it in the release or Python test scripts.

In order to make that meaningful, it also resolves compile errors that the code hits in 2.12 only, in a way that still works with 2.11.
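
To give one concrete flavor of such an error: `TaskContext.addTaskCompletionListener` is overloaded to take either a `TaskCompletionListener` or a function, and under 2.12's new SAM conversion a lambda argument can match both overloads and fail as ambiguous. The call sites in the test diffs below therefore spell out the anonymous class, which compiles the same way under 2.11 and 2.12. A minimal standalone sketch of the pattern (hypothetical names, not Spark's API):

```scala
// A SAM trait and an overload pair shaped like TaskContext's listener methods.
trait Listener { def onDone(x: Int): Unit }

object Ctx {
  def addListener(l: Listener): Unit = l.onDone(0)
  def addListener(f: Int => Unit): Unit = f(0)
}

object SamDemo extends App {
  // Ctx.addListener(x => println(x)) // under 2.12 this can be ambiguous:
  //                                  // the lambda converts to both overloads
  Ctx.addListener(new Listener {      // explicit anonymous class: fine on both
    override def onDone(x: Int): Unit = println(x)
  })
}
```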

It also updates each dependency whose current version does not yet support Scala 2.12, moving it to the earliest minor release that does. This is in a sense covered by other JIRAs under the main umbrella, but it is implemented here. The versions below still work with 2.11, and are the _latest_ maintenance release in the _earliest_ viable minor release:

- Scalatest 2.x -> 3.0.3
- Chill 0.8.0 -> 0.8.4
- Clapper 1.0.x -> 1.1.2
- json4s 3.2.x -> 3.4.2
- Jackson 2.6.x -> 2.7.9 (required by json4s)

This change does _not_ fully enable a Scala 2.12 build:

- It will also require dropping support for Kafka before 0.10. That is easy enough; it just isn't done here yet.
- It will require recreating `SparkILoop` and `Main` for the 2.12 REPL, which is SPARK-14650. That could possibly be done here too.

What it does do is make changes that resolve much of the remaining gap without affecting the current 2.11 build.

## How was this patch tested?

Existing tests and build. Manually tested with `./dev/change-scala-version.sh 2.12` to verify it compiles, modulo the exceptions above.
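
For reference, the manual check looks roughly like the following; the `scala-2.12` Maven profile name is an assumption based on the profiles this change restores in the POMs:

```sh
# Rewrite the Scala version in all POMs, then compile against 2.12.
./dev/change-scala-version.sh 2.12
./build/mvn -DskipTests -Pscala-2.12 clean compile

# Switch back to the default 2.11 build when done.
./dev/change-scala-version.sh 2.11
```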

Author: Sean Owen <[email protected]>

Closes apache#18645 from srowen/SPARK-14280.
srowen committed Sep 1, 2017
1 parent 12f0d24 commit 12ab7f7
Showing 53 changed files with 184 additions and 123 deletions.
22 changes: 11 additions & 11 deletions bin/load-spark-env.cmd
```diff
@@ -35,21 +35,21 @@ if [%SPARK_ENV_LOADED%] == [] (

 rem Setting SPARK_SCALA_VERSION if not already set.

-rem set ASSEMBLY_DIR2="%SPARK_HOME%\assembly\target\scala-2.11"
-rem set ASSEMBLY_DIR1="%SPARK_HOME%\assembly\target\scala-2.12"
+set ASSEMBLY_DIR2="%SPARK_HOME%\assembly\target\scala-2.11"
+set ASSEMBLY_DIR1="%SPARK_HOME%\assembly\target\scala-2.12"

 if [%SPARK_SCALA_VERSION%] == [] (

-  rem if exist %ASSEMBLY_DIR2% if exist %ASSEMBLY_DIR1% (
-  rem   echo "Presence of build for multiple Scala versions detected."
-  rem   echo "Either clean one of them or, set SPARK_SCALA_VERSION=2.11 in spark-env.cmd."
-  rem   exit 1
-  rem )
-  rem if exist %ASSEMBLY_DIR2% (
+  if exist %ASSEMBLY_DIR2% if exist %ASSEMBLY_DIR1% (
+    echo "Presence of build for multiple Scala versions detected."
+    echo "Either clean one of them or, set SPARK_SCALA_VERSION in spark-env.cmd."
+    exit 1
+  )
+  if exist %ASSEMBLY_DIR2% (
     set SPARK_SCALA_VERSION=2.11
-  rem ) else (
-  rem   set SPARK_SCALA_VERSION=2.12
-  rem )
+  ) else (
+    set SPARK_SCALA_VERSION=2.12
+  )
 )
 exit /b 0
```
22 changes: 11 additions & 11 deletions bin/load-spark-env.sh
```diff
@@ -46,18 +46,18 @@ fi

 if [ -z "$SPARK_SCALA_VERSION" ]; then

-  #ASSEMBLY_DIR2="${SPARK_HOME}/assembly/target/scala-2.11"
-  #ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-2.12"
+  ASSEMBLY_DIR2="${SPARK_HOME}/assembly/target/scala-2.11"
+  ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-2.12"

-  #if [[ -d "$ASSEMBLY_DIR2" && -d "$ASSEMBLY_DIR1" ]]; then
-  #  echo -e "Presence of build for multiple Scala versions detected." 1>&2
-  #  echo -e 'Either clean one of them or, export SPARK_SCALA_VERSION=2.11 in spark-env.sh.' 1>&2
-  #  exit 1
-  #fi
+  if [[ -d "$ASSEMBLY_DIR2" && -d "$ASSEMBLY_DIR1" ]]; then
+    echo -e "Presence of build for multiple Scala versions detected." 1>&2
+    echo -e 'Either clean one of them or, export SPARK_SCALA_VERSION in spark-env.sh.' 1>&2
+    exit 1
+  fi

-  #if [ -d "$ASSEMBLY_DIR2" ]; then
+  if [ -d "$ASSEMBLY_DIR2" ]; then
     export SPARK_SCALA_VERSION="2.11"
-  #else
-  #  export SPARK_SCALA_VERSION="2.12"
-  #fi
+  else
+    export SPARK_SCALA_VERSION="2.12"
+  fi
 fi
```
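
With both branches live again, the script refuses to guess when assemblies for both Scala versions are present. As its error message says, the fix is to pick one explicitly, for example in `conf/spark-env.sh`:

```sh
# Choose the Scala build explicitly when both assembly directories exist.
export SPARK_SCALA_VERSION=2.12
```
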
8 changes: 8 additions & 0 deletions core/src/main/scala/org/apache/spark/FutureAction.scala
```diff
@@ -89,6 +89,14 @@ trait FutureAction[T] extends Future[T] {
    */
   override def value: Option[Try[T]]

+  // These two methods must be implemented in Scala 2.12, but won't be used by Spark
+
+  def transform[S](f: (Try[T]) => Try[S])(implicit executor: ExecutionContext): Future[S] =
+    throw new UnsupportedOperationException()
+
+  def transformWith[S](f: (Try[T]) => Future[S])(implicit executor: ExecutionContext): Future[S] =
+    throw new UnsupportedOperationException()
+
   /**
    * Blocks and returns the result of this job.
    */
```
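
Background on this stub: Scala 2.12 adds `transform` and `transformWith` as abstract members of `Future`, so a concrete implementation like `FutureAction` has to define them even though Spark itself never calls them, per the comment above. A quick sketch of what they do on a plain 2.12 `Future` (ordinary Scala, nothing Spark-specific):

```scala
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
import scala.util.{Failure, Success}

object TransformDemo extends App {
  // transform rewrites the completed Try in one step; here any failure
  // is mapped to a fallback value.
  val recovered: Future[Int] =
    Future[Int] { throw new RuntimeException("boom") }.transform {
      case Success(v) => Success(v)
      case Failure(_) => Success(-1)
    }
  println(Await.result(recovered, 10.seconds)) // prints -1
}
```
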
6 changes: 3 additions & 3 deletions core/src/main/scala/org/apache/spark/api/java/JavaUtils.scala
```diff
@@ -56,9 +56,9 @@ private[spark] object JavaUtils {
     val ui = underlying.iterator
     var prev : Option[A] = None

-    def hasNext: Boolean = ui.hasNext
+    override def hasNext: Boolean = ui.hasNext

-    def next(): Entry[A, B] = {
+    override def next(): Entry[A, B] = {
       val (k, v) = ui.next()
       prev = Some(k)
       new ju.Map.Entry[A, B] {
@@ -74,7 +74,7 @@ private[spark] object JavaUtils {
         }
       }

-      def remove() {
+      override def remove() {
         prev match {
           case Some(k) =>
             underlying match {
```
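
The added `override` modifiers here are presumably for Java 8 semantics: under 2.12, `java.util.Iterator.remove()` is an inherited concrete (default) method, so implementing it requires `override`, while under 2.11 the keyword is simply permitted. A standalone sketch of the same situation (hypothetical class, not Spark code):

```scala
import java.{util => ju}

// Minimal ju.Iterator implementation. In 2.12 (Java 8 bytecode), remove()
// inherits a default body, making `override` mandatory; in 2.11 it is
// optional. Writing it everywhere satisfies both compilers.
class CountingIterator(limit: Int) extends ju.Iterator[Int] {
  private var i = 0
  override def hasNext: Boolean = i < limit
  override def next(): Int = { i += 1; i }
  override def remove(): Unit = throw new UnsupportedOperationException
}
```
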
```diff
@@ -33,7 +33,7 @@ import org.mockito.Mockito._
 import org.mockito.invocation.InvocationOnMock
 import org.mockito.stubbing.Answer
 import org.scalatest.Matchers
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark.SparkFunSuite
 import org.apache.spark.internal.Logging
```
```diff
@@ -39,7 +39,7 @@ import org.openqa.selenium.WebDriver
 import org.openqa.selenium.htmlunit.HtmlUnitDriver
 import org.scalatest.{BeforeAndAfter, Matchers}
 import org.scalatest.concurrent.Eventually
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar
 import org.scalatest.selenium.WebBrowser

 import org.apache.spark._
```
```diff
@@ -33,7 +33,7 @@ import org.mockito.Mockito.{inOrder, verify, when}
 import org.mockito.invocation.InvocationOnMock
 import org.mockito.stubbing.Answer
 import org.scalatest.concurrent.Eventually
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark._
 import org.apache.spark.TaskState.TaskState
```
```diff
@@ -28,8 +28,8 @@ import scala.util.{Failure, Success, Try}

 import com.google.common.io.CharStreams
 import org.mockito.Mockito._
-import org.scalatest.ShouldMatchers
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.Matchers
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark.{SecurityManager, SparkConf, SparkFunSuite}
 import org.apache.spark.network.{BlockDataManager, BlockTransferService}
@@ -38,7 +38,7 @@ import org.apache.spark.network.shuffle.BlockFetchingListener
 import org.apache.spark.storage.{BlockId, ShuffleBlockId}
 import org.apache.spark.util.ThreadUtils

-class NettyBlockTransferSecuritySuite extends SparkFunSuite with MockitoSugar with ShouldMatchers {
+class NettyBlockTransferSecuritySuite extends SparkFunSuite with MockitoSugar with Matchers {
   test("security default off") {
     val conf = new SparkConf()
       .set("spark.app.id", "app-id")
```
```diff
@@ -28,7 +28,7 @@ import org.apache.spark.network.BlockDataManager
 class NettyBlockTransferServiceSuite
   extends SparkFunSuite
   with BeforeAndAfterEach
-  with ShouldMatchers {
+  with Matchers {

   private var service0: NettyBlockTransferService = _
   private var service1: NettyBlockTransferService = _
```
```diff
@@ -17,7 +17,7 @@

 package org.apache.spark.rpc.netty

-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark._
 import org.apache.spark.network.client.TransportClient
```
```diff
@@ -22,7 +22,7 @@ import org.mockito.Mockito.{never, verify, when}
 import org.mockito.invocation.InvocationOnMock
 import org.mockito.stubbing.Answer
 import org.scalatest.BeforeAndAfterEach
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark._
 import org.apache.spark.internal.config
```
```diff
@@ -54,7 +54,10 @@ class TaskContextSuite extends SparkFunSuite with BeforeAndAfter with LocalSpark
     val rdd = new RDD[String](sc, List()) {
       override def getPartitions = Array[Partition](StubPartition(0))
       override def compute(split: Partition, context: TaskContext) = {
-        context.addTaskCompletionListener(context => TaskContextSuite.completed = true)
+        context.addTaskCompletionListener(new TaskCompletionListener {
+          override def onTaskCompletion(context: TaskContext): Unit =
+            TaskContextSuite.completed = true
+        })
         sys.error("failed")
       }
     }
@@ -95,9 +98,13 @@ class TaskContextSuite extends SparkFunSuite with BeforeAndAfter with LocalSpark
   test("all TaskCompletionListeners should be called even if some fail") {
     val context = TaskContext.empty()
     val listener = mock(classOf[TaskCompletionListener])
-    context.addTaskCompletionListener(_ => throw new Exception("blah"))
+    context.addTaskCompletionListener(new TaskCompletionListener {
+      override def onTaskCompletion(context: TaskContext): Unit = throw new Exception("blah")
+    })
     context.addTaskCompletionListener(listener)
-    context.addTaskCompletionListener(_ => throw new Exception("blah"))
+    context.addTaskCompletionListener(new TaskCompletionListener {
+      override def onTaskCompletion(context: TaskContext): Unit = throw new Exception("blah")
+    })

     intercept[TaskCompletionListenerException] {
       context.markTaskCompleted(None)
@@ -109,9 +116,15 @@ class TaskContextSuite extends SparkFunSuite with BeforeAndAfter with LocalSpark
   test("all TaskFailureListeners should be called even if some fail") {
     val context = TaskContext.empty()
     val listener = mock(classOf[TaskFailureListener])
-    context.addTaskFailureListener((_, _) => throw new Exception("exception in listener1"))
+    context.addTaskFailureListener(new TaskFailureListener {
+      override def onTaskFailure(context: TaskContext, error: Throwable): Unit =
+        throw new Exception("exception in listener1")
+    })
     context.addTaskFailureListener(listener)
-    context.addTaskFailureListener((_, _) => throw new Exception("exception in listener3"))
+    context.addTaskFailureListener(new TaskFailureListener {
+      override def onTaskFailure(context: TaskContext, error: Throwable): Unit =
+        throw new Exception("exception in listener3")
+    })

     val e = intercept[TaskCompletionListenerException] {
       context.markTaskFailed(new Exception("exception in task"))
@@ -232,7 +245,10 @@ class TaskContextSuite extends SparkFunSuite with BeforeAndAfter with LocalSpark
     var invocations = 0
     val context = TaskContext.empty()
     context.markTaskCompleted(None)
-    context.addTaskCompletionListener(_ => invocations += 1)
+    context.addTaskCompletionListener(new TaskCompletionListener {
+      override def onTaskCompletion(context: TaskContext): Unit =
+        invocations += 1
+    })
     assert(invocations == 1)
     context.markTaskCompleted(None)
     assert(invocations == 1)
@@ -244,10 +260,12 @@ class TaskContextSuite extends SparkFunSuite with BeforeAndAfter with LocalSpark
     val error = new RuntimeException
     val context = TaskContext.empty()
     context.markTaskFailed(error)
-    context.addTaskFailureListener { (_, e) =>
-      lastError = e
-      invocations += 1
-    }
+    context.addTaskFailureListener(new TaskFailureListener {
+      override def onTaskFailure(context: TaskContext, e: Throwable): Unit = {
+        lastError = e
+        invocations += 1
+      }
+    })
     assert(lastError == error)
     assert(invocations == 1)
     context.markTaskFailed(error)
@@ -267,9 +285,15 @@ class TaskContextSuite extends SparkFunSuite with BeforeAndAfter with LocalSpark
   test("all TaskCompletionListeners should be called even if some fail or a task") {
     val context = TaskContext.empty()
     val listener = mock(classOf[TaskCompletionListener])
-    context.addTaskCompletionListener(_ => throw new Exception("exception in listener1"))
+    context.addTaskCompletionListener(new TaskCompletionListener {
+      override def onTaskCompletion(context: TaskContext): Unit =
+        throw new Exception("exception in listener1")
+    })
     context.addTaskCompletionListener(listener)
-    context.addTaskCompletionListener(_ => throw new Exception("exception in listener3"))
+    context.addTaskCompletionListener(new TaskCompletionListener {
+      override def onTaskCompletion(context: TaskContext): Unit =
+        throw new Exception("exception in listener3")
+    })

     val e = intercept[TaskCompletionListenerException] {
       context.markTaskCompleted(Some(new Exception("exception in task")))
```
```diff
@@ -24,7 +24,7 @@ import scala.collection.mutable.HashMap
 import org.mockito.Matchers.{anyInt, anyObject, anyString, eq => meq}
 import org.mockito.Mockito.{atLeast, atMost, never, spy, times, verify, when}
 import org.scalatest.BeforeAndAfterEach
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark._
 import org.apache.spark.internal.Logging
```
```diff
@@ -19,7 +19,7 @@ package org.apache.spark.storage

 import org.mockito.Matchers
 import org.mockito.Mockito._
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark.SparkFunSuite
 import org.apache.spark.memory.MemoryMode.ON_HEAP
```
2 changes: 2 additions & 0 deletions dev/create-release/release-build.sh
```diff
@@ -349,6 +349,8 @@ if [[ "$1" == "publish-release" ]]; then
   # Clean-up Zinc nailgun process
   /usr/sbin/lsof -P |grep $ZINC_PORT | grep LISTEN | awk '{ print $2; }' | xargs kill

+  #./dev/change-scala-version.sh 2.11
+
   pushd $tmp_repo/org/apache/spark

   # Remove any extra files generated during install
```
6 changes: 3 additions & 3 deletions dev/deps/spark-deps-hadoop-2.6
```diff
@@ -28,8 +28,8 @@ breeze_2.11-0.13.2.jar
 calcite-avatica-1.2.0-incubating.jar
 calcite-core-1.2.0-incubating.jar
 calcite-linq4j-1.2.0-incubating.jar
-chill-java-0.8.0.jar
-chill_2.11-0.8.0.jar
+chill-java-0.8.4.jar
+chill_2.11-0.8.4.jar
 commons-beanutils-1.7.0.jar
 commons-beanutils-core-1.8.0.jar
 commons-cli-1.2.jar
@@ -168,7 +168,7 @@ scala-compiler-2.11.8.jar
 scala-library-2.11.8.jar
 scala-parser-combinators_2.11-1.0.4.jar
 scala-reflect-2.11.8.jar
-scala-xml_2.11-1.0.2.jar
+scala-xml_2.11-1.0.5.jar
 scalap-2.11.8.jar
 shapeless_2.11-2.3.2.jar
 slf4j-api-1.7.16.jar
```
6 changes: 3 additions & 3 deletions dev/deps/spark-deps-hadoop-2.7
```diff
@@ -28,8 +28,8 @@ breeze_2.11-0.13.2.jar
 calcite-avatica-1.2.0-incubating.jar
 calcite-core-1.2.0-incubating.jar
 calcite-linq4j-1.2.0-incubating.jar
-chill-java-0.8.0.jar
-chill_2.11-0.8.0.jar
+chill-java-0.8.4.jar
+chill_2.11-0.8.4.jar
 commons-beanutils-1.7.0.jar
 commons-beanutils-core-1.8.0.jar
 commons-cli-1.2.jar
@@ -169,7 +169,7 @@ scala-compiler-2.11.8.jar
 scala-library-2.11.8.jar
 scala-parser-combinators_2.11-1.0.4.jar
 scala-reflect-2.11.8.jar
-scala-xml_2.11-1.0.2.jar
+scala-xml_2.11-1.0.5.jar
 scalap-2.11.8.jar
 shapeless_2.11-2.3.2.jar
 slf4j-api-1.7.16.jar
```
```diff
@@ -30,7 +30,7 @@ import org.mockito.invocation.InvocationOnMock
 import org.mockito.stubbing.Answer
 import org.scalatest.{BeforeAndAfterEach, PrivateMethodTester}
 import org.scalatest.concurrent.Eventually
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark.streaming.{Duration, TestSuiteBase}
 import org.apache.spark.util.ManualClock
```
```diff
@@ -17,13 +17,10 @@

 package org.apache.spark.streaming.kinesis

-import java.lang.IllegalArgumentException
-
 import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
 import org.scalatest.BeforeAndAfterEach
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

-import org.apache.spark.SparkFunSuite
 import org.apache.spark.storage.StorageLevel
 import org.apache.spark.streaming.{Seconds, StreamingContext, TestSuiteBase}

```
```diff
@@ -28,7 +28,7 @@ import org.mockito.Matchers._
 import org.mockito.Matchers.{eq => meq}
 import org.mockito.Mockito._
 import org.scalatest.{BeforeAndAfter, Matchers}
-import org.scalatest.mock.MockitoSugar
+import org.scalatest.mockito.MockitoSugar

 import org.apache.spark.streaming.{Duration, TestSuiteBase}

```
```diff
@@ -231,17 +231,17 @@ String getScalaVersion() {
     return scala;
   }
   String sparkHome = getSparkHome();
-  //File scala212 = new File(sparkHome, "launcher/target/scala-2.12");
+  File scala212 = new File(sparkHome, "launcher/target/scala-2.12");
   File scala211 = new File(sparkHome, "launcher/target/scala-2.11");
-  //checkState(!scala210.isDirectory() || !scala211.isDirectory(),
-  //    "Presence of build for multiple Scala versions detected.\n" +
-  //    "Either clean one of them or set SPARK_SCALA_VERSION in your environment.");
-  //if (scala212.isDirectory()) {
-  //  return "2.12";
-  //} else {
-  checkState(scala211.isDirectory(), "Cannot find any build directories.");
-  return "2.11";
-  //}
+  checkState(!scala212.isDirectory() || !scala211.isDirectory(),
+      "Presence of build for multiple Scala versions detected.\n" +
+      "Either clean one of them or set SPARK_SCALA_VERSION in your environment.");
+  if (scala212.isDirectory()) {
+    return "2.12";
+  } else {
+    checkState(scala211.isDirectory(), "Cannot find any build directories.");
+    return "2.11";
+  }
 }

 String getSparkHome() {
```