This repository has been archived by the owner on Nov 23, 2017. It is now read-only.

branch-2.0 should use scala 2.11 #91

Open

matthewadams opened this issue Mar 8, 2017 · 2 comments

matthewadams commented Mar 8, 2017

http://spark.apache.org/downloads.html says:

Note: Starting version 2.0, Spark is built with Scala 2.11 by default. Scala 2.10 users should download the Spark source package and build with Scala 2.10 support.
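For reference, the "build with Scala 2.10 support" step the page mentions is roughly the following, per Spark's build docs for a 2.0.x source checkout (a sketch, not verified here):

./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Dscala-2.10 -DskipTests clean package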

It appears as though branch-2.0 still uses Scala 2.10, as evidenced by these lines from the log (produced by a spark-ec2 ... launch ... invocation):

Initializing scala
Unpacking Scala
--2017-03-07 23:53:26--  http://s3.amazonaws.com/spark-related-packages/scala-2.10.3.tgz
Resolving s3.amazonaws.com (s3.amazonaws.com)... 52.216.225.123
Connecting to s3.amazonaws.com (s3.amazonaws.com)|52.216.225.123|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 30531249 (29M) [application/x-compressed]
Saving to: ‘scala-2.10.3.tgz’

100%[===========================================================================================>] 30,531,249  2.47MB/s   in 12s

2017-03-07 23:53:39 (2.39 MB/s) - ‘scala-2.10.3.tgz’ saved [30531249/30531249]
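A likely fix is to bump the Scala version pinned in the spark-ec2 init scripts. As a sketch (the scala/init.sh path and variable name are assumptions; only the S3 URL pattern is taken from the log above):

# scala/init.sh -- path and variable name assumed, not verified
SCALA_VERSION=2.11.8
wget http://s3.amazonaws.com/spark-related-packages/scala-${SCALA_VERSION}.tgz
tar xzf scala-${SCALA_VERSION}.tgz
rm scala-${SCALA_VERSION}.tgz
mv scala-${SCALA_VERSION} scala   # install location assumed

Note this also assumes a scala-2.11.x tarball actually exists in the spark-related-packages bucket, which would need to be checked first.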

I'm still trying to determine whether this is the root cause of the errors I'm seeing when attempting to run run-example --master ... SparkPi 10, which look like the following (note the java.io.InvalidClassException messages below).

In any case, Scala 2.11 should be the version installed on the master and slaves.
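For background on the exception below: Java's default serialVersionUID is computed from the class's compiled shape, so the same Spark class built against Scala 2.10 vs. 2.11 can hash to different UIDs, and deserialization then fails between mismatched builds. A minimal, self-contained Scala sketch (Msg is a hypothetical stand-in, not Spark's DeployMessages.ExecutorUpdated):

import java.io.ObjectStreamClass

// Case classes are Serializable by default; without an explicit
// serialVersionUID, the JVM derives one from the bytecode shape.
case class Msg(id: Int, state: String)

object SerialUidDemo {
  def main(args: Array[String]): Unit = {
    val uid = ObjectStreamClass.lookup(classOf[Msg]).getSerialVersionUID
    println(s"local serialVersionUID = $uid")
    // A peer whose bytecode for Msg differs computes a different UID,
    // and ObjectInputStream.readObject then throws
    // java.io.InvalidClassException, as in the log below.
  }
}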

17/03/07 20:02:58 INFO spark.SparkContext: Running Spark version 2.0.2
17/03/07 20:02:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/07 20:02:59 INFO spark.SecurityManager: Changing view acls to: matthew
17/03/07 20:02:59 INFO spark.SecurityManager: Changing modify acls to: matthew
17/03/07 20:02:59 INFO spark.SecurityManager: Changing view acls groups to:
17/03/07 20:02:59 INFO spark.SecurityManager: Changing modify acls groups to:
17/03/07 20:02:59 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(matthew); groups with view permissions: Set(); users  with modify permissions: Set(matthew); groups with modify permissions: Set()
17/03/07 20:02:59 INFO util.Utils: Successfully started service 'sparkDriver' on port 51946.
17/03/07 20:02:59 INFO spark.SparkEnv: Registering MapOutputTracker
17/03/07 20:02:59 INFO spark.SparkEnv: Registering BlockManagerMaster
17/03/07 20:02:59 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/8c/4kr7cmf109b4778xj0sxct8w0000gn/T/blockmgr-f990ffdf-481d-463c-9e6f-ee7b328bc85c
17/03/07 20:02:59 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
17/03/07 20:02:59 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/03/07 20:03:00 INFO util.log: Logging initialized @2630ms
17/03/07 20:03:00 INFO server.Server: jetty-9.2.z-SNAPSHOT
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@52045dbe{/jobs,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@674658f7{/jobs/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5c8eee0f{/jobs/job,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@565b064f{/jobs/job/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26425897{/stages,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@73163d48{/stages/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58c34bb3{/stages/stage,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@56a4479a{/stages/stage/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62163b39{/stages/pool,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@20a8a64e{/stages/pool/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62f4ff3b{/storage,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1698fc68{/storage/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4504d271{/storage/rdd,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@207b8649{/storage/rdd/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65b3a85a{/environment,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@34997338{/environment/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@57eda880{/executors,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2b5825fa{/executors/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53d1b9b3{/executors/threadDump,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2cae1042{/executors/threadDump/json,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@163d04ff{/static,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7c209437{/,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2262b621{/api,null,AVAILABLE}
17/03/07 20:03:00 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e928e2f{/stages/stage/kill,null,AVAILABLE}
17/03/07 20:03:00 INFO server.ServerConnector: Started ServerConnector@4678a2eb{HTTP/1.1}{0.0.0.0:4040}
17/03/07 20:03:00 INFO server.Server: Started @2810ms
17/03/07 20:03:00 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/03/07 20:03:00 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.86.165:4040
17/03/07 20:03:00 INFO spark.SparkContext: Added JAR file:/Users/matthew/Documents/github/SciSpike/smartcity-cluster/spark-2.0.2/./examples/jars/scopt_2.11-3.3.0.jar at spark://192.168.86.165:51946/jars/scopt_2.11-3.3.0.jar with timestamp 1488938580346
17/03/07 20:03:00 INFO spark.SparkContext: Added JAR file:/Users/matthew/Documents/github/SciSpike/smartcity-cluster/spark-2.0.2/./examples/jars/spark-examples_2.11-2.0.2.jar at spark://192.168.86.165:51946/jars/spark-examples_2.11-2.0.2.jar with timestamp 1488938580347
17/03/07 20:03:00 INFO client.StandaloneAppClient$ClientEndpoint: Connecting to master spark://ec2-52-55-118-26.compute-1.amazonaws.com:7077...
17/03/07 20:03:00 INFO client.TransportClientFactory: Successfully created connection to ec2-52-55-118-26.compute-1.amazonaws.com/52.55.118.26:7077 after 91 ms (0 ms spent in bootstraps)
17/03/07 20:03:00 INFO cluster.StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20170308020300-0004
17/03/07 20:03:00 INFO client.StandaloneAppClient$ClientEndpoint: Executor added: app-20170308020300-0004/0 on worker-20170308013313-172.31.47.189-42583 (172.31.47.189:42583) with 2 cores
17/03/07 20:03:00 INFO cluster.StandaloneSchedulerBackend: Granted executor ID app-20170308020300-0004/0 on hostPort 172.31.47.189:42583 with 2 cores, 1024.0 MB RAM
17/03/07 20:03:00 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51948.
17/03/07 20:03:00 INFO netty.NettyBlockTransferService: Server created on 192.168.86.165:51948
17/03/07 20:03:00 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: org.apache.spark.deploy.DeployMessages$ExecutorUpdated; local class incompatible: stream classdesc serialVersionUID = 3598161183190952796, local class serialVersionUID = 1654279024112373855
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)
	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257)
	at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:578)
	at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
	at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:180)
	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)
17/03/07 20:03:00 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.86.165, 51948)
17/03/07 20:03:00 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.86.165:51948 with 366.3 MB RAM, BlockManagerId(driver, 192.168.86.165, 51948)
17/03/07 20:03:00 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.86.165, 51948)
17/03/07 20:03:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7c18432b{/metrics/json,null,AVAILABLE}
17/03/07 20:03:01 INFO cluster.StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
17/03/07 20:03:01 WARN spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.
17/03/07 20:03:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@46cf05f7{/SQL,null,AVAILABLE}
17/03/07 20:03:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7cd1ac19{/SQL/json,null,AVAILABLE}
17/03/07 20:03:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a80515c{/SQL/execution,null,AVAILABLE}
17/03/07 20:03:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c807b1d{/SQL/execution/json,null,AVAILABLE}
17/03/07 20:03:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c8b96ec{/static/sql,null,AVAILABLE}
17/03/07 20:03:01 INFO internal.SharedState: Warehouse path is 'file:/Users/matthew/Documents/github/SciSpike/smartcity-cluster/spark-2.0.2/spark-warehouse'.
17/03/07 20:03:01 INFO spark.SparkContext: Starting job: reduce at SparkPi.scala:38
17/03/07 20:03:01 INFO scheduler.DAGScheduler: Got job 0 (reduce at SparkPi.scala:38) with 10 output partitions
17/03/07 20:03:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
17/03/07 20:03:01 INFO scheduler.DAGScheduler: Parents of final stage: List()
17/03/07 20:03:01 INFO scheduler.DAGScheduler: Missing parents: List()
17/03/07 20:03:01 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
17/03/07 20:03:01 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1832.0 B, free 366.3 MB)
17/03/07 20:03:01 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1169.0 B, free 366.3 MB)
17/03/07 20:03:01 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.86.165:51948 (size: 1169.0 B, free: 366.3 MB)
17/03/07 20:03:01 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1012
17/03/07 20:03:02 INFO scheduler.DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34)
17/03/07 20:03:02 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 10 tasks
17/03/07 20:03:17 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:03:32 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:03:47 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:04:02 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:04:17 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:04:32 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:04:47 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:05:02 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:05:03 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: org.apache.spark.deploy.DeployMessages$ExecutorUpdated; local class incompatible: stream classdesc serialVersionUID = 3598161183190952796, local class serialVersionUID = 1654279024112373855
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)
	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257)
	at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:578)
	at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
	at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:180)
	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)
17/03/07 20:05:03 INFO client.StandaloneAppClient$ClientEndpoint: Executor added: app-20170308020300-0004/1 on worker-20170308013313-172.31.47.189-42583 (172.31.47.189:42583) with 2 cores
17/03/07 20:05:03 INFO cluster.StandaloneSchedulerBackend: Granted executor ID app-20170308020300-0004/1 on hostPort 172.31.47.189:42583 with 2 cores, 1024.0 MB RAM
17/03/07 20:05:03 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: org.apache.spark.deploy.DeployMessages$ExecutorUpdated; local class incompatible: stream classdesc serialVersionUID = 3598161183190952796, local class serialVersionUID = 1654279024112373855
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)
	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257)
	at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:578)
	at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
	at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:180)
	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)
17/03/07 20:05:17 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:05:32 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/03/07 20:05:47 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

shivaram commented Mar 8, 2017

I am not sure the exception is due to a Scala version mismatch (does this happen when you run the client on the master machine?). Supporting Scala 2.11 is a necessity, though, so we can keep this issue open to track that.

matthewadams (Author) replied:

> does this happen when you run the client on the master machine?

No, @shivaram, it doesn't. Running the driver program directly on the master appears to work fine.
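For the record, the check looked roughly like this (key pair and cluster names are placeholders; paths assume a default spark-ec2 install under /root on the master):

./spark-ec2 -k my-keypair -i my-keypair.pem login my-cluster
# then, on the master:
cd spark
./bin/run-example --master spark://`hostname -f`:7077 SparkPi 10   # master URL may differ; it's shown on the cluster UI

That makes sense, since on the master the driver runs the cluster-built binaries, so driver and workers agree on every serialVersionUID.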
