[ https://issues.apache.org/jira/browse/SPARK-17265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15850061#comment-15850061 ]

Laurent Philippart commented on SPARK-17265:
--------------------------------------------

With Spark 2.2.0-SNAPSHOT (tested on Windows) I am now also getting netty RPC errors, possibly because netty classes were not found during the Maven build. In any case, the original issue persists.

*2.2.0-SNAPSHOT was built on linux with*
./dev/make-distribution.sh --name custom-spark --tgz -Pyarn -Phadoop-2.7

*New error with 2.2.0-SNAPSHOT*
17/02/02 15:17:31 ERROR Inbox: Ignoring error
java.lang.NullPointerException
        at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:322)
        at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:72)
        at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:105)
        at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
        at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
        at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
17/02/02 15:17:31 WARN NettyRpcEndpointRef: Error sending message [message = RegisterBlockManager(null,956615884,NettyRpcEndpointRef(spark://[email protected]:49933))] in 2 attempts
org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
        at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:134)
        at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:109)
        at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:63)
        at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:258)
        at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$reportHeartBeat(Executor.scala:684)
        at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply$mcV$sp(Executor.scala:709)
        at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:709)
        at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:709)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1960)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:709)
        at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
        at java.util.concurrent.FutureTask.runAndReset(Unknown Source)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.NullPointerException
        at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:322)
        at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:72)
        at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:105)
        at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
        at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
        at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
        ... 3 more
17/02/02 15:17:31 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 49955.

*Original issue reproduced with 2.2.0-SNAPSHOT* 
17/02/02 15:17:51 ERROR Executor: Exception in task 2.0 in stage 53.0 (TID 64)
java.lang.ClassCastException: org.apache.spark.graphx.Edge cannot be cast to scala.Tuple2
        at org.apache.spark.rdd.RDD$$anonfun$subtract$3$$anon$3.getPartition(RDD.scala:991)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:152)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
        at org.apache.spark.scheduler.Task.run(Task.scala:113)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:313)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
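For what it's worth, the stack trace shows the cast failing inside the anonymous partitioner that RDD.subtract builds over key/value pairs, which an EdgeRDD's elements evidently do not satisfy. A possible workaround until this is fixed (a sketch only, not verified against this SNAPSHOT; it assumes the standard GraphX Edge(srcId, dstId, attr) API and that edge equality by (src, dst, attr) is what you want) is to drop down to a plain tuple RDD before subtracting and rebuild the edges afterwards:

```scala
import org.apache.spark.graphx.Edge

// Key each edge by (srcId, dstId) with its attribute as the value, so that
// subtract operates on ordinary Tuple2 records instead of Edge objects.
val e1 = graph1.edges.map(e => ((e.srcId, e.dstId), e.attr))
val e2 = graph2.edges.map(e => ((e.srcId, e.dstId), e.attr))

// Subtract on the plain pair RDDs, then rebuild Edge objects.
val difference = e1.subtract(e2).map { case ((src, dst), attr) => Edge(src, dst, attr) }
```

This sidesteps EdgeRDD entirely, at the cost of losing its specialized partitioning, so it should only be treated as a stopgap.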

> EdgeRDD Difference throws an exception
> --------------------------------------
>
>                 Key: SPARK-17265
>                 URL: https://issues.apache.org/jira/browse/SPARK-17265
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>         Environment: windows, ubuntu
>            Reporter: Shishir Kharel
>
> Subtracting two edge RDDs throws an exception:
> val difference = graph1.edges.subtract(graph2.edges)
> gives
> Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 1 times, most recent failure: Lost task 1.0 in stage 0.0 (TID 1, localhost): java.lang.ClassCastException: org.apache.spark.graphx.Edge cannot be cast to scala.Tuple2
>         at org.apache.spark.rdd.RDD$$anonfun$subtract$3$$anon$3.getPartition(RDD.scala:968)
>         at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:152)
>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
>         at org.apache.spark.scheduler.Task.run(Task.scala:86)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:745)



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
