Hi. I am trying to run SparkPi on the cluster, but some strange errors occur and
I cannot tell what causes them. Although I have already posted this error to
user@spark, I suspect it may not be a simple configuration error, and the
developers may know it well.
  
  When I use hadoop2.6 with spark-1.5.1-bin-hadoop2.6, the error log is:


10/01/01 11:59:14 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.reflect.InvocationTargetException
java.lang.reflect.InvocationTargetException

Caused by: java.lang.IllegalArgumentException: java.lang.UnsatisfiedLinkError:
/opt/hadoop/tmp/nm-local-dir/usercache/root/appcache/application_1444839345484_0006/container_1444839345484_0006_01_000001/tmp/snappy-1.0.4.1-8378427e-4d5c-42b1-ae49-c9600c204bd7-libsnappyjava.so:
/opt/hadoop/tmp/nm-local-dir/usercache/root/appcache/application_1444839345484_0006/container_1444839345484_0006_01_000001/tmp/snappy-1.0.4.1-8378427e-4d5c-42b1-ae49-c9600c204bd7-libsnappyjava.so:
cannot open shared object file: No such file or directory
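
The UnsatisfiedLinkError above is thrown while snappy-java tries to load its native library from the container's tmp directory. As a hedged workaround sketch (not a root-cause fix), Spark can be moved off the snappy codec entirely; `lzf` and `lz4` are the other built-in values of `spark.io.compression.codec` in these Spark versions:

```
# spark-defaults.conf — workaround sketch: use a pure-JVM codec so the
# snappy native .so never needs to be extracted and loaded.
spark.io.compression.codec    lzf
```

If the job then succeeds, that would confirm the failure is isolated to loading the snappy native library rather than Spark itself.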



  When I use hadoop2.6 with spark-1.3.0-bin-hadoop2.4, the error looks
different; the log is:


org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.reflect.InvocationTargetException
    sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    java.lang.reflect.Constructor.newInstance(Constructor.java:408)
    org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:68)
    org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
    org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
    org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:79)
    org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
    org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
    org.apache.spark.SparkContext.broadcast(SparkContext.scala:1051)
    org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:839)
    org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:778)
    org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:762)
    org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1362)
    org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
    org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
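
Note that this second trace also fails inside CompressionCodec$.createCodec, i.e. during codec initialization, so both versions may share the same root cause. As a hedged diagnostic sketch, snappy-java honors the `org.xerial.snappy.tempdir` system property, so its native-library extraction can be redirected to a directory that is known to be writable and not mounted noexec (paths below are placeholders, and in yarn-cluster mode the driver option may need to go in spark-defaults.conf instead):

```
# Sketch of a spark-submit invocation redirecting snappy-java's native
# library extraction; /var/tmp is an assumed writable, exec-capable dir.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-cluster \
  --conf "spark.driver.extraJavaOptions=-Dorg.xerial.snappy.tempdir=/var/tmp" \
  --conf "spark.executor.extraJavaOptions=-Dorg.xerial.snappy.tempdir=/var/tmp" \
  lib/spark-examples-1.5.1-hadoop2.6.0.jar 100
```
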


What causes the error: Java compatibility, or Hadoop compatibility?
Thank you for your help.



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/Strange-spark-problems-among-different-versions-tp14609.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
