Hi, I am running Spark on Mesos and hitting the error below. Can anyone help me resolve this? Thanks.

15/03/25 21:05:00 ERROR scheduler.LiveListenerBus: Listener EventLoggingListener threw an exception
java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:203)
    at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:203)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.util.FileLogger.flush(FileLogger.scala:203)
    at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:90)
    at org.apache.spark.scheduler.EventLoggingListener.onUnpersistRDD(EventLoggingListener.scala:121)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$11.apply(SparkListenerBus.scala:66)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$11.apply(SparkListenerBus.scala:66)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:83)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:81)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.SparkListenerBus$class.foreachListener(SparkListenerBus.scala:81)
    at org.apache.spark.scheduler.SparkListenerBus$class.postToAll(SparkListenerBus.scala:66)
    at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:32)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:56)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1460)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
Caused by: java.io.IOException: Failed to add a datanode. User may turn off this feature by setting dfs.client.block.write.replace-datanode-on-failure.policy in configuration, where the current policy is DEFAULT. (Nodes: current=[10.250.100.81:50010, 10.250.100.82:50010], original=[10.250.100.81:50010, 10.250.100.82:50010])
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:792)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:852)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:958)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:755)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:424)
15/03/25 21:05:00 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
15/03/25 21:05:00 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer(1427316900000 ms)
15/03/25 21:05:00 INFO scheduler.JobGenerator: Checkpointing graph for time 1427317500000 ms
15/03/25 21:05:00 INFO streaming.DStreamGraph: Updating checkpoint data for time 1427317500000 ms
15/03/25 21:05:00 INFO streaming.DStreamGraph: Updated checkpoint data for time 1427317500000 ms
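The IOException above points directly at the HDFS client setting dfs.client.block.write.replace-datanode-on-failure.policy. For reference, here is what I understand the relevant client-side hdfs-site.xml entries to look like (a sketch based on hdfs-default.xml, not something I have verified on this cluster; NEVER skips the datanode-replacement step during write-pipeline recovery):

  <!-- whether to attempt replacing a failed datanode in the write pipeline -->
  <property>
    <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
    <value>true</value>
  </property>
  <!-- DEFAULT | ALWAYS | NEVER; NEVER means never add a replacement datanode -->
  <property>
    <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
    <value>NEVER</value>
  </property>

My understanding is that NEVER is generally only advisable on small clusters (roughly three datanodes or fewer, like the two nodes listed in the error), since it trades away the pipeline's ability to restore replication after a datanode failure. I believe the same setting can also be passed through Spark via the spark.hadoop.* prefix, e.g.

  spark-submit --conf spark.hadoop.dfs.client.block.write.replace-datanode-on-failure.policy=NEVER ...

but I have not confirmed that on my setup.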


