This one, I suppose, is generated after Ctrl+C:

15/09/18 14:38:25 INFO Worker: Asked to kill executor app-20150918143823-0001/0
15/09/18 14:38:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.568753 ms) AkkaMessage(KillExecutor(#####,false) from Actor[akka://sparkWorker/deadLetters]
15/09/18 14:38:25 INFO ExecutorRunner: Runner thread for executor app-20150918143823-0001/0 interrupted
15/09/18 14:38:25 INFO ExecutorRunner: Killing process!
15/09/18 14:38:25 ERROR FileAppender: Error writing stream to file /dfs/spark/work/app-20150918143823-0001/0/stderr
java.io.IOException: Stream closed
    at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
    at java.io.FilterInputStream.read(FilterInputStream.java:107)
    at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
    at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
    at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
    at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
    at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
15/09/18 14:38:25 DEBUG FileAppender: Closed file /dfs/spark/work/app-20150918143823-0001/0/stderr
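For what it's worth, my reading of the trace: when the worker kills the executor, the child process's stderr stream gets closed while FileAppender's copy thread is still reading it, so the next read() throws IOException("Stream closed") from BufferedInputStream.getBufIfOpen (line 162, exactly as in the trace above). A minimal sketch of that JDK behavior, using plain streams rather than Spark's code:

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.IOException;

public class StreamClosedDemo {
    // Reading from a BufferedInputStream after close() raises the same
    // "Stream closed" IOException that FileAppender logs above.
    static String readAfterClose() {
        try {
            BufferedInputStream in = new BufferedInputStream(
                    new ByteArrayInputStream(new byte[0]));
            in.close(); // what effectively happens when the executor process is killed
            in.read();  // FileAppender's copy loop issues a read like this
            return "no exception";
        } catch (IOException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(readAfterClose()); // prints "Stream closed"
    }
}
```

So the ERROR looks like the appender thread merely noticing the killed process, not a separate failure.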

Petr
