We have tried it on another cluster installation, with the same result.

Petr

On Mon, Sep 21, 2015 at 10:45 AM, Petr Novak <oss.mli...@gmail.com> wrote:

>> It might be connected to my problems with gracefulShutdown in Spark
>> 1.5.0 (Scala 2.11)
>> https://mail.google.com/mail/#search/petr/14fb6bd5166f9395
>>
>> Maybe Ctrl+C corrupts checkpoints and breaks gracefulShutdown?
>>
> The provided link is obviously wrong. I haven't found the thread in the
> Spark mailing list archive for some reason, so you will have to search your
> mailbox for "Spark Streaming stop gracefully doesn't return to command line
> after upgrade to 1.4.0 and beyond".
>
> These two issues block us from upgrading from 1.3.0 to 1.5.0. Even a
> non-graceful shutdown that can recover from the last completed batch would
> be enough, because our computation is idempotent. I just wonder why nobody
> else has the same issue; it suggests that either something is wrong on our
> side or nobody is using KafkaDirectStream with Spark's built-in
> checkpointing in production.
>
> I would just need confirmation from the community that checkpointing and
> graceful shutdown actually work with KafkaDirectStream on 1.5.0, so that I
> can look for the problem on my side.
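>
> For reference, here is a minimal sketch of the pattern our driver follows
> (the broker address, topic name, and checkpoint path are placeholders, and
> the real job does more than count records):
>
> import kafka.serializer.StringDecoder
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
> import org.apache.spark.streaming.kafka.KafkaUtils
>
> object KafkaDirectCheckpointTest {
>   val checkpointDir = "hdfs:///tmp/checkpoints" // placeholder path
>
>   def createContext(): StreamingContext = {
>     val conf = new SparkConf().setAppName("kafka-direct-checkpoint-test")
>     val ssc = new StreamingContext(conf, Seconds(10))
>     ssc.checkpoint(checkpointDir)
>     val kafkaParams = Map("metadata.broker.list" -> "broker:9092") // placeholder
>     val stream = KafkaUtils.createDirectStream[String, String, StringDecoder,
>       StringDecoder](ssc, kafkaParams, Set("events")) // placeholder topic
>     // The output action is idempotent, so replaying the last batch is safe.
>     stream.foreachRDD(rdd => println(rdd.count()))
>     ssc
>   }
>
>   def main(args: Array[String]): Unit = {
>     // Recover from the checkpoint if one exists; otherwise build a fresh
>     // context. All stream setup has to happen inside createContext.
>     val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
>     ssc.start()
>     ssc.awaitTermination()
>   }
> }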
>
> Many thanks,
>
> Petr
>
> On Sun, Sep 20, 2015 at 12:58 PM, Petr Novak <oss.mli...@gmail.com> wrote:
>
>> Hi Michal,
>> yes, it is there, logged twice. It can be seen in the attached log in one
>> of my previous posts with more details:
>>
>> 15/09/17 23:06:37 INFO StreamingContext: Invoking
>> stop(stopGracefully=false) from shutdown hook
>> 15/09/17 23:06:37 INFO StreamingContext: Invoking
>> stop(stopGracefully=false) from shutdown hook
>>
>> Thanks,
>> Petr
>>
>> On Sat, Sep 19, 2015 at 4:01 AM, Michal Čizmazia <mici...@gmail.com>
>> wrote:
>>
>>> Hi Petr, after Ctrl+C can you see the following message in the logs?
>>>
>>> Invoking stop(stopGracefully=false)
>>>
>>> Details:
>>> https://github.com/apache/spark/pull/6307
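>>>
>>> If I read that PR correctly, it installs the shutdown hook that logs the
>>> message above, and it also added a flag to make the hook stop gracefully
>>> instead; assuming it applies to your build, it may be worth trying:
>>>
>>> spark-submit --conf spark.streaming.stopGracefullyOnShutdown=true ...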
>>>
>>>
>>> On 18 September 2015 at 10:28, Petr Novak <oss.mli...@gmail.com> wrote:
>>>
>>>> It might be connected to my problems with gracefulShutdown in Spark
>>>> 1.5.0 (Scala 2.11)
>>>> https://mail.google.com/mail/#search/petr/14fb6bd5166f9395
>>>>
>>>> Maybe Ctrl+C corrupts checkpoints and breaks gracefulShutdown?
>>>>
>>>> Petr
>>>>
>>>> On Fri, Sep 18, 2015 at 4:10 PM, Petr Novak <oss.mli...@gmail.com>
>>>> wrote:
>>>>
>>>>> ...to ensure it is not something wrong on my cluster.
>>>>>
>>>>> On Fri, Sep 18, 2015 at 4:09 PM, Petr Novak <oss.mli...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> I have tried it on Spark 1.3.0 (Scala 2.10) and it works. The same
>>>>>> code doesn't work on Spark 1.5.0 (Scala 2.11). It would be nice if
>>>>>> anybody could try it on another installation to ensure it is something
>>>>>> wrong on my cluster.
>>>>>>
>>>>>> Many thanks,
>>>>>> Petr
>>>>>>
>>>>>> On Fri, Sep 18, 2015 at 4:07 PM, Petr Novak <oss.mli...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> This one, I suppose, is generated after Ctrl+C:
>>>>>>>
>>>>>>> 15/09/18 14:38:25 INFO Worker: Asked to kill executor
>>>>>>> app-20150918143823-0001/0
>>>>>>> 15/09/18 14:38:25 INFO Worker: Asked to kill executor
>>>>>>> app-20150918143823-0001/0
>>>>>>> 15/09/18 14:38:25 DEBUG
>>>>>>> AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled
>>>>>>> message (0.568753 ms) AkkaMessage(KillExecutor(#####,false) from
>>>>>>> Actor[akka://sparkWorker/deadLetters]
>>>>>>> 15/09/18 14:38:25 DEBUG
>>>>>>> AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled
>>>>>>> message (0.568753 ms) AkkaMessage(KillExecutor(#####,false) from
>>>>>>> Actor[akka://sparkWorker/deadLetters]
>>>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Runner thread for executor
>>>>>>> app-20150918143823-0001/0 interrupted
>>>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Runner thread for executor
>>>>>>> app-20150918143823-0001/0 interrupted
>>>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Killing process!
>>>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Killing process!
>>>>>>> 15/09/18 14:38:25 ERROR FileAppender: Error writing stream to file
>>>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>>>> java.io.IOException: Stream closed
>>>>>>>     at
>>>>>>> java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>>>>>>>     at
>>>>>>> java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>>>>>>>     at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>>>>>     at java.io.FilterInputStream.read(FilterInputStream.java:107)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>>>     at
>>>>>>> org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
>>>>>>> 15/09/18 14:38:25 ERROR FileAppender: Error writing stream to file
>>>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>>>> java.io.IOException: Stream closed
>>>>>>>     at
>>>>>>> java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>>>>>>>     at
>>>>>>> java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>>>>>>>     at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>>>>>     at java.io.FilterInputStream.read(FilterInputStream.java:107)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>>>     at
>>>>>>> org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>>>>>>>     at
>>>>>>> org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
>>>>>>> 15/09/18 14:38:25 DEBUG FileAppender: Closed file
>>>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>>>> 15/09/18 14:38:25 DEBUG FileAppender: Closed file
>>>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>>>>
>>>>>>> Petr
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
