[
https://issues.apache.org/jira/browse/SPARK-16417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15382061#comment-15382061
]
ren xing commented on SPARK-16417:
----------------------------------
No. I did not modify the Spark code I use in my environment; I fixed it in my
application. I generate the blocks myself (in a customized Spark receiver)
using the store(ArrayBuffer) function, and I handle the exceptions myself.
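
The workaround described above can be sketched roughly as follows. This is a hypothetical illustration, not the reporter's actual code: the class name, the batchSize parameter, and the nextRecord() helper are invented for the example. The key point is that store(ArrayBuffer) is a blocking call on the receiver's own thread, so a write-ahead-log failure surfaces here and can be handled, instead of silently killing BlockGenerator's background thread as in the bug report.

```scala
import scala.collection.mutable.ArrayBuffer

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical custom receiver: batch records into blocks ourselves and
// push each block with store(ArrayBuffer), catching any storage/WAL
// exception so the receiving thread survives and can retry.
class BatchingReceiver(batchSize: Int)
  extends Receiver[String](StorageLevel.MEMORY_AND_DISK_SER) {

  override def onStart(): Unit = {
    new Thread("batching-receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  override def onStop(): Unit = {
    // isStopped becomes true; the receive loop below exits on its own
  }

  private def receive(): Unit = {
    val buffer = new ArrayBuffer[String](batchSize)
    while (!isStopped) {
      buffer += nextRecord() // hypothetical read from the actual source
      if (buffer.size >= batchSize) {
        try {
          store(buffer)      // synchronous; a WAL write failure throws here
          buffer.clear()
        } catch {
          case e: Exception =>
            // Handle the failure ourselves (log/retry) rather than letting
            // the thread die and leaving the buffered records unconsumed.
            reportError("storing block failed, will retry", e)
        }
      }
    }
  }

  private def nextRecord(): String = ??? // placeholder: real source read
}
```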
> spark 1.5.2 receiver store(single-record) with write-ahead log enabled makes
> executor crash if there is an exception when BlockGenerator stores a block
> ------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-16417
> URL: https://issues.apache.org/jira/browse/SPARK-16417
> Project: Spark
> Issue Type: Bug
> Components: Streaming
> Affects Versions: 1.5.2
> Environment: spark streaming version 1.5.2.
> Reporter: ren xing
>
> The receiver has the store(single-record) function, which actually puts the
> record into a buffer. A background thread periodically scans this buffer,
> generates a block, and stores the block to Spark. If the write-ahead log is
> enabled, there is sometimes an exception when writing the write-ahead log.
> This exception is caught by the background thread. However, the background
> thread just prints a message AND EXITS! This means there is no longer any
> consumer for the receiver's internally buffered records. As time goes on,
> the executor will run out of memory (OOM).
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)