What version of Spark are you using? There was a known bug that could be
causing this; it was fixed in Spark 1.3.
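For reference, a minimal sketch of the pattern being described in this thread (a custom receiver that stops itself when done, plus a listener that stops the context on the deregistration callback). The names and fetch logic are illustrative, not the actual code under discussion:

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Illustrative receiver: fetches documents on a background thread and
// calls stop() on itself once there is nothing left to fetch.
class DocReceiver extends Receiver[String](StorageLevel.MEMORY_ONLY) {

  def onStart(): Unit = {
    new Thread("doc-fetcher") {
      override def run(): Unit = {
        // Placeholder for the real fetch loop; call store(doc) per document.
        // When fetching is complete, ask Spark to stop this receiver:
        stop("Done fetching documents")
      }
    }.start()
  }

  def onStop(): Unit = {
    // Clean up connections/resources here.
  }
}

// On the driver side, the context would then be stopped from the
// StreamingListener callback, roughly like this (ssc is the
// StreamingContext created elsewhere):
//
// ssc.addStreamingListener(new StreamingListener {
//   override def onReceiverStopped(
//       e: StreamingListenerReceiverStopped): Unit = {
//     ssc.stop(stopSparkContext = true, stopGracefully = true)
//   }
// })
```

Note that stopping the context from inside a listener callback is exactly the ordering that interacted badly with the receiver-deregistration bookkeeping in older releases, which is why the warning about receivers not having deregistered could appear.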

TD

On Mon, Apr 13, 2015 at 11:44 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> When you say "done fetching documents", do you mean that you are stopping
> the StreamingContext (ssc.stop), or that you have finished fetching the
> documents for a batch? If possible, please paste your custom receiver code
> so that we can have a look at it.
>
> Thanks
> Best Regards
>
> On Tue, Apr 7, 2015 at 8:46 AM, Hari Polisetty <hpoli...@icloud.com>
> wrote:
>
>>  My application runs Spark in local mode, and I have a Spark Streaming
>> listener as well as a custom receiver. When the receiver is done fetching
>> all documents, it invokes “stop” on itself.
>>
>> I see the StreamingListener getting a callback on “onReceiverStopped”,
>> where I stop the streaming context.
>>
>>
>> However, I see the following message in my logs:
>>
>>
>> 2015-04-06 16:41:51,193 WARN [Thread-66]
>> com.amazon.grcs.gapanalysis.spark.streams.ElasticSearchResponseReceiver.onStop
>> - Stopped receiver
>>
>> 2015-04-06 16:41:51,193 ERROR
>> [sparkDriver-akka.actor.default-dispatcher-17]
>> org.apache.spark.Logging$class.logError - Deregistered receiver for stream
>> 0: AlHURLEY
>>
>> 2015-04-06 16:41:51,202 WARN [Executor task launch worker-2]
>> org.apache.spark.Logging$class.logWarning - Stopped executor without error
>>
>> 2015-04-06 16:41:51,203 WARN [StreamingListenerBus]
>> org.apache.spark.Logging$class.logWarning - All of the receivers have not
>> deregistered, Map(0 ->
>> ReceiverInfo(0,ElasticSearchResponseReceiver-0,null,false,localhost,HURLEY))
>>
>>
>> What am I missing or doing wrong?
>>
>
>
