Actually, if you search the Spark mail archives you will find many similar 
topics. For now, I just want to manage it myself.
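
For example, something along these lines is what I have in mind (just a rough 
sketch, assuming a plain SparkContext named sc; the 10-second polling interval 
is arbitrary):

import org.apache.spark.SparkContext

// Executors currently registered with the driver, keyed by "host:port".
// Note that getExecutorMemoryStatus also includes the driver's own block
// manager, so the set always contains at least one entry.
def aliveExecutors(sc: SparkContext): Set[String] =
  sc.getExecutorMemoryStatus.keySet.toSet

var known = aliveExecutors(sc)
while (true) {
  Thread.sleep(10000L) // poll every 10 seconds (arbitrary)
  val now = aliveExecutors(sc)
  val lost = known -- now
  if (lost.nonEmpty)
    println(s"Executors gone since last check: ${lost.mkString(", ")}")
  known = now
}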


On Tuesday, August 12, 2014 8:46 PM, Stanley Shi <s...@pivotal.io> wrote:
 


This seems like a bug, right? It's not the user's responsibility to manage the 
workers.



On Wed, Aug 13, 2014 at 11:28 AM, S. Zhou <myx...@yahoo.com> wrote:

Sometimes workers are dead, but the SparkContext does not know it and still 
sends jobs to them.
>
>On Tuesday, August 12, 2014 7:14 PM, Stanley Shi <s...@pivotal.io> wrote:
> 
>
>
>Why do you need to detect worker status in the application? Your 
>application generally doesn't need to know where it is executed.
>
>On Wed, Aug 13, 2014 at 7:39 AM, S. Zhou <myx...@yahoo.com.invalid> wrote:
>
>I tried to access worker info from the SparkContext, but it seems the 
>SparkContext does not expose such an API. The reason for doing this is that 
>the SparkContext itself does not seem to have any logic to detect whether its 
>workers are dead, so I would like to add such logic myself.
>>
>>
>>BTW, it seems the Spark web UI has some logic for detecting dead workers, 
>>but all of the relevant classes are declared private to the spark package 
>>scope.
>>
>>
>>Please let me know how to solve this issue (or whether there is an 
>>alternative way to achieve the same purpose).
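>>
>>One direction I am considering is to poll the standalone master's JSON 
>>endpoint instead of the private web-UI classes. A rough sketch (this assumes 
>>the master web UI on its default port 8080, with master-host as a placeholder 
>>for the real hostname; the crude regex stands in for real JSON parsing, and 
>>the JSON layout may differ across Spark versions):
>>
>>import scala.io.Source
>>
>>// Fetch the master's status as JSON (the same data the web UI renders).
>>val masterJson = Source.fromURL("http://master-host:8080/json").mkString
>>
>>// Collect every "state" field and count the ones that are not ALIVE.
>>// (This also matches application states, so a real version should parse
>>// the "workers" array properly.)
>>val stateRe = """"state"\s*:\s*"(\w+)"""".r
>>val states = stateRe.findAllMatchIn(masterJson).map(_.group(1)).toList
>>println(s"entries not ALIVE: ${states.count(_ != "ALIVE")}")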
>>
>>
>>Thanks
>>
>>
>
>
>
>-- 
>
>Regards,
>Stanley Shi,


-- 

Regards,
Stanley Shi,
