Please also see this thread: http://search-hadoop.com/m/q3RTtGpLeLyv97B1

On Sun, Sep 13, 2015 at 9:49 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> For #1, there is the following method:
>
>   @DeveloperApi
>   def getExecutorStorageStatus: Array[StorageStatus] = {
>     assertNotStopped()
>
> You can wrap the call in a try block and catch the IllegalStateException.
> Of course, this is just a workaround (a sketch is appended below the quoted
> thread).
>
> FYI
>
> On Sun, Sep 13, 2015 at 1:48 AM, Ophir Cohen <oph...@gmail.com> wrote:
>
>> Hi,
>> I'm working on my company's system, which is built out of Spark, Zeppelin,
>> Hive and some other technologies, and I have a question about the ability
>> to stop contexts.
>>
>> While working on the test framework for the system, I sometimes want to
>> create a new SparkContext when running tests, so that the tests run on a
>> 'clean' context.
>> I found this hard to do because, first of all, I couldn't find any way to
>> tell whether a SparkContext has already been stopped. It has a flag for
>> that, but the flag is private.
>> Another problem is that creating a local HiveContext initializes a Derby
>> instance; when I then try to create a new HiveContext, it fails because the
>> DB already exists.
>> Apparently, there isn't any way to tell HiveContext to stop and clear its
>> connection to the DB.
>>
>> Essentially, I'm looking for two things:
>> 1. A way to tell whether a SparkContext has already been stopped.
>> 2. A way to stop/close a HiveContext that closes the relevant
>> files/connections and releases the resources.
>>
>> Thanks,
>> Ophir
>>
>
>
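
A minimal sketch of the workaround Ted describes above, assuming Spark 1.x and an
already-created SparkContext (the helper name isContextStopped is illustrative, not
a Spark API):

  import org.apache.spark.SparkContext

  // getExecutorStorageStatus calls assertNotStopped() internally, which throws
  // IllegalStateException once the context has been stopped.
  def isContextStopped(sc: SparkContext): Boolean =
    try {
      sc.getExecutorStorageStatus  // succeeds only while the context is alive
      false
    } catch {
      case _: IllegalStateException => true
    }

Note that getExecutorStorageStatus is marked @DeveloperApi, so this relies on a
developer-facing method rather than a stable public guarantee.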
