Hey Neil,

there were two YARN jobs running related to your notebooks; I just killed
them. Let's see if that solves the problem (you might need to restart your
notebook again). If not, let's open a task and investigate :)

Luca

On Thu, 6 Feb 2020 at 02:08, Neil Shah-Quinn <
[email protected]> wrote:

> Whoa—I just got the same stopped SparkContext error on the query even
> after restarting the notebook, without an intermediate Java heap space
> error. That seems very strange to me.
>
> On Wed, 5 Feb 2020 at 16:09, Neil Shah-Quinn <[email protected]>
> wrote:
>
>> Hey there!
>>
>> I was running SQL queries via PySpark (using the wmfdata package
>> <https://github.com/neilpquinn/wmfdata/blob/master/wmfdata/hive.py>) on
>> SWAP when one of my queries failed with "java.lang.OutOfMemoryError: Java
>> heap space".
>>
>> After that, when I tried to call the spark.sql function again (via
>> wmfdata.hive.run), it failed with "java.lang.IllegalStateException: Cannot
>> call methods on a stopped SparkContext."
>>
>> When I tried to create a new Spark context using
>> SparkSession.builder.getOrCreate (whether using wmfdata.spark.get_session
>> or directly), it returned a SparkSession object properly, but calling the
>> object's sql function still gave the "stopped SparkContext" error.
>>
>> Any idea what's going on? I assume restarting the notebook kernel would
>> take care of the problem, but it seems like there has to be a better way to
>> recover.
>>
>> Thank you!
>>
>>
_______________________________________________
Analytics mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/analytics