What are you expecting there? Stopping the SparkContext sounds correct to me;
is there something else that needs to be closed?
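
For reference, here is a minimal Scala sketch (my own, assuming a Spark 2.x
local-mode app) of what close()/stop() does today versus what a caller might
additionally clear; the clearActiveSession/clearDefaultSession calls are just
one guess at the "something else", not a claim about the intended fix:

  import org.apache.spark.sql.SparkSession

  object SessionStopSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .master("local[*]")
        .appName("session-stop-sketch")
        .getOrCreate()

      // close() just delegates to stop(): it stops the underlying SparkContext ...
      spark.stop()

      // ... but the session object can still be reachable through the
      // active/default session holders unless those are cleared explicitly.
      SparkSession.clearActiveSession()
      SparkSession.clearDefaultSession()
    }
  }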

On Tue, Apr 2, 2019 at 9:45 AM Vinoo Ganesh <vgan...@palantir.com> wrote:
>
> Hi All -
>
>    I’ve been digging into the code and looking into what appears to be a 
> memory leak (https://jira.apache.org/jira/browse/SPARK-27337) and have 
> noticed something kind of peculiar about the way closing a SparkSession is 
> handled. Despite being marked as Closeable, closing/stopping a SparkSession 
> simply stops the SparkContext. This change happened as a result of one of 
> the PRs addressing https://jira.apache.org/jira/browse/SPARK-15073 in 
> https://github.com/apache/spark/pull/12873/files#diff-d91c284798f1c98bf03a31855e26d71cR596.
>
> I’m trying to understand why this is the intended behavior. Does anyone have 
> any knowledge of why this is the case?
>
> Thanks,
>
> Vinoo

