[
https://issues.apache.org/jira/browse/SPARK-11244?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14967778#comment-14967778
]
Shivaram Venkataraman commented on SPARK-11244:
-----------------------------------------------
Good catch -- could you send a PR for this?
> sparkR.stop doesn't clean up .sparkRSQLsc in environment
> --------------------------------------------------------
>
> Key: SPARK-11244
> URL: https://issues.apache.org/jira/browse/SPARK-11244
> Project: Spark
> Issue Type: Bug
> Components: SparkR
> Affects Versions: 1.5.1
> Reporter: Sen Fang
>
> Currently {{sparkR.stop}} removes relevant variables from {{.sparkREnv}} for
> SparkContext and backend. However it doesn't clean up {{.sparkRSQLsc}} and
> {{.sparkRHivesc}}.
> This results in
> {code}
> sc <- sparkR.init("local")
> sqlContext <- sparkRSQL.init(sc)
> sparkR.stop()
> sc <- sparkR.init("local")
> sqlContext <- sparkRSQL.init(sc)
> sqlContext
> {code}
> producing
> {code}
> sqlContext
> Error in callJMethod(x, "getClass") :
> Invalid jobj 1. If SparkR was restarted, Spark operations need to be
> re-executed.
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)