[ https://issues.apache.org/jira/browse/SPARK-6335?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14362340#comment-14362340 ]
Marko Bonaci commented on SPARK-6335:
-------------------------------------

Yes, I know, but my thought process was: since it's not a pure Scala shell, i.e. the Scala shell has already been modified to adapt it to Spark, and since you [cannot create a SparkContext manually|http://spark.apache.org/docs/latest/programming-guide.html#using-the-shell] in the shell, I don't see the point in letting _:reset_ destroy the context while you stay inside _spark-shell_. Not 100% sure that it makes sense, though. (A rough sketch of what I mean is at the bottom of this message.)

> REPL :reset command also removes refs to SparkContext and SQLContext
> --------------------------------------------------------------------
>
>                 Key: SPARK-6335
>                 URL: https://issues.apache.org/jira/browse/SPARK-6335
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>    Affects Versions: 1.3.0
>         Environment: Ubuntu 14.04 64-bit; spark-1.3.0-bin-hadoop2.4
>            Reporter: Marko Bonaci
>            Priority: Trivial
>
> I wasn't sure whether to mark this as a bug or an improvement, so I went for the more moderate option, since it's a rather trivial, rarely used feature.
> Here's the REPL printout:
> {code:java}
> 15/03/14 14:39:38 INFO SparkILoop: Created spark context..
> Spark context available as sc.
> 15/03/14 14:39:38 INFO SparkILoop: Created sql context (with Hive support)..
> SQL context available as sqlContext.
>
> scala> val x = 8
> x: Int = 8
>
> scala> :reset
> Resetting repl state.
> Forgetting this session history:
> val x = 8
> Forgetting all expression results and named terms: $intp, sc, sqlContext, x
>
> scala> sc.parallelize(1 to 8)
> <console>:8: error: not found: value sc
>        sc.parallelize(1 to 8)
>        ^
>
> scala> :quit
> Stopping spark context.
> <console>:8: error: not found: value sc
>        sc.stop()
>        ^
> {code}
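For what it's worth, here's a rough sketch of the alternative I had in mind: instead of _:reset_ dropping {{sc}} and {{sqlContext}} entirely, _spark-shell_ could re-bind them into the fresh interpreter state right after the reset. This is only an illustration, assuming we hold references to the still-running contexts outside the interpreter's state and have access to its {{IMain}} instance; it is not how SparkILoop actually implements _:reset_, and the {{rebindContexts}} helper is made up for the example.

{code:scala}
import scala.tools.nsc.interpreter.IMain

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// Hypothetical helper (not part of SparkILoop): after the interpreter's
// state has been reset, push the still-running contexts back in so the
// names sc and sqlContext keep working. IMain.bind(name, boundType, value)
// is the standard Scala REPL API for injecting a value under a given name.
def rebindContexts(intp: IMain, sc: SparkContext, sqlContext: SQLContext): Unit = {
  intp.beQuietDuring {  // suppress the bind results from being echoed
    intp.bind("sc", "org.apache.spark.SparkContext", sc)
    intp.bind("sqlContext", "org.apache.spark.sql.SQLContext", sqlContext)
  }
}
{code}

Called right after the reset, something like this would keep the session usable without restarting _spark-shell_, at the cost of _:reset_ no longer being a "true" reset of everything.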