Re: Blocked REPL commands

2015-11-19 Thread Jakob Odersky
That definitely looks like a bug; go ahead and file an issue.
I'll check the Scala REPL source code to see what other commands,
if any, should also be disabled.

On 19 November 2015 at 12:54, Jacek Laskowski  wrote:

> Dunno the answer, but :reset should be blocked, too, for obvious reasons.
> ...


Re: Blocked REPL commands

2015-11-19 Thread Jacek Laskowski
Hi,

Dunno the answer, but :reset should be blocked, too, for obvious reasons.

➜  spark git:(master) ✗ ./bin/spark-shell
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0-SNAPSHOT
      /_/

Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66)
Type in expressions to have them evaluated.
Type :help for more information.

scala> :reset
Resetting interpreter state.
Forgetting this session history:


 @transient val sc = {
   val _sc = org.apache.spark.repl.Main.createSparkContext()
   println("Spark context available as sc.")
   _sc
 }


 @transient val sqlContext = {
   val _sqlContext = org.apache.spark.repl.Main.createSQLContext()
   println("SQL context available as sqlContext.")
   _sqlContext
 }

import org.apache.spark.SparkContext._
import sqlContext.implicits._
import sqlContext.sql
import org.apache.spark.sql.functions._
...

scala> import org.apache.spark._
import org.apache.spark._

scala> val sc = new SparkContext("local[*]", "shell", new SparkConf)
...
org.apache.spark.SparkException: Only one SparkContext may be running
in this JVM (see SPARK-2243). To ignore this error, set
spark.driver.allowMultipleContexts = true. The currently running
SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
...
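
The transcript above can be summed up as: :reset discards the `sc` and `sqlContext` definitions, but the underlying SparkContext keeps running in the JVM, so the session is left unusable. A minimal sketch of the fix proposed here would be to add "reset" to the set of blocked commands quoted later in this thread. Note that `standardCommands` below is a hypothetical stand-in for the Scala REPL's built-in command list, not the actual Spark or Scala source:

```scala
// Sketch of the proposed change: include "reset" in the commands that
// spark-shell hides. Only `blockedCommands` mirrors the snippet quoted
// in this thread; `standardCommands` is a stand-in for illustration.
object BlockedCommands {
  val blockedCommands: Set[String] =
    Set("implicits", "javap", "power", "type", "kind", "reset")

  // Stand-in list; the real Scala 2.11 REPL defines more commands.
  val standardCommands: Seq[String] =
    Seq("help", "implicits", "javap", "kind", "load", "paste",
        "power", "quit", "replay", "reset", "type")

  // spark-shell would offer only the commands that are not blocked.
  def allowedCommands: Seq[String] =
    standardCommands.filterNot(blockedCommands.contains)

  def main(args: Array[String]): Unit = {
    // :reset no longer appears, so it can no longer wipe out `sc`.
    assert(!allowedCommands.contains("reset"))
    println(allowedCommands.mkString(":", " :", ""))
  }
}
```

With :reset filtered out the same way :implicits and :power already are, the shell would simply report it as an unknown command instead of forgetting the session's SparkContext.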

Guess I should file an issue?

Pozdrawiam,
Jacek

--
Jacek Laskowski | https://medium.com/@jaceklaskowski/ |
http://blog.jaceklaskowski.pl
Mastering Apache Spark
https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski


On Thu, Nov 19, 2015 at 8:44 PM, Jakob Odersky  wrote:
> I was just going through the spark shell code and saw this:
>
> private val blockedCommands = Set("implicits", "javap", "power", "type", "kind")
>
> Why are these commands blocked?
>
> thanks,
> --Jakob

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Blocked REPL commands

2015-11-19 Thread Jakob Odersky
I was just going through the spark shell code and saw this:

private val blockedCommands = Set("implicits", "javap", "power", "type", "kind")

Why are these commands blocked?

thanks,
--Jakob