[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-05 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/22919


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-05 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230701471
  
--- Diff: bin/spark-shell ---
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
--- End diff --

Yea .. but `-i` doesn't handle implicits like `toDF` or symbols, which are 
pretty basic ones. I think we'd better avoid documenting it for now.


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-05 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230695150
  
--- Diff: bin/spark-shell ---
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
--- End diff --

Shall we also define `-i` behavior here? I think for now this option is 
also accepted by the REPL.


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-05 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230689462
  
--- Diff: bin/spark-shell ---
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
--- End diff --

Oh haha, sorry. That's in `SparkSubmitArguments.printUsageAndExit`. That's 
why I left https://github.com/apache/spark/pull/22919#discussion_r230554455


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-05 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230686892
  
--- Diff: bin/spark-shell ---
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
--- End diff --

I mean, I didn't find
```
Options:
  --master MASTER_URL         spark://host:port, mesos://host:port, yarn,
                              k8s://https://host:port, or local (Default: local[*]).
  --deploy-mode DEPLOY_MODE   Whether to launch the driver program locally ("client") or
                              on one of the worker machines inside the cluster ("cluster")
                              (Default: client).
```

in the shell script. Where do we define them?


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-05 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230675018
  
--- Diff: bin/spark-shell ---
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
--- End diff --

I tested the other options and this one looks like the only valid one. I 
described it in the PR description.


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-04 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230655513
  
--- Diff: bin/spark-shell ---
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
--- End diff --

Where do we define the other options?


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-03 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230554455
  
--- Diff: bin/spark-shell2.cmd ---
@@ -20,7 +20,13 @@ rem
 rem Figure out where the Spark framework is installed
 call "%~dp0find-spark-home.cmd"
 
-set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]
+set LF=^
+
+
+rem two empty lines are required
+set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]^%LF%%LF%^%LF%%LF%^
+Scala REPL options:^%LF%%LF%^
--- End diff --

Script-specific information looks to be included in `_SPARK_CMD_USAGE`, so 
this seems a more appropriate place than somewhere in 
`SparkSubmitArguments.printUsageAndExit`.


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-03 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22919#discussion_r230554256
  
--- Diff: bin/spark-shell2.cmd ---
@@ -20,7 +20,13 @@ rem
 rem Figure out where the Spark framework is installed
 call "%~dp0find-spark-home.cmd"
 
-set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]
+set LF=^
+
+
+rem two empty lines are required
+set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]^%LF%%LF%^%LF%%LF%^
+Scala REPL options:^%LF%%LF%^
--- End diff --

There seems to be no cleverer way than this to set newlines in variables in 
batch files.
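For context, the `%LF%` idiom in the diff can be sketched standalone. This is only an illustrative sketch of the batch-file newline trick, not code from the PR; the variable names (`USAGE`) are hypothetical, and it assumes `cmd.exe` with delayed expansion enabled:

```bat
@echo off
setlocal EnableDelayedExpansion

rem LF captures a single newline character; the two empty lines
rem after the caret are required for the escape to take effect.
set LF=^


rem %LF% expands at parse time, so a trailing ^%LF%%LF%^ embeds one
rem newline in the value and continues the set onto the next physical line.
set USAGE=Usage: .\bin\example.cmd [options]^%LF%%LF%^
Extra options go here.

rem Delayed expansion (!USAGE!) is needed to echo the multi-line value;
rem plain %USAGE% would break the command at the embedded newline.
echo !USAGE!
```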


---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-03 Thread HyukjinKwon
GitHub user HyukjinKwon reopened a pull request:

https://github.com/apache/spark/pull/22919

[SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell

## What changes were proposed in this pull request?

It looks like we mistakenly changed the `-i` option's behaviour in 
`spark-shell`. This PR targets restoring the previous option and behaviour.

The root cause seems to be 
https://github.com/scala/scala/commit/99dad60d984d3f72338f3bad4c4fe905090edd51. 
That commit changed what `-i` means: the old `-i` behaviour moved to `-I`.

The _newly replaced_ `-i` option in Scala 2.11.12 works like `:paste` 
(previously it worked like `:load`). `:paste` does not seem to work with 
implicits - at least I verified this on Spark 2.4.0, 2.3.2, 2.0.0, and the 
current master:

```bash
scala> :paste test.scala
Pasting file test.scala...
<console>:19: error: value toDF is not a member of org.apache.spark.rdd.RDD[Record]
Error occurred in an application involving default arguments.
       spark.sparkContext.parallelize((1 to 2).map(i => Record(i, s"val_$i"))).toDF.show
                                                                               ^
```

Note that `./bin/spark-shell --help` does not describe this option, so I 
guess it's not explicitly documented; however, it's best not to break it. 
The changes are only two lines.

In particular, we should backport this to branch-2.4.

## How was this patch tested?

Manually tested.


With the input below:

```bash
$ cat test.scala
spark.version
case class Record(key: Int, value: String)
spark.sparkContext.parallelize((1 to 2).map(i => Record(i, s"val_$i"))).toDF.show
```

**Spark 2.3.2:**

```scala
$ bin/spark-shell -i test.scala
...
+---+-----+
|key|value|
+---+-----+
|  1|val_1|
|  2|val_2|
+---+-----+
```

**Before:**

```scala
$ bin/spark-shell -i test.scala
...
test.scala:17: error: value toDF is not a member of org.apache.spark.rdd.RDD[Record]
Error occurred in an application involving default arguments.
       spark.sparkContext.parallelize((1 to 2).map(i => Record(i, s"val_$i"))).toDF.show
                                                                               ^
```

**After:**

```scala
$ ./bin/spark-shell -i test.scala
...
+---+-----+
|key|value|
+---+-----+
|  1|val_1|
|  2|val_2|
+---+-----+
```


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/HyukjinKwon/spark SPARK-25906

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/22919.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #22919


commit 5f3cb87c8798e72cc6852e71c02ffc2077c748d7
Author: hyukjinkwon 
Date:   2018-11-03T12:48:25Z

Documents '-I' option (from Scala REPL) in spark-shell




---




[GitHub] spark pull request #22919: [SPARK-25906][SHELL] Documents '-I' option (from ...

2018-11-03 Thread HyukjinKwon
Github user HyukjinKwon closed the pull request at:

https://github.com/apache/spark/pull/22919


---
