GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/22919

    [SPARK-25906][SHELL] Revive -i option in spark-shell

    ## What changes were proposed in this pull request?
    
    It looks like we mistakenly removed the `-i` option from `spark-shell`. This PR restores the previous option and behaviour.
    
    The root cause seems to be https://github.com/scala/scala/commit/99dad60d984d3f72338f3bad4c4fe905090edd51. That commit changed what `-i` means: the old `-i` behaviour was moved to the new `-I` option.
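
    As a rough sketch of what the restore amounts to (illustrative only; `args` and the surrounding plumbing are hypothetical, not the actual spark-shell code), the legacy flag can be translated before the arguments reach the Scala REPL:

    ```scala
    // Hypothetical sketch: map the legacy `-i file` to Scala 2.11.12's
    // `-I file`, which keeps the old line-by-line (:load-style) behaviour.
    // `args` stands in for whatever spark-shell forwards to the REPL.
    val remappedArgs = args.map {
      case "-i"  => "-I"
      case other => other
    }
    ```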
    
    The _newly replaced_ `-i` option in Scala 2.11.12 works like `:paste` (previously it worked like `:load`). `:paste` does not appear to work in spark-shell at the moment - I verified this on Spark 2.4.0, 2.3.2, 2.0.0, and the current master:
    
    ```
    scala> :paste test.scala
    Pasting file test.scala...
    <console>:19: error: value toDF is not a member of org.apache.spark.rdd.RDD[Record]
    Error occurred in an application involving default arguments.
           spark.sparkContext.parallelize((1 to 2).map(i => Record(i, s"val_$i"))).toDF.show
                                                                                   ^
    ```
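
    For context, the two REPL commands evaluate a file quite differently (plain Scala REPL; the parenthetical annotations are mine, not REPL output):

    ```
    scala> :load test.scala    (interprets the file line by line, as if typed)
    scala> :paste test.scala   (compiles the whole file as a single block)
    ```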
    
    Note that `./bin/spark-shell --help` does not describe this option, so it seems to be undocumented; even so, it's best not to break it. The change is only two lines.
    
    ## How was this patch tested?
    
    Manually tested.
    
    
    With the input below:
    
    ```
    $ cat test.scala
    spark.version
    case class Record(key: Int, value: String)
    spark.sparkContext.parallelize((1 to 2).map(i => Record(i, s"val_$i"))).toDF.show
    ```
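
    For reference, this input compiles in `spark-shell` only because the shell predefines `spark` and imports `spark.implicits._` at startup (which is what provides `toDF` on RDDs). A standalone equivalent might look like the following sketch (the app boilerplate is illustrative, not part of this PR):

    ```scala
    // Illustrative standalone equivalent of test.scala; spark-shell performs
    // this session setup automatically.
    import org.apache.spark.sql.SparkSession

    // In a compiled app the case class must be top-level so that an encoder
    // can be derived for it.
    case class Record(key: Int, value: String)

    object Test {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[*]").appName("test").getOrCreate()
        import spark.implicits._  // enables rdd.toDF via rddToDatasetHolder
        println(spark.version)
        spark.sparkContext.parallelize((1 to 2).map(i => Record(i, s"val_$i"))).toDF().show()
        spark.stop()
      }
    }
    ```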
    
    **Spark 2.3.2:**
    
    ```scala
    $ bin/spark-shell -i test.scala
    ...
    +---+-----+
    |key|value|
    +---+-----+
    |  1|val_1|
    |  2|val_2|
    +---+-----+
    ```
    
    **Before:**
    
    ```scala
    $ bin/spark-shell -i test.scala
    ...
    test.scala:17: error: value toDF is not a member of org.apache.spark.rdd.RDD[Record]
    Error occurred in an application involving default arguments.
           spark.sparkContext.parallelize((1 to 2).map(i => Record(i, s"val_$i"))).toDF.show
    ```
    
    **After:**
    
    ```scala
    $ ./bin/spark-shell -i test.scala
    ...
    +---+-----+
    |key|value|
    +---+-----+
    |  1|val_1|
    |  2|val_2|
    +---+-----+
    ```


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark SPARK-25906

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22919.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #22919
    
----
commit 17b3c6c5ad4a39b1ecfd33111c392de47c6df967
Author: hyukjinkwon <gurwls223@...>
Date:   2018-11-01T08:53:39Z

    Revive -i option in spark-shell

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
